bryanW1995
Lifer
- May 22, 2007
No, they aren't.
Hey, do you know if they plan to have a dual BIOS setup instead? Maybe one for 300 W or so and one for 450 W? With that sort of setup they could steal AMD's thunder easily.
No, they aren't.
I've never owned one. I'm basing my opinion off other people's experiences with warranty refusals.
Well, if it runs well below the specs of a full-fledged 580, the fan may not need to spin as hard.
It's one thing to have a louder card, but the 6990 is the loudest card ever made outside of the FX5800U. I still think enthusiasts care about noise to some extent. This is why cards with superior coolers, like the Asus DirectCU II series, are popular among enthusiasts, for example. Also, hardcore overclockers who benchmark will likely put this card under water anyway and run two of them. What about everyone else? Use earplugs?
NV wouldn't have listened to their customers' complaints about how loud the GTX 480 was and redesigned the heatsinks on the 570/580 series if enthusiasts didn't care about noise. Specifically, lower noise was one of the touted features in their GTX 580 slides.
It's basically how nVidia dealt with Eyefinity: they don't have a real answer, so they scrape together what they can and try to get a "tick" marker under the feature list. As for CUDA, not everyone benefits from it, but those who do certainly appreciate it. And finally PhysX, quite possibly the biggest F' up nVidia has done. It angers me because nVidia could have done so much more with it, yet they completely mishandled the situation, and now it's a cool effect added on to games rather than an epic game-enhancing feature.
Have you guys seen the BF3 episode 2? Those physics are awesome, with the debris falling all over, and I have no doubt that we will see destructible environments like in BC2. That's what PhysX should be doing, not making my explosions look cooler.
http://www.tomshardware.com/reviews/radeon-hd-6870-radeon-hd-6850-barts,2776-3.html
"First, there are no dedicated 3D monitors in North America that employ the HDMI 1.4 standard...our experience has shown us that Nvidias 3D Vision driver works better and more consistently in the majority of games...you can either game in stereo at 720p maxing out at 60 frames per second per eye, or you can game at 1080p with up to 24 frames per second per eye
"
Safe to say that Nvidia's 3D remains in the over-and-above feature list you're trying to whittle down.
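To put rough numbers on that 720p60-vs-1080p24 split (my own back-of-the-envelope, not from the Tom's piece): frame-packed 3D has to carry both eyes' frames over the link, so both mandatory 3D modes end up near the pixel throughput of plain 1080p60.

Code:
# Rough pixel-throughput arithmetic for the two HDMI 1.4 3D modes above.
# Illustrative only: ignores blanking intervals and exact HDMI timings.

def mpixels_per_sec(width, height, fps, eyes=1):
    """Approximate pixel throughput in megapixels per second."""
    return width * height * fps * eyes / 1e6

print(mpixels_per_sec(1920, 1080, 60))          # 2D 1080p60 -> ~124.4
print(mpixels_per_sec(1920, 1080, 24, eyes=2))  # 3D 1080p24 -> ~99.5
print(mpixels_per_sec(1280, 720, 60, eyes=2))   # 3D 720p60  -> ~110.6

Both 3D modes land close to ordinary 1080p60, which is roughly why the mandatory formats top out where they do.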
AMD's 3D technology for its Radeon HD products conforms to industry standards, most notably HDMI 1.4a right now. Sadly, this doesn't include a lot of desktop monitors on the market today, but it does include essentially every new 3D HDTV in existence today.
Not sure if you've ever tried 3D or Eyefinity (I haven't), but from what I can tell, the features fall into the same category for the vast majority of users: totally cool-looking and not worth the cost atm. CUDA is great if you need it and useless otherwise, and PhysX is still more fluff than substance, but 3D, like Eyefinity, brings an immediate and substantial improvement to your gaming experience RIGHT NOW.
From the numbers I have seen, the GTX 580 can really start sucking down the wattage as you overclock and add voltage. Two 8-pin connectors and a PCIe slot are rated for 375 watts total. I'm sure it's doable with a good power supply, but you would be pushing some things pretty far beyond spec.
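For reference, here's the connector math (the spec ratings are standard; the overclocked-580 draw is purely an assumed figure for illustration):

Code:
# Python sketch: PCIe power budget for a card with two 8-pin connectors.
# Connector ratings per the PCIe spec; the card draw is an assumption.
SLOT_W = 75        # PCIe x16 slot
EIGHT_PIN_W = 150  # per 8-pin PEG connector

budget = SLOT_W + 2 * EIGHT_PIN_W  # 375 W total
oc_draw = 400                      # assumed draw for a heavily OC'd, overvolted 580

print(f"Budget: {budget} W, headroom: {budget - oc_draw} W")
# -> Budget: 375 W, headroom: -25 W (i.e., beyond spec)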
Typical AT response: "Yeah man this card teh suxxors fo real," as they use a 4-year-old 8800GT.
Not my cup of tea but this or the 6990 would wax my single 580.
Core Clock: 752 MHz
Memory Clock: 1002 MHz
Memory Bus: 384 bit
Memory Size: 3072 MB
Memory Type: GDDR5
Interface: PCI-E ?
ROPs: 96
Released: Mar 2011
Shading Units: 1024
No clue where this came from, but:
http://www.techpowerup.com/gpudb/281/.html
Code:
Core Clock: 752 MHz
Memory Clock: 1002 MHz
Memory Bus: 384 bit
Memory Size: 3072 MB
Memory Type: GDDR5
Interface: PCI-E ?
ROPs: 96
Released: Mar 2011
Shading Units: 1024
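If those numbers are right, the per-GPU memory bandwidth works out as follows (quick sketch; assumes GDDR5's usual 4x data rate):

Code:
# Python sketch: memory bandwidth from the quoted specs (per GPU).
# GDDR5 transfers 4 bits per clock per pin, hence the 4x multiplier.
mem_clock_hz = 1002e6
bus_width_bits = 384

bandwidth_gbs = mem_clock_hz * 4 * (bus_width_bits / 8) / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")  # -> 192.4 GB/s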
Speaking of wax, if your HDD is behind a 6990 it may melt like a candle on a hot summer evening.
Should be no more likely to wax a HDD than an equivalent system with xfire 6970's or 6950's though, right?
I mean, it's not like the 6990 takes system-level power consumption to new highs; it just takes it to the same old highs while populating fewer PCIe slots, if I understand correctly.
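Pretty much. For reference, the quoted limits (TDP-style numbers, not measured system draw; the 6990's second figure is its dual-BIOS OC mode):

Code:
# Python sketch: quoted power limits, not measurements.
HD6990_STOCK_W = 375      # default PowerTune limit (2x 8-pin + slot ceiling)
HD6990_OC_W = 450         # with the card's dual-BIOS "unlock" switch
XFIRE_6970_W = 2 * 250    # two single cards' quoted TDPs

print(HD6990_STOCK_W, HD6990_OC_W, XFIRE_6970_W)  # -> 375 450 500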
Speaking of wax, if your HDD is behind a 6990 it may melt like a candle on a hot summer evening.
Coolers on both these dual GPU cards are bad news for inside case temps. I'm sure sales of waterblocks for both will be strong.
Making 590s single-slot by watercooling them is a unique idea. Seven of them in a dual-Xeon (SR-2) system would definitely make a nice render/crunching machine: 14 GPUs + 24 logical CPUs. Overclock the GPUs to 900+ and the CPUs to 4.5+ (all under water, obviously) and what kind of PPD would that be good for? You may have to hang your clothes out to dry and put your computer in the laundry room so it could plug into that 230V 30A outlet, though.
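Humor me on the outlet math (every wattage below is an assumption for illustration, not a measurement):

Code:
# Python sketch: back-of-the-envelope draw for the 7x GTX 590 SR-2 box.
CARD_W = 365   # quoted GTX 590 TDP; real draw would be higher once OC'd
CARDS = 7
REST_W = 600   # rough guess: two OC'd Xeons, pumps, fans, drives

total_w = CARDS * CARD_W + REST_W   # ~3155 W before OC headroom
outlet_w = 230 * 30                 # 6900 W circuit
usable_w = outlet_w * 0.8           # typical 80% continuous-load derate

print(f"~{total_w} W estimated vs ~{usable_w:.0f} W usable")

So even with generous OC headroom, that laundry-room outlet would have margin to spare.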
I think Rubycon is referring to a review where they said that the 6990 raised the temp of their HDD by about 10°C. I don't recall which review it was or the exact circumstances. In all likelihood, just about any high-end card that doesn't exhaust all of its heat outside the case would do the same. They just don't typically measure that, now do they? Which, I believe, makes your point.
I think it was the AT review, and it's more to do with the heat being exhausted out of both ends of the card. Half the heat goes out the back of the case; the other half heads to the front, where HDD bays and HDDs are usually located, so your HDD gets blasted with the hot air from half a 6990. It's nothing to do with overall case temps, just adjacent HDDs.
Should be no more likely to wax a HDD than an equivalent system with xfire 6970's or 6950's though, right?
I mean, it's not like the 6990 takes system-level power consumption to new highs; it just takes it to the same old highs while populating fewer PCIe slots, if I understand correctly.
We're well into page 4; is it safe to start using bad car analogies yet?
Buying 6990 or 590 class video cards and going cheap on cooling is like buying a sports car and trying to get away with $50 tires.
