Powercolor HD 7990 Devil 13 6 GB Review

Page 5

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
If you have a very solid cooler, you can drastically lower the noise levels. The 6990 is not the loudest card of all time; the FX5800 Ultra is. Not sure how in the world you arrived at the 7990 being the 2nd loudest card of all time. The noise levels in that video are nowhere near the HD4890 or GTX480.

Regardless, I wouldn't buy a 690 or 7990. They are both terrible values. The 690 gets nailed hard by HD7970 GE Cross-fire for high-resolution and multi-monitor gaming, so to begin with, $1k for that is a non-starter.

Also, not sure where you are getting the idea that the 690 is quiet. It's not quiet at all. It's louder than the HD6970 and is as loud as a reference 7970. That's not quiet by any means. Something like Asus DirectCU II 670s in SLI or HD7970 Dual-X/Vapor-X cards would walk all over the 690/7990 in noise levels. The noise level of the 690 is unacceptable for a $1k GPU (and the same is true for a 7990 with that fan curve).

[Attached chart: 46208.png]


This is quiet to me, or this. GTX690 is way too loud for me now. Now that I've tried an after-market card, I've become much more conscious of quieter noise levels. GTX690's noise levels are too high for the price. I expect better than that for $1k.
 
Last edited:

MagnusTheBrewer

IN MEMORIAM
Jun 19, 2004
24,122
1,594
126
Ya, that's a good point. You can set up a custom fan curve on the 7990 and let the temperatures rise to 77-80°C to reduce the noise levels.

Personally, I think the whole, "it's gotta be quiet" crowd is a tad over the top. I have no problem enjoying a system even though I can hear fans. Hell, it's summertime and the A/C plus fans are running all over the house. It doesn't make any difference to me if the fans in my case are audible. If I want to block it out, I'll wear headphones.
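To make the custom-fan-curve idea concrete, here's a minimal sketch (purely illustrative; the temperature/duty control points are made up for this example, not PowerColor's actual curve) of how Afterburner-style tools map GPU temperature to fan duty:

```python
# Sketch of a custom fan curve: linear interpolation between
# (temperature in deg C, fan duty in %) control points.
# The points below are illustrative only.
CURVE = [(40, 20), (60, 35), (80, 50), (95, 100)]

def fan_duty(temp_c):
    """Return fan duty (%) for a GPU temperature via linear interpolation."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin at max duty above the last point
```

Letting the card run up to ~80°C keeps a curve like this in the flat 35-50% region instead of ramping toward 100%, which is where the noise comes from.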
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
WOW!

So now we have 7990 Devil actually producing LESS NOISE than GTX 690??

That is some pretty wild stuff right there.
Particularly if you keep in mind that we're talking about likely the 2nd loudest card ever built (the 1st being the 6990),

vs GTX 690 which is arguably the most silent dualie in GPU history.

And keep in mind that unlike 690, EVERY single Watt that Devil spits out adds to the heat inside the case.

I never predicted LESS NOISE than the GTX 690. Just that it's running substantially cooler. If they were adjusted to run at the same temps, the noise generated would change. Slow the fans down on the 7990 and it'll be quieter. Speed up the fan on the 690 and it'll be louder. No WOW!; it's quite simple to understand. Give up on dramatizing it to make it look like I said something I didn't. :rolleyes:
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
@3DVagabond I never said you, or even RS, said that. But that shocking conclusion follows pretty straightforwardly from his reasoning. Forget about it...

[Attached chart: 6901skutb.png]


That's a 7 dB difference between the two cards as measured by TPU.
A bit of physics... a 7 dB difference means that you can move the case with the Devil in it to double the distance, and the Devil will still be louder!

Anyone wanna guess how many GTX 690s you need to stack together in a case to get a 7 dB increase :sneaky:
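The double-distance claim can be sanity-checked with the inverse-square law (a sketch; assumes free-field falloff of ~6 dB per doubling of distance, and the absolute levels are made up, with only the 7 dB gap taken from the chart):

```python
import math

def spl_at_distance(spl_ref_db, d_ref, d):
    """SPL (dB) at distance d, given a reference SPL measured at d_ref.

    Free-field point source: level falls by 20*log10(d/d_ref).
    """
    return spl_ref_db - 20 * math.log10(d / d_ref)

devil_at_2m = spl_at_distance(47.0, 1.0, 2.0)  # Devil moved twice as far away
gtx690_at_1m = 40.0                            # 690 left in place, 7 dB lower
# Doubling distance costs ~6 dB, so the Devil remains ~1 dB louder.
```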

@RS I confused 7970 GHz with Devil, that's why I gave it 2nd spot behind 6990 :)
 
Last edited:

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Little bit? 7 dB is close to twice the perceived loudness (not an exact physical quantity, so not an exact number).

And the answer is FIVE :)
You need 5 identical sources to raise SPL by 7dB
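That figure checks out: identical, uncorrelated noise sources add as 10·log10(n) dB. A quick sketch:

```python
import math

def spl_increase_db(n):
    """dB increase from running n identical, uncorrelated noise sources."""
    return 10 * math.log10(n)

# 2 sources -> +3 dB; 5 sources -> ~+7 dB
```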
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Little bit? 7 dB is close to twice the perceived loudness (not an exact physical quantity, so not an exact number).

And the answer is FIVE :)
You need 5 identical sources to raise SPL by 7dB

I don't know how that is relevant to my pointing out the difference in temps. Or to RS agreeing that it's relevant.

Do you know how much louder the 690 would be if the fan was cranked up to make it, say, ~10°C cooler under load? I don't, and I doubt you do either. That, though, is the comparison that would be relevant. How much further or closer, or how many sound sources it would take, isn't. Although it's a really dramatic way to point out the value of 7 dB. Gotta hand it to you for your dramatizations. Very impressive. :thumbsup:

P.S. I realize that 7 dB is significant. I know that 10 dB is perceived as twice as loud, for example. ;)
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Perceived loudness is an empirical, psycho-acoustic quantity, not a real physical quantity.
6-10 dB is usually quoted.

Bottom line - These two cards are not even comparable in noise department.
Not to mention what happens when switching to Turbo.
Dunno why we're arguing here really. Oh yeah I know... RS concluding they have similar noise level, based on... VIDEOS :)

There is a reason why AMD postponed 7990 and finally gave up.
It's a pretty obvious one.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Perceived loudness is an empirical, psycho-acoustic quantity, not a real physical quantity.
6-10 dB is usually quoted.

Bottom line - These two cards are not even comparable in noise department.
Not to mention what happens when switching to Turbo.
Dunno why we're arguing here really. Oh yeah I know... RS concluding they have similar noise level, based on... VIDEOS :)

There is a reason why AMD postponed 7990 and finally gave up.
It's a pretty obvious one.

So you don't believe that the 690 would be appreciably louder if the fan was cranked up enough to lower the GPU temps to the same as the 7990?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Another review: HardwareLuxx. It's in German; Google Translate version for the German-impaired. ;)

No major scaling issues in this review. Even Skyrim scales well. Not so good for Crysis 2 though, but still improves over one card. Wins some and loses some to the 690.

Just for the sake of the noise discussion:
[Attached chart: mess6.jpg (noise)]


Temps are still appreciably lower:
[Attached chart: mess4.jpg (temperatures)]
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
As was posted here in the Fermi era, it's still the number of watts the card is pulling that determines the amount of heat dumped into your room.

We knew that building a dual Tahiti board would be difficult, but how difficult it is becomes clear on this page. In Furmark PowerColor's Devil 13 consumes 551 Watts of power! A new record. During typical gaming, power consumption is very high as well, hovering around the 300 W mark. NVIDIA's GTX 690 does much better here.

We see some higher power consumption in non-gaming states as well, which is surprising because ULPS should put the second GPU to sleep outside of gaming states.

The 7990 is a triple 8-pin card. As far as watts pulled during gaming, you almost need to see a graph over time, like an fps graph; the peak might not tell the whole story. Similar to how Furmark does not tell the whole story.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
As was posted here in the Fermi era, it's still the number of watts the card is pulling that determines the amount of heat dumped into your room.



The 7990 is a triple 8-pin card. As far as watts pulled during gaming, you almost need to see a graph over time, like an fps graph; the peak might not tell the whole story. Similar to how Furmark does not tell the whole story.

No arguing that it draws a ton of power. IMO that's why we haven't seen a reference version from AMD. Hopefully that can be addressed somewhat with the 8000 series.

I'm actually surprised it's "only" 300w in games. One good thing the incredible Furmark number shows... no throttling. Just puts the pedal down.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
@3DVagabond That noise chart you posted...

  • Notice new silent-PC champion 7970 GHz :eek:
  • 680 SLI similar to 6990 :eek:

How about raising standards a bit?
For instance, I refuse to post just any piece of garbage I can find on the internet, no matter how well it supports my claims.
While it's true that noise charts tend to be all over the place, the one you posted really takes the cake.

So you don't believe that the 690 would be appreciably louder if the fan was cranked up enough to lower the GPU temps to the same as the 7990?

Of course it would. But why go and shoot Nvidia just because the competition can't seem to make a decent cooler/fan profile? Would it be possible to change the Devil's profile to shave off some noise? No doubt.
 
Last edited:

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Triple-slot card. The wider the card, the larger you can make the heatsinks and fans, the quieter you can make the card overall. This is the same reason Asus has been favoring that design for their high-end DirectCU II cards.

Hmm, come to think of it, maybe TPU got it wrong because their sample was borked. A massive cooler / triple-fan combo should be able to get rid of the heat much more gracefully than recorded.

Update: PowerColor just told us that they are working on the mounting issue and halted all shipments of production boards to investigate. No retail cards should be affected.
http://www.techpowerup.com/reviews/Powercolor/HD_7990_Devil_13/32.html

So it seems it's PowerColor that got it wrong :rolleyes:
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
@3DVagabond That noise chart you posted...

  • Notice new silent-PC champion 7970 GHz :eek:
  • 680 SLI similar to 6990 :eek:

How about raising standards a bit?
For instance, I refuse to post just any piece of garbage I can find on the internet, no matter how well it supports my claims.
While it's true that noise charts tend to be all over the place, the one you posted really takes the cake.



Of course it would. But why go and shoot Nvidia just because the competition can't seem to make a decent cooler/fan profile? Would it be possible to change the Devil's profile to shave off some noise? No doubt.

I posted it because it's another review of the 7990 and this is a 7990 review thread. Sorry it doesn't make your point. I'm not shooting nVidia in any way. I've never once stated they've done anything wrong or need to change/fix/adjust anything.

It does show, though, that until there is some sort of standard measuring procedure, all posted noise figures are worthless when compared to each other. So people should refrain from dismissing a product simply because of one site's particular test procedure.

Nice to see that you've either finally understood, or admitted to my point that the difference in measured temps can be directly correlated to the difference in noise.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
@RS I confused 7970 GHz with Devil, that's why I gave it 2nd spot behind 6990 :)

np :)

f1sherman, just to make the point: I don't disagree that the 7990 Devil 13 is loud. It certainly is, which makes it an even harder sell. It may be possible to lower the noise levels at the cost of higher temperatures, but then there would be little room for overclocking, which is the 7970's forte. Still not seeing much point to this card for gamers, although it'll sell out to bitcoin miners.

Oh yeah I know... RS concluding they have similar noise level, based on... VIDEOS :)

Sorry, I never said that. GTX690 sounds quieter to me from those videos, but both cards are too loud given their price tags imo. What 3DVagabond is saying is that you can reduce the noise level of the 7990 by setting up a custom fan curve, which is a fair point. What I am saying is even if you do that, you are going to reduce the ability to OC the 7990, which makes it worse vs. 7970 GE Cross-fire setup. I suppose a card like this is for people with smaller cases where 2 cards won't fit. Then again it's triple slot, not exactly small.

@ notty22, yes this card consumes more power than the 690, which was expected. The 551W of power though is irrelevant since we don't play Furmark :) Around 300W of power is not that much more than a 690 in actual games.
 
Last edited:

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
We don't play Bitcoin either, and GOD knows we hear enough commentary about it. Especially the absolutely ridiculous "it makes cards free" b/s. You have stated miners are going to buy this 1000 dollar card to play bitcoin, at probably a 500 watt power pull. Might as well justify a new car purchase so you can deliver pizza, or have your kid do it, to put towards your car payments, making the car(d) free!
No sarcasm above :)
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
We don't play Bitcoin either, and GOD knows we hear enough commentary about it. Especially the absolutely ridiculous "it makes cards free" b/s. You have stated miners are going to buy this 1000 dollar card to play bitcoin, at probably a 500 watt power pull. Might as well justify a new car purchase so you can deliver pizza, or have your kid do it, to put towards your car payments, making the car(d) free!
No sarcasm above :)

lol, 500w. Try more real world usage, and less Furmark.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
We don't play Bitcoin either, and GOD knows we hear enough commentary about it. Especially the absolutely ridiculous "it makes cards free" b/s. You have stated miners are going to buy this 1000 dollar card to play bitcoin, at probably a 500 watt power pull. Might as well justify a new car purchase so you can deliver pizza, or have your kid do it, to put towards your car payments, making the car(d) free!
No sarcasm above :)

Still not understanding why you keep insisting that this card uses 500-550W of power. I am seeing a 30W difference between the 7990 and 690 in games.

Bitcoin makes actual $ after electricity costs, though; Furmark doesn't. Furmark isn't a game engine either, but an unrealistic heat virus. Not sure how using Furmark is relevant for measuring power consumption. We've been over this on our forum and most people here agree Furmark is not realistic. I know you don't believe that bitcoin is real and that it makes AMD cards free over time (assuming you pay cheap electricity rates), but it does. :cool: Also, because bitcoin doesn't need memory speed, you can get below 200W on a single 7970 @ 1150mhz running it, not some mythical 300W of power use.

You could have gotten a 7970 and bitcoin mined, converted those bitcoins to Amazon/Newegg gift cards, sold the 7970 and bought yourself a $600 GTX780 next generation. Instead of starting that 9 months ago, you ignored it, and now you'll have to pay $600 out of your pocket if you want that 780. Like I said, for gamers this card isn't great, but the crowd who will buy this card knows exactly why they are buying it.
 
Last edited:

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Still not understanding why you keep insisting that this card uses 500-550W of power. I am seeing a 30W difference between the 7990 and 690 in games.

Bitcoin makes actual $ after electricity costs, though; Furmark doesn't. Furmark isn't a game engine either, but an unrealistic heat virus. Not sure how using Furmark is relevant for measuring power consumption. We've been over this on our forum and most people here agree Furmark is not realistic. I know you don't believe that bitcoin is real and that it makes AMD cards free over time, but it does. :cool: Also, because bitcoin doesn't need memory speed, you can get below 200W on a single 7970 @ 1150mhz running it, not some mythical 300W of power use.

Exactly. My 7970 at 1200/685 uses about 165w, while slowly contributing to my beer fund. I don't understand why the Nvidia crowd loves to bash mining D:
 

Elfear

Diamond Member
May 30, 2004
7,169
829
126
We don't play Bitcoin either, and GOD knows we hear enough commentary about it. Especially the absolutely ridiculous "it makes cards free" b/s. You have stated miners are going to buy this 1000 dollar card to play bitcoin, at probably a 500 watt power pull. Might as well justify a new car purchase so you can deliver pizza, or have your kid do it, to put towards your car payments, making the car(d) free!
No sarcasm above :)

o_O

Many Bitcoin miners have made enough money to pay for the cost of their video cards (after the cost of electricity). It really isn't rocket science and I'm not sure where your post came from.

Also I doubt one 7990 is going to pull 500W by itself. The whole system might draw ~500W even with overclocking taken into account. My system with one overclocked 7970 pulls 280W total and with three overclocked cards it goes up to 755W.

If you're in the U.S., electricity costs ~$0.12/kWh. If the system draws 500W of power, that's only $1.44/day in electricity costs, but you make $5/day in bitcoins (0.5 BTC generated per day at a $10 exchange rate).
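Spelled out as a back-of-envelope calculation (the 500 W draw, $0.12/kWh rate, and 0.5 BTC/day at $10/BTC are the figures quoted in this post, not live numbers):

```python
# Back-of-envelope mining profitability, using the figures from the post.
system_watts = 500
price_per_kwh = 0.12          # USD, approximate US residential rate
btc_per_day = 0.5
usd_per_btc = 10.0

daily_kwh = system_watts / 1000 * 24          # 12 kWh per day
daily_cost = daily_kwh * price_per_kwh        # $1.44 in electricity
daily_revenue = btc_per_day * usd_per_btc     # $5.00 in coins
daily_profit = daily_revenue - daily_cost     # $3.56 net per day
```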

Do you think miners are lying about their profitability?

Edit - Just dropped my memory clocks from 685Mhz to 170Mhz and total power draw dropped to ~700W. Hash rate only dropped 30MH/s.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It's pretty amazing how, despite everything that's been discussed regarding HD7950/7970 power consumption, people continue to use reference-design HD7950/7970 cards as a measure of the power consumption of the after-market 7950/7970 cards that enthusiasts actually buy. Gamers aren't carelessly pumping 1.25V into their after-market 7950/7970 cards to get 925/1050mhz GPU clocks.

The "HD7970 1050mhz = as bad as GTX480" power myth continues. The after-market 7970 1050mhz cards use less power than a 580.
[Attached chart: power_peak.gif]
 

lopri

Elite Member
Jul 27, 2002
13,329
709
126
This is more of a "technology showcase" like the ASUS Mars, I take it? 10 cards available worldwide, and each card comes with a gold-plated certificate of ownership.
 

Jeff007245

Member
Aug 31, 2007
125
1
81
Been checking e-tailers for availability of this card. What's taking so long... why isn't it available for purchase yet?