videocardz: First AMD Radeon R9 290X 1080p performance review


3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The things that concern me are the GPU temps and power draw, at least compared to the 780 and Titan. It ran 12°C hotter than Titan and ~160MHz higher.

The Titan Ultra should have no issues beating it based on that.

You aren't seriously talking Furmark usage? Let me show you something.

[Image: power_average.gif]

This is avg power usage in Crysis 2

[Image: power_peak.gif]

This is peak power usage in Crysis 2 (the highest single reading at any one point)

[Image: power_maximum.gif]

This is Furmark: 43% higher than the peak power usage during any part of a well-optimized game (i.e. one likely to run at 90%+ GPU usage). Anyone pointing at the 290X's power usage in Furmark as representative of real-world power usage either doesn't understand how the benchmark is handled by AMD or is being disingenuous.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
We'll see real figures soon enough when AT, H, TPU, Hardware.fr and Computerbase.de do proper reviews.

The reference cooler is just sad for overclocking though. That means we'll have to fork out extra for aftermarket or water cooling, etc.

The R9 290 (non-X) result is interesting: very close in performance, and if they let AIBs do custom cooling, it should be the card to get in terms of perf/$.

Considering that AMD put basically the same cooler on it as on the 7970, I don't think there's going to be any lockout on custom coolers, as has been suggested. I believe there just aren't enough chips at the moment to supply the reference designs plus extras for custom designs.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Last time I checked, Crysis, which I mentioned as well, was a game. Absolute consumption numbers produced by Furmark are of little real-world use. However, what are the odds that a 290X is 18% worse than a Titan in Furmark, and then uses less power than, or even roughly the same power as, a Titan in games? Pretty darn slim.

Actually, if you look at the charts I posted above, you'll see that if it follows a similar pattern to Tahiti, the odds are actually pretty good. The 780 uses slightly more power than the 280X in games and way less in Furmark.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
This is Furmark: 43% higher than the peak power usage during any part of a well-optimized game (i.e. one likely to run at 90%+ GPU usage). Anyone pointing at the 290X's power usage in Furmark as representative of real-world power usage either doesn't understand how the benchmark is handled by AMD or is being disingenuous.

Here we go again. Same song, second verse. You are looking at factory OVERCLOCKED results, which have no relevance to reference performance. That's like comparing the fuel mileage of two cars capable of 200 MPH running at top speed, one electronically limited to 150 MPH and one with no limiter, and then concluding the test was bogus because one car got far better gas mileage, while ignoring that it was moving 50 MPH slower. It's not the test that is broken, it's your understanding of the test that is broken.

Any benchmark is worthless if you don't understand what it is testing and how different settings affect the results.

Anandtech compared Metro: LL to Furmark and found the reference 280X used 24% more power running Furmark, while a highly OC'd Sapphire model with a higher PowerTune limit and increased voltage used 38% more. When you remove or raise limits, worst-case scenario numbers will sometimes get very skewed.

Actually, if you look at the charts I posted above, you'll see that if it follows a similar pattern to Tahiti, the odds are actually pretty good. The 780 uses slightly more power than the 280X in games and way less in Furmark.


Not according to Anand, again using a reference 280X: a 780 used 38 more watts in Metro and 5 more watts in Furmark. The 33W difference is pretty irrelevant.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Here we go again. Same song, second verse. You are looking at factory OVERCLOCKED results, which have no relevance to reference performance. That's like comparing the fuel mileage of two cars capable of 200 MPH running at top speed, one electronically limited to 150 MPH and one with no limiter, and then concluding the test was bogus because one car got far better gas mileage, while ignoring that it was moving 50 MPH slower. It's not the test that is broken, it's your understanding of the test that is broken.

Any benchmark is worthless if you don't understand what it is testing and how different settings affect the results.

Anandtech compared Metro: LL to Furmark and found the reference 280X used 24% more power running Furmark, while a highly OC'd Sapphire model with a higher PowerTune limit and increased voltage used 38% more. When you remove or raise limits, worst-case scenario numbers will sometimes get very skewed.

Not according to Anand, again using a reference 280X: a 780 used 38 more watts in Metro and 5 more watts in Furmark. The 33W difference is pretty irrelevant.

Mate, you are all over the place. I can't tell if you are agreeing or disagreeing with me. Looking at the bolded section, you agree with me?
 

dacostafilipe

Senior member
Oct 10, 2013
810
315
136
Here we go again. Same song, second verse. You are looking at factory OVERCLOCKED results, which have no relevance to reference performance. That's like comparing the fuel mileage of two cars capable of 200 MPH running at top speed, one electronically limited to 150 MPH and one with no limiter, and then concluding the test was bogus because one car got far better gas mileage, while ignoring that it was moving 50 MPH slower. It's not the test that is broken, it's your understanding of the test that is broken.

Any benchmark is worthless if you don't understand what it is testing and how different settings affect the results.

Anandtech compared Metro: LL to Furmark and found the reference 280X used 24% more power running Furmark, while a highly OC'd Sapphire model with a higher PowerTune limit and increased voltage used 38% more. When you remove or raise limits, worst-case scenario numbers will sometimes get very skewed.

Not according to Anand, again using a reference 280X: a 780 used 38 more watts in Metro and 5 more watts in Furmark. The 33W difference is pretty irrelevant.

But, but ... you're saying the exact same thing he is ... or did I miss something? o_O
 

ICDP

Senior member
Nov 15, 2012
707
0
0
Furmark is not even remotely indicative of actual thermal results from gaming. That test/review was from an R9 290X set at 40% fan speed. Is anyone going to suggest 40% fan speed in a Furmark test gives an accurate indication of the thermal cooling solution on a GPU?

Here is a review of an Asus DCII R9 280X hitting 90°C in Furmark at only 40% (the lowest) fan speed, at which point the reviewer stopped the test. 50% fan speed resulted in a much lower temperature of 71°C.
http://www.ocaholic.ch/modules/smartsection/item.php?itemid=1110&page=19

Nvidia cards are no better under those circumstances. Here is a screenshot of my GTX780 with a fan speed of 40% hitting 95°C after 2 minutes 23 seconds. This is an MSI TF Gaming GTX780 that normally runs at a 1045MHz boost clock; note how throttling has dropped it to a 914MHz core clock. The temperature had reached 95°C and was still rising when I stopped the test before my GPU caught fire, as any sane person would.



I suppose I should tell people to avoid one of the better custom-cooled GTX780s because it gets to 95°C after only two minutes of "gaming". Anyone who uses Furmark to measure the thermal performance of a high-end GPU at 40% fan speed had better be ready with a fire extinguisher. o_O
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Quoting Sushi:
290X almost never would sit at base state (barring terrible throttling or something). The last clocks I saw were ~960-980MHz on warmup and ~920MHz once at thermal throttle. Give it some extra fan speed (default is 34%) and it should always stay at 960-980. Give it some extra power budget and it should sit at 1000-1070 (the max boost bin; it might not stay there even with a giant power budget, though, since stock voltages are very high for the top DPM state to accommodate bad-yield chips).

Remember that DPM states don't behave like a base/boost clock: it smoothly transitions between states very rapidly and can scale anywhere between 300 and 1070MHz, so it's unlikely it would decide on 800MHz.

Edit: to the above, Hawaii uses DPM states which have different voltages and clock speeds for each bin, and it decides which bin to use based on die activity, measured temperature, predicted temperature, current on the rails, predicted current, VDDCI/memory power consumption, etc., to hit exactly the max spec power usage (208W by default).
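
A rough way to picture that bin selection (a minimal Python sketch; the bin table, power model and limits are invented for illustration, not Hawaii's real tables or algorithm):

Code:
# Hypothetical DPM-style bin selection. All values are made-up
# illustrative numbers, not Hawaii's real tables.
DPM_BINS = [(300, 0.85), (500, 0.95), (700, 1.00),
            (920, 1.10), (1000, 1.18), (1070, 1.25)]  # (MHz, volts)
POWER_LIMIT_W = 208  # default max spec power mentioned above
TEMP_LIMIT_C = 95

def predict_power(activity, mhz, volts):
    # Dynamic power scales roughly with activity * frequency * V^2,
    # plus a static floor; the constants here are arbitrary.
    return 40 + activity * mhz * volts ** 2 * 0.13

def pick_bin(activity, temp_c):
    # Take the fastest bin whose predicted power fits the budget.
    if temp_c >= TEMP_LIMIT_C:
        return DPM_BINS[0][0]  # over the temp limit: drop to base state
    for mhz, volts in reversed(DPM_BINS):
        if predict_power(activity, mhz, volts) <= POWER_LIMIT_W:
            return mhz
    return DPM_BINS[0][0]

print(pick_bin(activity=0.90, temp_c=80))  # heavy but cool -> 1000
print(pick_bin(activity=0.60, temp_c=80))  # lighter load -> 1070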
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
And how does Hawaii react to a power virus? We have no idea as of yet. But even if we assume it will react exactly like a reference 280X, subtracting 10% from the Furmark consumption would still leave it only 8% more power hungry than a Titan in gaming. Which in the whole scheme of things is pretty small. Certainly nothing like the 31%+ difference between the 5870 and 480 that people are trying to make a comparison to.

The GTX 480 used way more than 31% extra power in games vs. the 5870. It only becomes 31-32% once you add the rest of the system components. In reality, the 480 used 50-90% more power than the 5870 (depending on the game) for an 18-20% increase in performance. I can't even believe people are comparing 480 vs. 5870 and R9 290X vs. Titan in the same sentence in regard to power usage.

Two 5850s / a 5970 used less power than a single 480.

[Image: gtx400_power.png]


Again, 5970 used less power than a single 480 in games.

[Image: power-load.gif]


At TPU, in Crysis 2 (a game, not Furmark), the 480 used almost double the power of a 5870. That's why people really zeroed in on the power usage of the 480 vs. the 5870.

[Image: power_peak.gif]


Hypothetical scenario: if the R9 290X uses 255W of power in games, the 780 uses 222W and the Titan uses 238W, are people really going to complain about a 30W or so difference on a system that is likely to have an overclocked i5/i7 CPU and a premium mobo that itself uses 30-40W more power than lower-end boards?

Would someone take 8% more performance for 30W more power? What about a lower price? Etc. If the 780 used 180W and the R9 290X used 260W, then sure, that would be something to talk about.

It's pretty amazing that when people run out of things to talk about, they zoom in on a 30-40W power usage difference. Let's calculate the number of years it will take for the 780/Titan to break even if the R9 290X undercuts them both. :thumbsup:
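
Here's one version of that math as a sketch; the price gap, wattage delta, hours and electricity rate are all assumed numbers, purely for illustration:

Code:
# Back-of-the-envelope break-even: how long a card that is $100 cheaper
# but draws 35W more while gaming takes to lose its price advantage in
# electricity. Every input below is an assumption.
price_gap_usd = 100.0   # assumed price difference
extra_watts = 35.0      # assumed extra draw while gaming
hours_per_day = 3.0     # assumed gaming time
usd_per_kwh = 0.12      # assumed electricity rate

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * usd_per_kwh
print(f"Extra cost: ${extra_cost_per_year:.2f}/year")                 # ~$4.60
print(f"Break-even: {price_gap_usd / extra_cost_per_year:.0f} years") # ~22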

Better yet, the R9 290X's little brother, the 290 OC, may very well deliver 90% of the performance of Titan OC for $500. That's what a lot of people are waiting to see.

Furmark is not even remotely indicative of actual thermal results from gaming. That test/review was from an R9 290X set at 40% fan speed. Is anyone going to suggest 40% fan speed in a Furmark test gives an accurate indication of the thermal cooling solution on a GPU?

Excellent post. After NV and AMD both explained that Furmark is not representative of any GPU's real-world power usage, after all the 4870 and 280/285 cards destroyed by Furmark testing, and after full disclosure that NV/AMD built in special hardware/software GPU throttling, it still boggles my mind why people take Furmark seriously. Both Furmark and 3DMark need to die, but it seems people have a hard time letting go of what they are used to: the past status quo.

Even when I run Furmark on my 7970s and open MSI Afterburner, the GPU is not pegged at 99%, yet my power consumption is higher than in any game. Why is that? Because Furmark loads the VRMs and other parts of the graphics card that games don't load to this extent. That is why it's called a power virus.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Well yea, talking about power consumption without performance numbers is irrelevant.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
The whole power argument is old, stale and irrelevant when you are discussing the best cards on the market. If it runs within the power threshold that can be supplied to it and, most importantly, does not sound like a hair dryer under load, then no one who buys these cards is going to lose sleep over another 20W or 30W.

This is fortunately not Intel putting out crappy 5% performance increases every generation and singing from the rooftops about how great their power consumption is. These are huge high-end GPUs that cost in excess of $500.

It's the same tired talking point you see the brand-loyal go to when their brand of choice is not winning benchmarks. 'The card is faster!' 'Well, it uses more power!' 'I just bought one for $600 and it's smoking fast, I don't care that it uses an extra nickel a month in electricity' :rolleyes:
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
The whole power argument is old, stale and irrelevant when you are discussing the best cards on the market. If it runs within the power threshold that can be supplied to it and, most importantly, does not sound like a hair dryer under load, then no one who buys these cards is going to lose sleep over another 20W or 30W.

This is fortunately not Intel putting out crappy 5% performance increases every generation and singing from the rooftops about how great their power consumption is. These are huge high-end GPUs that cost in excess of $500.

It's the same tired talking point you see the brand-loyal go to when their brand of choice is not winning benchmarks. 'The card is faster!' 'Well, it uses more power!' 'I just bought one for $600 and it's smoking fast, I don't care that it uses an extra nickel a month in electricity' :rolleyes:

Indeed. Wake me when there is a 50% difference or something drastic enough to notice.
 
May 13, 2009
12,333
612
126
I do care about power consumption. I run my PC in a cabinet and it can get toasty in there. I'd much rather run something a little less powerful with great power consumption vs. a hair dryer that sucks watts.
I also am running a used 7850 I paid $100 for, so obviously I'm not their target demographic.
 

wb182

Senior member
Nov 15, 2004
281
0
76
I care. I'm building an SFF system that can only fit a 450W PSU, so even an extra 30W from the 290X might knock it out of contention for me.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
I do care about power consumption. I run my PC in a cabinet and it can get toasty in there. I'd much rather run something a little less powerful with great power consumption vs. a hair dryer that sucks watts.
I also am running a used 7850 I paid $100 for, so obviously I'm not their target demographic.

Everyone should care about power consumption, but things need to be put in perspective.

It also depends on where you are and the season.

Currently here (UK), people are bringing their laptops and their 3DSes from the living room to sit in the PC room, which is warm and cozy.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Power consumption may not be at the top of the importance totem pole, but it is on it for me. Efficiency may matter to some for multi-GPU as well. To disregard or downplay it is odd, given that virtually every review investigates it. Personally, I'd let the market ultimately decide instead of vocal posters!
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It's the same tired talking point you see the brand-loyal go to when their brand of choice is not winning benchmarks. 'The card is faster!' 'Well, it uses more power!' 'I just bought one for $600 and it's smoking fast, I don't care that it uses an extra nickel a month in electricity' :rolleyes:

What isn't mentioned is that some people buy 2-3 of these cards. Even if the NV card were to use 30W less power, most of the time these cards are sitting idle. AMD's ZeroCore will make up that 30W in idle on the other 2 cards during the remaining hours of the day when a person is not gaming. It's a wash.
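
Rough illustrative math on that wash (every wattage and hour figure below is an assumption, not a measurement):

Code:
# Tri-GPU day: assumed extra gaming draw of the AMD cards vs. assumed
# ZeroCore savings on the two idle secondary cards. Numbers are invented
# for illustration only.
extra_gaming_w = 30.0 * 3      # assumed +30W per card while gaming, 3 cards
gaming_hours = 6.0
zerocore_savings_w = 12.0 * 2  # assumed W saved per idle secondary card, 2 cards
idle_hours = 18.0

spent_wh = extra_gaming_w * gaming_hours    # 540 Wh/day extra while gaming
saved_wh = zerocore_savings_w * idle_hours  # 432 Wh/day saved while idle
print(f"Extra while gaming: {spent_wh:.0f} Wh/day")
print(f"Saved while idle:   {saved_wh:.0f} Wh/day")  # roughly a wash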

What we want to see are Crysis 3, Far Cry 3, Metro Last Light, Tomb Raider and BF4 benchmarks of 780 max overclocked vs. R9 290X max overclocked.

I care. I'm building an SFF system that can only fit a 450W PSU, so even an extra 30W from the 290X might knock it out of contention for me.

That's not logical. If 30W of extra power is the difference between running your PSU at the limit and destroying it, you shouldn't be putting either the 780 or the R9 290X into it.

But let's investigate this anyway. A system with an i7 3770K @ 4.8GHz with 3 different 780s:

[Images: three system power draw charts]


Based on this, if your PSU is only 450W, it had better be rated to operate at 50°C at 450W, or it becomes very risky to run an OC'd i7 + OC'd 780. Looks like your best bet is to overclock your CPU without any voltage increases so that you can leave as much room as possible for 780/R9 290X overclocking.
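
One caveat when reading wall-socket charts like those: the DC load on the PSU is lower than the wall figure because of conversion losses. A quick sketch, assuming an illustrative 87% efficiency:

Code:
# Wall readings include PSU losses; the PSU's actual DC output is lower.
# The 87% efficiency figure is an assumption (roughly 80 Plus Bronze).
wall_watts = 450.0
efficiency = 0.87
dc_load = wall_watts * efficiency
print(f"DC load on the PSU: {dc_load:.0f} W")  # ~392 W for a 450 W wall draw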

Power consumption may not be at the top of the importance totem pole, but it is on it for me. Efficiency may matter to some for multi-GPU as well. To disregard or downplay it is odd, given that virtually every review investigates it. Personally, I'd let the market ultimately decide instead of vocal posters!

Some of the comments you make read almost as if they came out of a PR handbook. They seem on topic but don't address anything in detail and ignore everything that was discussed prior to your comment.

Multi-GPU efficiency has already been addressed. NV doesn't have ZeroCore Power, which means that in the ~18 hours a day you are sleeping + working, the AMD GPUs will save a ton of power against the ~90W more that three R9 290Xs may use over three 780s during gaming. If you are going to talk about using more power during games, why aren't you discussing using less power when 3-4 of these GPUs are idle?

More so, you keep making blanket statements about power consumption, but the discussion is specifically about 30-40W power consumption differences between flagship products, NOT about whether power consumption is irrelevant as a whole. It has already been mentioned by various posters that if the power usage difference were 50% / 100W or something significant, it would matter. No one is arguing power consumption itself doesn't matter. You aren't looking at the context of what's being discussed.

Context: a person running an i7 3770K @ 4.8GHz is a PC enthusiast. The type of user who spends $600-700 on a GPU like an MSI Lightning / EVGA Classified / Asus DCUII will most likely overclock that card too. Why else would they pay a premium? Now looking at the benchmarks, overclocking the 780 alone without voltage control takes you into the 400W range of system power usage. Overclocking the 780 with voltage control takes the system into the 500W range.

You think a 30-40W difference matters at this point, when the total system is drawing 400-500W, vs. a card that may be 8-10% faster and/or cost less? You are not being realistic.

Someone else who is at or near the limit of their PSU will not be overclocking the CPU or the GPU. In that case any of the 780/R9 290X or Titan will work on a small form factor system with a 450W PSU.

And with cases such as the SUGO 07 coming with a 600W PSU, someone building an SFF system now has a solution as well:
http://www.newegg.com/Product/Produc...82E16811163212

If the R9 290 OC can compete with the 780 OC but undercut it by $100-150, please tell us again how the extra power consumption will matter. What's the break-even point in years on that extra power consumption?
 

maddie

Diamond Member
Jul 18, 2010
5,203
5,612
136
Power consumption may not be at the top of the importance totem pole, but it is on it for me. Efficiency may matter to some for multi-GPU as well. To disregard or downplay it is odd, given that virtually every review investigates it. Personally, I'd let the market ultimately decide instead of vocal posters!
I have not seen anyone saying to disregard power consumption or temps, only that we should put them in perspective and not start screaming about minor variations, as some here are accustomed to doing once it supports their vendor.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Indeed. Wake me when there is a 50% difference or something drastic enough to notice.

This is the crux of it. If the difference is big enough that you're looking at a situation like we saw with the GTX 480 compared to the 5870, an obscenely hot and loud card that made it painful to use in your computer, then we have something. When it's forumites niggling over 20-30W because their 'team' is not winning benchmarks, it's just a whine point to try and detract from the important metric of these cards, which is performance.

First I was hearing that AMD doesn't have the money and resources to make a GPU to outdo GK110, then it was that the new cards are all going to be rebrands, then that the new flagship is a dual-GPU Pitcairn...

Now the reality is here: they managed to make a somewhat big die, still smaller than GK110, that is faster than the competition's main gaming flagship and as fast as their ultra-niche, obscenely priced card. So we will see the same brand loyalists with inconsistent opinions on what is important in a card, switching with whatever way the green wind is blowing.

You can go back to the Fermi days and see the exact opposite being said about the relevancy of power, or to Titan to see flip-flops on price/performance, or to the GTX 680 on how halved flagship performance improvements are suddenly okay, etc. The same blowhard nonsense :rolleyes:
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
And the market did react strongly against the first iterations of Fermi: AMD, with its balanced architecture and time-to-market, actually did retake overall discrete share despite nVidia's strong brand!