VideoCardz: First AMD Radeon R9 290X 1080p performance review


iiiankiii

Senior member
Apr 4, 2008
759
47
91
This is the crux of it. If the difference is big enough that you're looking at a situation like we saw with the GTX 480 versus the 5870, an obscenely hot and loud card that was painful to have in your computer, then we have something. When it's forumites niggling over 20-30W because their 'team' isn't winning benchmarks, it's just a whine point meant to detract from the metric that actually matters with these cards: performance.

First I was hearing AMD doesn't have the money and resources to make a GPU to outdo GK110, then it was that the new cards were all going to be rebrands, then that the new flagship was a dual-GPU Pitcairn....

Now the reality is here: they managed to make a somewhat big die, still smaller than GK110, that is faster than the competition's main gaming flagship and as fast as their ultra-niche, obscenely priced card. So we will see the same brand loyalists with inconsistent opinions on what is important in a card, switching with whichever way the green wind is blowing.

You can go back to the Fermi days and see the exact opposite being said about the relevance of power, or to Titan to see flip-flops on price/performance, or to the GTX 680 for how it's okay that flagship performance improvements were halved, etc. The same blow-hard nonsense :rolleyes:

+1
That pretty much sums it up. Fanboys sure make these forums a lot more lively.

Power consumption isn't too important with these tier 1 cards, unless, of course, it's outrageous. To be honest, this round of GPUs isn't that great from either side. A node shrink can't come soon enough. I want to see performance double, not go up 30%.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Will! You're back just in time. ;)

Good to see you again 3D :cool:

As for the timing....:sneaky:


I'm actually pretty impressed with the performance increase the new Hawaii chip seems to have over GCN 1 Tahiti.
Considering it's the same process node, it seems they have come up with a pretty good jump in horsepower, yet not had to build a very big die like Kepler/Titan.
Really looking forward to retail price and performance numbers for the R9 290/X :thumbsup:
 

AdamK47

Lifer
Oct 9, 1999
15,845
3,637
136
It's pointless to talk about FurMark and power; both companies throttle in it.

Apparently not as much on the 290X. GK110 will start throttling at around 80C on default settings. The graph shows the 290X reaching 94C, which indicates it sustained higher clocks for longer at increased temperatures. I would like to see the temperature throttle limit of the GK110 cards raised to its maximum and the test run again. I'm sure both cards would score higher in FurMark.
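Roughly, the throttle behavior being described works like this (a toy sketch; the clocks and the per-degree step are made-up numbers, and the real GPU Boost algorithm is proprietary and far more complex):

```python
# Toy model of temperature-based boost throttling.
# base_mhz, boost_mhz and the 13 MHz/degree step are illustrative
# assumptions, not real GK110 or Hawaii firmware values.

def effective_clock(temp_c, base_mhz=900, boost_mhz=1000, throttle_temp=80):
    """Full boost below the throttle point, then clock bins are shed
    per degree over it, floored at the base clock."""
    if temp_c <= throttle_temp:
        return boost_mhz
    return max(base_mhz, boost_mhz - 13 * (temp_c - throttle_temp))

# A card that throttles at 80C sheds clocks as soon as it warms up;
# one allowed to run to 94C holds full boost at the same temperature.
print(effective_clock(94))                    # 900 (throttled to base)
print(effective_clock(94, throttle_temp=95))  # 1000 (still at full boost)
```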
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
AMD cards don't have a built-in FurMark limiter at all, aside from the PowerTune slider. If you set it to +20%, AMD cards will run wide open in FurMark, drawing a metric crap-ton of power and essentially ignoring TDP, way more than a gaming load would use. That's from rough recollection of the 7970s I used way back when.
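As a rough sketch of what that slider does (the 250W board TDP here is a hypothetical number, not AMD's actual firmware value):

```python
# Sketch of how a PowerTune-style slider scales the board power cap.

def power_cap_watts(board_tdp_w, slider_pct):
    """Effective board power limit after applying the PowerTune slider."""
    return board_tdp_w * (1.0 + slider_pct / 100.0)

# At +20%, a hypothetical 250 W card may draw up to 300 W, which is
# why a power virus like FurMark can pull far past a typical game load.
print(power_cap_watts(250, +20))  # 300.0
print(power_cap_watts(250, -20))  # 200.0
```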

I wouldn't draw conclusions about power consumption from a FurMark result just yet. Remember, the early ChipHell leaks showed better-than-Titan power consumption, while the rest of the benchmark results were in line with what we've seen from EXPreview. AMD and Nvidia cards do not treat FurMark equally, especially when PowerTune is manipulated; I remember the 7970s were pretty ridiculous in FurMark at +20% PowerTune. That said, power consumption could be an issue, but when sites test power use in games, that will be a much better basis for a conclusion.
 

Rvenger

Elite Member
Super Moderator
Video Cards
Apr 6, 2004
6,283
5
81
FurMark is notoriously dangerous on AMD GPUs since they don't throttle anywhere near as aggressively as Nvidia GPUs.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
And the market reacted strongly against the first iterations of Fermi with AMD's balanced architecture and time-to-market actually did retake over-all discrete share despite nVidia's strong brand!

Do you get paid for throwing in a bunch of fancy language? Do you know the meaning of the phrase "significant/material differences"?

Crysis 2
HD5870 used 143W of power
GTX480 used 272W of power

The GTX480 used 129W more power, or 90% more, than the 5870, after being 6 months late.

The market reacted because the GTX480 used nearly double the power in some games and cost $499 when the 5870 cost $369, 35% more.
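Spelled out, using only the numbers quoted above:

```python
# Checking the power and price deltas quoted above.
gtx480_w, hd5870_w = 272, 143       # Crysis 2 power draw, watts
gtx480_usd, hd5870_usd = 499, 369   # launch prices

print(gtx480_w - hd5870_w)                         # 129 W more
print(round((gtx480_w / hd5870_w - 1) * 100))      # ~90% more power
print(round((gtx480_usd / hd5870_usd - 1) * 100))  # ~35% more expensive
```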

How is this at all related to R9 290X vs. 780?

Will R9 290X cost 35% more?
Will R9 290X use 90% more power than GTX780?

The context is R9 290X undercutting 780 with a possible 30-40W power consumption difference, not 129W, not 90%.

Do you work in PR or in politics? You never answer anything directly or address in detail what's being discussed. Your responses never engage the author's main points. It's almost as if a computer automatically generates replies around a topic.

You know what else is interesting: I've been reading these forums for a long time, and I don't remember you making any fuss at all about how the GTX275/280/285 used a lot more power than the HD4890, despite two of those cards failing to outperform it.

Look at that GTX275. For crying out loud, the GTX260 216 barely used less power than the HD4890 and got trashed by it.

[Image: peak power consumption chart]
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Christ, stop talking about the GTX 480. Is this 2010 again?

The bottom line is that FurMark is not a valid power consumption test, because AMD cards do not treat FurMark the same way as NV cards when PowerTune is manipulated. The reasonable analysis will happen when the 290X is tested in actual games for power consumption. As I mentioned, the earlier ChipHell leaks indicated slightly better-than-Titan power consumption in games. So, we'll see.

End of story. Stop talking about the GTX 480. Nobody cares; it's old and tired.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Crysis 2
HD5870 used 143W of power
GTX480 used 272W of power


lol as if every 480 was such a hog. There are plenty of resources besides release review samples that show a clear downward trend in consumption as Nvidia moved further into the node.

If you need a clear example, my 850-900MHz 470s on reference air should be more than enough.

Also, in Crysis 2, 143W was unplayable, whereas 272W peak was playable. It wouldn't have mattered if it were 20W versus 500W if only one of them could achieve respectable performance at high settings.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
lol as if every 480 was such a hog. There are plenty of resources besides release review samples that show a clear downward trend in consumption as Nvidia moved further into the node.

I bet you also looked up aftermarket 7970 GHz power consumption, rather than bashing AMD over early review samples that were never available on the consumer market.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Christ, stop talking about the GTX 480. Is this 2010 again? End of story. Stop talking about the GTX 480. Nobody cares; it's old and tired.

Sorry, I really have no interest in discussing the 480. My point is the hypocrisy of picking and choosing when power consumption matters and when it doesn't, which seems to swing with the green side. It sure didn't matter during the GTX275/280/285/470/480 days, but now, if the R9 290X uses 30-40W more than the 780, it's relevant. :rolleyes:

Say we have an i7-3770K @ 4.8GHz + 780 using ~350W vs. an i7-3770K @ 4.8GHz + R9 290X using 390W. What if the R9 290X is 7-8% faster and costs $50-75 less?
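Worked through at the wall (all figures are the hypothetical ones above, not measurements), the gap is small:

```python
# Whole-system comparison using only the hypothetical figures above.
w_780, w_290x = 350, 390   # system power draw with each card, watts
rel_perf = 1.075           # assumed: R9 290X 7-8% faster

print(round((w_290x / w_780 - 1) * 100))    # ~11% more system power
print(round(rel_perf * w_780 / w_290x, 3))  # ~0.965x perf/W at the wall
```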

I bet you looked for aftermarket 7970 ghz power consumption and not bash amd because of early review samples that were never available in consumer market.

Pretty much, but Balla ignored all the after-market 7970s, their overclocking at specific voltage levels, their noise levels, and their temperatures. For 18 months straight he only linked reference 7970GE cards.

Anyway, with the GTX770 4GB selling for $440-500, let's hope the R9 290 ends up at $499 and finally forces NV to drop prices on 770 cards by about $140-175. If AMD releases the R9 290X at $549-579 and it beats 780s, that would be awesome too.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
That wasn't in Crysis 2 though!

Your hypothetical can only lead to one conclusion: you waited 6 months for 7.5% more performance for $50 less.

Likewise, the conclusion should be the same one you've drawn in the past: the 290X would add nothing. It doesn't greatly increase price/perf, it doesn't add a new tier of performance, it's late, and it will have little to no impact on the market.

Theoretically, of course.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
That wasn't in Crysis 2 though!

Your hypothetical can only lead to one conclusion: you waited 6 months for 7.5% more performance for $50 less.

Likewise, the conclusion should be the same one you've drawn in the past: the 290X would add nothing. It doesn't greatly increase price/perf, it doesn't add a new tier of performance, it's late, and it will have little to no impact on the market.

Theoretically, of course.

The highest-end card never really has any impact. The R9 290 will, though, just like the 7950 did, just like the 6950 did. Those cards definitely had impact. :wub:
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
For AMD that is unlikely if it's priced at $500.

The 7950 never had much of an impact until it was priced well below the 670, at around $300.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
I don't care about power consumption; I still think the GTX480 was a great card regardless of its shortcomings. Performance takes the leading role in my book. Now, if we're talking HTPC stuff, then the picture changes.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
lol, as if Nvidia pays me for my devotion. I can see pros and cons for each. I can also see the AMD crowd cringing at the price of the 780/Titan, and I find it highly unlikely they'll cave for AMD's next 9590.

Fans aside of course.

I went for the 7950 over the 670 because the Nvidia premium was $70 and it didn't include 3 games. Plus, I've been jaded by Nvidia this entire generation, with a $500 mid-range part and a year-late GK110 with a huge price tag.

My devotion only goes so far.


As far as power consumption goes, I never cared when I was on water, but I care now. It's one of the biggest reasons I don't use my second card often (that, and it won't work right unless you disable ULPS 90% of the time). Power is something I like about my 7950: even at the reviewed 1.25V it does well, and it does really well when you undervolt it.
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
All the benches are worthless to me. How many people here who are shopping in the US$650+ range are rocking single 1080p monitors? At worst they have 120Hz 1080p monitors.

When we see what this thing can do at 1440p, 1600p, and 5760x1080, then we can come to a conclusion about how much value it offers.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Do you get paid for throwing in a bunch of fancy language?

Ultimately the market decides! My constructive nit-pick with the first iterations of Fermi was performance per watt, and it's good to see Kepler improve on this.

Thanks for the question, but I have the ability to think for myself and know what is important to me based on my subjective tastes, tolerances, and thresholds, and I personally allow the market to decide who wins over-all! I'm not smart enough to think for the marketplace as a whole!
 

InfoTiger

Golden Member
Sep 10, 2004
1,186
2
91
All the benches are worthless to me. How many people here who are shopping in the US$650+ range are rocking single 1080p monitors? At worst they have 120Hz 1080p monitors.

When we see what this thing can do at 1440p, 1600p, and 5760x1080, then we can come to a conclusion about how much value it offers.


Splinter Cell 6 @ 2560x1600

[Image: R9 290X 1600p benchmark chart]
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Your hypothetical can only lead to one conclusion: you waited 6 months for 7.5% more performance for $50 less.

I didn't, because I have no intention of buying the R9 290/X. I am not upgrading until I get a 75-100% increase at minimum. For anyone who bought a 780, the R9 290X is not worth waiting for. The R9 290/X is for people who are buying a new GPU for, say, BF4 or holiday-season gifts.

Likewise, the conclusion should be the same one you've drawn in the past: the 290X would add nothing. It doesn't greatly increase price/perf, it doesn't add a new tier of performance, it's late, and it will have little to no impact on the market.

I am not singing the praises of the R9 290X. I am looking forward more to the R9 290, if it can overclock well and come in at $499. Then that card will change the landscape.

I have already stated repeatedly on these forums that, to me, the R9 290X was not worth waiting for if you were looking to drop $600+ on a GPU. If I were in the market for a GPU, I would have bought a 780 a while back, but there are no next-gen PC games that warrant such an expense yet, imo.

Anyway, even if AMD doesn't beat the GTX780 by a significant amount, which I doubt it will since the 780 can overclock 20-30% on air, it at least forces NV to keep innovating and/or dropping prices. Being uncontested on the high end for so long has allowed NV to charge astronomical prices. Although, now that AMD has launched the R9 280X and NV has done nothing to drop prices on 760/770 cards, it seems NV may not react at all to the R9 290's launch.

Ultimately the market decides! My constructive nit-pick with the first iterations of Fermi was performance per watt, and it's good to see Kepler improve on this.

OK, but that has nothing to do with R9 290X vs. 780, since the performance/watt and power consumption differences will be nothing like they were between the 470/480 and 5850/5870. So what point are you making, exactly? That Kepler is more efficient than Fermi? What does that have to do with the R9 290X?

Thanks for the question, but I have the ability to think for myself and know what is important to me based on my subjective tastes, tolerances, and thresholds, and I personally allow the market to decide who wins over-all! I'm not smart enough to think for the marketplace as a whole!

So are you saying that a 30-40W power consumption difference is a deal breaker for you when spending $600+ on a GPU, and that this overrides the performance differences between the two GPUs?

What are you saying, exactly? No one here knows. No one here claimed that power consumption or performance/watt do not matter. We are talking about a hypothetical 30-40W power consumption difference on flagship GPUs in modern systems that already use 350W+ and reach 500W once overclocked.

Again, if you want to be specific, then say that for you 30-40W of power consumption is a big deal. Then we wouldn't argue. Instead, you steer the discussion into a general statement that power consumption and performance/watt matter, and then connect it to how the market reacted to those aspects using the example of Fermi vs. Cypress, the GTX480 vs. 5870. In that case you are making a direct connection between those two cards and R9 290X vs. 780, but the connection is not relevant in performance/watt, overall power consumption delta, or pricing. So again, I'm not sure what your point is, other than "let the market decide."

The market already decided that the GTX770 2-4GB is a good buy at $399-500. What then? If 1,000 people jump off a bridge, should I follow them? That's your "let the market decide" mantra at work. It's the same as people claiming that because the Honda Accord/Civic and Toyota Corolla/Camry are best-selling cars, they are actually good cars. The most popular products may or may not be excellent products. The market is not always efficient; you should know this if you went to business school. The "let the market decide" mantra only works if the market is efficient and consumers are informed and rational. In many cases, the market fails.
 
Last edited:

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
The Titan and the 780 are usually much closer together. Seems fishy.

If you don't know the boost clocks for either card, how can you call it fishy? Maybe the 780 just has bad TIM on its heatsink?

I hate that they post the base clock. The Titan and 780 could be boosting up to around 1100 on the core depending on what has been done to them, which would make the 290X even more impressive.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
If you don't know the boost clocks for either card, how can you call it fishy? Maybe the 780 just has bad TIM on its heatsink?

I hate that they post the base clock. The Titan and 780 could be boosting up to around 1100 on the core depending on what has been done to them, which would make the 290X even more impressive.

Because in all the other tests from this particular preview, the results are much more in line with what is to be expected, that is, a ~10% difference between the 780 and Titan. Certainly you don't think the TIM affects one benchmark but not the next one 5 minutes later.

The 780s and Titan are not boosting higher than 1 GHz at maximum. Otherwise it would be overclocking and that certainly would have been noted on the diagrams.