videocardz: First AMD Radeon R9 290X 1080p performance review


Elfear

Diamond Member
May 30, 2004
7,169
829
126
Last time I checked, Crysis, which I mentioned as well, was a game. Absolute consumption numbers produced by Furmark are of little real-world use. However, what are the odds that a 290X is 18% worse than a Titan in Furmark, and then uses less power, or is even roughly equivalent to a Titan, in games? Pretty darn slim.

Not really that slim if you look at the way Nvidia and AMD handle Furmark.



280X Toxic consumes 3% more power than the 780 while gaming.



Furmark shows a 28% higher power consumption.

Furmark isn't a good comparison between the two brands.

Keep in mind, at the stock settings, the R9 290X is clocked 160MHz higher, and runs much hotter and draws much more power.

The Titan Ultra just has to unlock the shaders to beat it, and add 160MHz to the clocks to beat it soundly, and it still should not run any hotter or draw any more power (the specs have it drawing less than the Titan; of course, these are rumors, not the end product).

Are you saying the average boost clock of the Titan is 840MHz? If you're trying to make a comparison between the Titan's base clock and the 290X's boost clock, I'm not sure where you're going with that.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Keep in mind, at the stock settings, the R9 290X is clocked 160MHz higher, and runs much hotter and draws much more power.

The Titan Ultra just has to unlock the shaders to beat it, and add 160MHz to the clocks to beat it soundly, and it still should not run any hotter or draw any more power (the specs have it drawing less than the Titan; of course, these are rumors, not the end product).

Nvidia has already announced it'll be doing mass cuts soon, which will allow it to coexist with a Titan that is cheaper than it is today.

I am pretty much positive that the boost clocks of the Titan and GTX 780 are not being reported in those benchmarks we have seen. Also, the only temperature slide we have seen is from Furmark. The only gaming power consumption slide I have seen shows the R9 290X using less power than the Titan.

I'm waiting to see an actual review before I start cooking bacon on this GPU.
 

maddie

Diamond Member
Jul 18, 2010
5,203
5,612
136
A more mature fab makes it pretty easy, even without the minor changes they could make. GF100 -> GF110 saw Nvidia cutting leaky transistors out of the design, decreasing transistor count, increasing SM count, and noticeably increasing perf/W. Now, it's not like GK110 suffers the way GF100 did, where making changes yielded sizable results... However, they don't need sizable results in this case.

All they have to do to avoid competing with Titan is not enable 1/3 DP. GTX 780 already eclipsed Titan for gaming, and it's unlikely Nvidia feels this will threaten their Titan market (assuming there is even a market left).
A bit curious.

Why did you buy 7950 cards?
 

Rvenger

Elite Member
Super Moderator
Video Cards
Apr 6, 2004
6,283
5
81
I am pretty much positive that the boost clocks of the Titan and GTX 780 are not being reported in those benchmarks we have seen. Also, the only temperature slide we have seen is from Furmark. The only gaming power consumption slide I have seen shows the R9 290X using less power than the Titan.

I'm waiting to see an actual review before I start cooking bacon on this GPU.


Yeah, when they post the boost clock it's normally higher than what's advertised. I've never seen a Kepler card that ran at its advertised speeds. Always at least 30-60MHz higher.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Not really that slim if you look at the way Nvidia and AMD handle Furmark.



280X Toxic consumes 3% more power than the 780 while gaming.



Furmark shows a 28% higher power consumption.

Furmark isn't a good comparison between the two brands.

No, it's not a good comparison when you use a card overclocked to the limit, where we couldn't care less how much power the thing uses, vs a reference design. Anand clearly states why that Toxic card is using 50W more than a reference 280X and 109W more in Furmark, though you oddly left that part out. Comparing a reference 280X and a reference 780, the swing is not nearly as large. Less than 10%.
 
Last edited:

Elfear

Diamond Member
May 30, 2004
7,169
829
126
Yeah, when they post the boost clock it's normally higher than what's advertised. I've never seen a Kepler card that ran at its advertised speeds. Always at least 30-60MHz higher.

In their initial Titan review, AnandTech saw average boost clocks in the 966-992MHz range. In other words, about dead even with the rumored 290X boost clocks.

I believe the leaked benchmarks were just quoting the base clock of the Kepler cards. I can't imagine they locked them down to their base clocks to test against Hawaii.
 

Elfear

Diamond Member
May 30, 2004
7,169
829
126
No, it's not a good comparison when you use a card overclocked to the limit, where we couldn't care less how much power the thing uses, vs a reference design. Anand clearly states why that Toxic card is using 50W more than a reference 280X and 109W more in Furmark, though you oddly left that part out. Comparing a reference 280X and a reference 780, the swing is not nearly as large. Less than 10%.

And how does Hawaii react to a power virus? We have no idea as of yet. But even if we assume it will react exactly like a reference 280X, subtracting 10% from the Furmark consumption would still leave it only 8% more power-hungry than a Titan in gaming, which in the whole scheme of things is pretty small. Certainly nothing like the 31%+ difference between the 5870 and 480 that people are trying to draw a comparison to.
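To make that arithmetic concrete, here's the chain of figures as a quick Python sketch. Every input is a rumored or leaked number from this thread, and the 10-point Furmark exaggeration is an assumption borrowed from the 280X/780 comparison above, so treat it as back-of-the-envelope only:

```python
# Back-of-the-envelope for the deltas discussed above. Inputs are rumored
# figures from this thread plus one assumption; nothing here is confirmed.

furmark_delta = 0.18         # leaked slide: 290X draws ~18% more than Titan in Furmark
furmark_exaggeration = 0.10  # assumption: Furmark inflates AMD's delta by ~10 points,
                             # as it did for a reference 280X vs a reference 780

gaming_delta = furmark_delta - furmark_exaggeration
print(f"Estimated gaming power delta vs Titan: {gaming_delta:.0%}")  # ~8%

# For scale, the 5870 vs 480 gap people keep invoking was 31%+ in gaming.
print("GTX 480 vs HD 5870 was 31%+ -- not remotely comparable")
```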
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
In their initial Titan review, AnandTech saw average boost clocks in the 966-992MHz range. In other words, about dead even with the rumored 290X boost clocks.

You are of course presuming AMD still has the same terrible boost they have now, rather than having fixed it.

Otherwise, if it's like Kepler, then the rumored 1000MHz would be guaranteed, and likely samples will boost above that.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Pretty much spot on with the Titan boost clock observation. Both my Titans boosted to 1045MHz; adding another 140MHz to that requires more voltage. For a theoretical 1185MHz you're definitely going to be using, in most cases, the hard-locked 1.21V, never mind that many Titans can't even clock that high, so you'd have to go beyond that.

If you haven't seen what happens with GK110 once you start to go beyond the tame stock voltage, you don't realize how hot it gets and how much power it uses when you up the voltage and clocks.

This is why a Titan Ultra is nonsensical. A Titan core or full GK110 core on a 785-type card, with a price that actually makes it competitive against an R9 290X, is what makes sense, if they even care about putting in the effort to regain the performance crown with a 10% lead.

Putting out a card for $1000 and saying "hey, we are 10% faster than the $600 290X" is not going to win over many potential 290X buyers... the same way hardly anyone springs for a Titan anymore now that the 780 is available. The card would need to make sense with what is on the market; the Titan only made a sliver of sense when it was the only game in town, and now it makes none. Many people, such as myself, after seeing what the 780 could do, upgraded from Titans to aftermarket 780s and put some cash away from the deal.
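For anyone who hasn't pushed GK110, the underlying scaling is easy to sketch. This is a minimal first-order model using the classic CMOS dynamic-power relation P ~ f * V^2; the 1.162V stock-boost voltage is an assumed figure, and leakage (which climbs steeply with voltage and temperature) only makes reality worse:

```python
# First-order sanity check on "more voltage and clocks = a lot more power".
# Classic CMOS dynamic power scales roughly as P ~ f * V^2. Real GK110
# behavior adds leakage that grows with voltage and temperature, so treat
# this as a floor, not a measurement. The 1.162V figure is an assumption.

def relative_power(f0_mhz, v0, f1_mhz, v1):
    """Dynamic power at (f1, v1) relative to (f0, v0), per P ~ f * V^2."""
    return (f1_mhz / f0_mhz) * (v1 / v0) ** 2

scale = relative_power(1045, 1.162, 1185, 1.21)
print(f"1045MHz @ ~1.162V -> 1185MHz @ 1.21V: ~{scale - 1:.0%} more dynamic power")
# ~ +23% before leakage -- on a card that's already near its power target
```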
 

Elfear

Diamond Member
May 30, 2004
7,169
829
126
You are of course presuming AMD still has the same terrible boost they have now, rather than having fixed it.

Otherwise, if it's like Kepler, then the rumored 1000MHz would be guaranteed, and likely samples will boost above that.

True. I'm assuming the boost will be more like Tahiti's boost, where the stated boost clock is what the card runs at 99% of the time under load. AMD may have totally revamped how boost works, as sushiwarrior alluded to.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I am pretty much positive that the boost clocks of the Titan and GTX 780 are not being reported in those benchmarks we have seen. Also, the only temperature slide we have seen is from Furmark. The only gaming power consumption slide I have seen shows the R9 290X using less power than the Titan.

I'm waiting to see an actual review before I start cooking bacon on this GPU.

Yeah, with Kepler GPUs the boost clock is variable and changes depending on GPU TDP and temperature conditions (e.g., if the temp is too high, the clock will come down). Every Kepler GPU has a different out-of-the-box boost speed. This is why the review mentions the base clock for the Kepler cards as the basis of comparison. It really is insanely stupid to think that any review website would intentionally gimp the GTX 780 or Titan when comparing both architectures at stock clockspeeds. I'm sure that didn't happen. I'm also sure these cards didn't boost to a mere 800MHz or so. That is the base clock, not the boost clock.

AMD's boost is different than Nvidia's version of GPU Boost 2.0 (I like GPU Boost 2.0 better), but for the sake of this review, the base clock of the Kepler GPUs is being reported while the boost clock of the AMD card is being reported. Again, AMD's boost has a static boost clock, presumably 1000MHz for the 290X. The Kepler GK104 and GK110 do not have static boost clocks; every Kepler GPU, even of the same SKU, will boost to a different speed depending on temp/TDP and silicon quality. It will boost to the highest speed possible given those conditions, but that speed will vary. AMD's boost does not vary in the same way. That is why it has become common for reviewers (TPU comes to mind) to explicitly state the base clock for Kepler GPUs in review comparisons.

Really, as I mentioned, it's pretty silly to think that the 780 and Titan cards were intentionally gimped. Every 680 and 780 I used boosted way past 1000MHz, so it's safe to assume the reviewer was noting the GK110 base clocks.
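To caricature the difference being described, here's a toy sketch. It is emphatically not either vendor's actual algorithm, and the thresholds and step sizes are invented; it just shows why two same-SKU Kepler cards land at different clocks while the (presumed) 290X boost is one fixed number:

```python
# Toy model of the two boost schemes described above. The constants are
# invented for illustration; neither vendor publishes their algorithm.

def kepler_style_boost(base_mhz, temp_c, power_headroom_w, bin_bonus_mhz):
    """Variable boost: climbs as far as temperature/TDP headroom and silicon
    quality allow, so every card of the same SKU lands somewhere different."""
    if power_headroom_w <= 0:
        return base_mhz                   # out of power budget: sit at base
    boost = base_mhz + bin_bonus_mhz      # silicon lottery
    if temp_c > 80:                       # illustrative temperature target
        boost -= (temp_c - 80) * 13       # back off as the card heats up
    return boost

def static_boost(boost_mhz, power_limit_hit):
    """Tahiti-style static boost: one advertised clock, held under load
    unless the power limit trips (e.g. under a power virus like Furmark)."""
    return boost_mhz - 150 if power_limit_hit else boost_mhz

# Two "identical" Kepler cards report different speeds out of the box...
print(kepler_style_boost(837, 75, 20, bin_bonus_mhz=160))  # 997
print(kepler_style_boost(837, 75, 20, bin_bonus_mhz=130))  # 967
# ...while the 290X (presumably) just runs its one advertised boost clock.
print(static_boost(1000, power_limit_hit=False))           # 1000
```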
 
Last edited:

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
And how does Hawaii react to a power virus? We have no idea as of yet. But even if we assume it will react exactly like a reference 280X, subtracting 10% from the Furmark consumption would still leave it only 8% more power-hungry than a Titan in gaming, which in the whole scheme of things is pretty small. Certainly nothing like the 31%+ difference between the 5870 and 480 that people are trying to draw a comparison to.

It would depend on what game is used for the comparison. Anand used Battlefield 3 in their 780 review. The Titan system actually used 8W less running Furmark than it did in BF3. Go figure. The 7970GE used 36W more.

That said, I agree that the 480 comparison has no merit, as I said in my first post in this thread. It looks like the 290X will be pretty poor in the temp/power department, but not nearly as bad as the 480 was.
 
Feb 19, 2009
10,457
10
76
It looks like the 290X will be pretty poor in the temp/power department, but not nearly as bad as the 480 was.

*In Furmark.

We've yet to see it in gaming loads.

But I am not optimistic, since everyone knows AMD's past reference coolers are rubbish, especially for noise.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
So add more shaders, increase clocks, and no power or temp increase? At least think it through before posting it.

After what happened with Titan = $1000, then 780 = $650... another $1000 card will get laughed at.

I was looking over specs posted in a review that actually showed lower power consumption. Of course, I also mentioned it was still in the rumor stages.

It can actually be done. You lower the voltage and increase the clocks (they only increased them by 60MHz, btw, in that particular rumor).

And did you not see the news that Nvidia has already said they have price cuts coming soon? You have no idea what the price will be, other than it'll likely be higher than the revamped Titan price. It could be $1000 though, but we do not know.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Are you saying the average boost clock of the Titan is 840MHz? If you're trying to make a comparison between the Titan's base clock and the 290X's boost clock, I'm not sure where you're going with that.

All I can do is go off the info provided in the review. Does it not list the 290X at 1000MHz and the Titan at 837MHz?

Looking back, I see they were very inconsistent in the way they presented the benchmarks. They showed the 290X at boost clocks and the Titan at base clocks.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
And did you not see the news that Nvidia has already said they have massive price cuts coming soon? You have no idea what the price will be, other than it'll likely be higher than the revamped Titan price. It could be $1000 though, but we do not know.

No, where is this news? I've only seen rumors. Is there a definitive statement about price cuts? Because I haven't seen one, and I kinda doubt it. Small cuts perhaps, but nothing has been confirmed.
 

maddie

Diamond Member
Jul 18, 2010
5,203
5,612
136
I was looking over specs posted in a review that actually showed lower power consumption. Of course, I also mentioned it was still in the rumor stages.

It can actually be done. You lower the voltage and increase the clocks (they only increased them by 60MHz, btw, in that particular rumor).

And did you not see the news that Nvidia has already said they have massive price cuts coming soon? You have no idea what the price will be, other than it'll likely be higher than the revamped Titan price. It could be $1000 though, but we do not know.
Where?
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
http://www.anandtech.com/show/7401/nvidia-announces-gtx-660-gtx-650-ti-boost-price-cut
Ahead of things to come this week, NVIDIA has announced a preemptive price cut for a couple of their mainstream GeForce products. As of today, the GTX 660 is getting an official price cut to $179, which is down from the $200 or so prices that it was at a bit earlier this year. Meanwhile the GTX 650 Ti Boost is getting a price cut down to $149 for the 2GB model, and $129 for the 1GB model.

Interestingly, NVIDIA did take the time to reiterate that these are the only price cuts that are taking place. The GTX 760 and GTX 770 are not getting price cuts and will remain at $249 and $399 respectively.

Price cuts could happen, but there was no "news" of such. It's all rumor; I haven't seen anything definitive. Although GTX 700 price cuts would be great.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
$315 each at the time vs $380 670s, and Litecoin was profitable enough at the time to pay for the second card in less than two months.

Odd that you haven't made more of the incredible added value that brought you. :confused:
Paid off your second card in 2 months, eh?
That's a terrific outcome. :thumbsup:
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Massive wasn't the right word, but they have posted about price cuts on a few cards, and they likely will not stop there.

Sorry to get you overly excited. I actually was trying to type mass instead of massive, but the auto-type in me wrote out massive.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Why does everyone keep bringing up the 512-bit memory bus? The only thing that matters is effective bandwidth. We know what a Titan can often OC to; we don't know about the 290X (yet). All we DO know is that it has about 10% more bandwidth by default (not a game-changer, IMHO).

We don't know the RAM timings, though. We might never know, since that's not a spec they typically give. When I first saw the "slow" speeds, though, I thought that maybe they were running tighter timings.
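For what it's worth, the ~10% figure falls straight out of the specs. A quick sketch, where Titan's numbers are known and the 290X's 512-bit/5Gbps figures are still rumor:

```python
# Where "about 10% more bandwidth" comes from. Titan's memory spec is
# known; the 290X numbers (512-bit bus at 5Gbps effective) are rumored.

def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width in bytes times data rate."""
    return bus_width_bits / 8 * data_rate_gbps

titan = peak_bandwidth_gbs(384, 6.0)   # 288 GB/s
r290x = peak_bandwidth_gbs(512, 5.0)   # 320 GB/s (rumored)
print(f"Titan {titan:.0f} GB/s vs 290X {r290x:.0f} GB/s: +{r290x/titan - 1:.0%}")
# ~ +11% on paper, before RAM timings or overclocking enter the picture
```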
 

moonbogg

Lifer
Jan 8, 2011
10,734
3,454
136
I wouldn't put it past myself to sell my 670s for one of these if it OCs well enough to pass Titan levels. That would put it right around 670 SLI performance, with no multi-GPU issues and plenty of VRAM, at only about a $200 cost after selling the 670s. I haven't had an ATI/AMD card since the 9700 Pro. Bring it.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Not sure, but I do remember AMD 'enthusiasts' giving Nvidia crap over the Fermi 480. I doubt people will give AMD the same amount of flak, but the way the AMD nutters went about it does make it somewhat ironically funny to see the 290X in a similar position. Now the shoe is on the other foot, but of course it probably won't be a big deal to them now, since the card is obviously fast enough to top Nvidia's top card atm.

People have already given AMD grief about power usage with the 7970. It will happen with the 290X as well, even if it only uses a few more watts.