[techreport]AMD issues statement on R9 290X speed variability, press samples


NTMBK

Lifer
Nov 14, 2011
That and the magic golden review samples.

I do wonder why the review samples did better. I strongly suspect that it is because of physical assembly, and not electrical characteristics.

If I remember correctly, press samples are often put together by hand by the PR team so that they can get finished boards into reviewers' hands in time for the actual product launch. This requires short-circuiting the usual production cycle, so instead of waiting for the part to go through the entire process of having the cooler mounted, being packaged up, etc., they will just grab a finished board and mount the cooler themselves. It could well be that the PR team's standards for cooler mounting (and potentially even the quality of the TIM used) are higher than those of the actual production line.

I seriously doubt that AMD would intentionally grab golden-sample dies with unusually excellent electrical characteristics, for one simple reason: the exact same blowback that we are seeing in this thread.
 

antihelten

Golden Member
Feb 2, 2012
I do wonder why the review samples did better. I strongly suspect that it is because of physical assembly, and not electrical characteristics.

If I remember correctly, press samples are often put together by hand by the PR team so that they can get finished boards into reviewers' hands in time for the actual product launch. This requires short-circuiting the usual production cycle, so instead of waiting for the part to go through the entire process of having the cooler mounted, being packaged up, etc., they will just grab a finished board and mount the cooler themselves. It could well be that the PR team's standards for cooler mounting (and potentially even the quality of the TIM used) are higher than those of the actual production line.

I seriously doubt that AMD would intentionally grab golden-sample dies with unusually excellent electrical characteristics, for one simple reason: the exact same blowback that we are seeing in this thread.

Just to expand a bit on this, anecdotal evidence does seem to indicate that assembly of retail cards is somewhat lacking in quality:

http://www.tomshardware.com/reviews/radeon-r9-290x-thermal-paste-efficiency,3678.html
 
Feb 19, 2009
It can't be any news to AMD. When you make such a huge base/boost delta, it's because the chip variance is so big that 1GHz is essentially a magic, useless number for PR only, one that the broad run of cards* can't sustain within the regulated specs.

* In QUIET MODE.

/thread.

ps. Regulated specs? Lol, Uber Mode is covered by the warranty, and users are actively encouraged to use it for more performance. Also, OverDrive is a second away for users who want to lift the fan speed to whatever they want.
 

ShintaiDK

Lifer
Apr 22, 2012
* In QUIET MODE.

/thread.

ps. Regulated specs? Lol, Uber Mode is covered by the warranty, and users are actively encouraged to use it for more performance. Also, OverDrive is a second away for users who want to lift the fan speed to whatever they want.

And you simply underline the statement: that you as a user need to actively change its behaviour. Why isn't "uber mode" the default then, and quiet mode optional?

But even in uber mode it can't keep the clocks. The only benefit AMD has right now with these cards is that it's winter.
 

blackened23

Diamond Member
Jul 26, 2011
Nowhere in this press release do they state that they intend to fix it. The 15% performance variance at factory defaults found by Tom's and SKYMTL between press and retail cards? It's cool, apparently. They admit the problem and admit that there is too much variance. Yet won't fix it.
 

Mistwalker

Senior member
Feb 9, 2007
Nowhere in this press release do they state that they intend to fix it. The 15% performance variance at factory defaults found by Tom's and SKYMTL between press and retail cards? It's cool, apparently. They admit the problem and admit that there is too much variance. Yet won't fix it.
They said they are looking into it and will report back.
The range of performance differential is not expected to meaningfully change the user experience but we’ve taken note of recent reports that the degree of variability is higher than expected. Reasonably we would expect the variability to occur both above and below the performance of the press samples, however it appears that most reported performances are biased towards the low side. We are actively investigating these reports and we will update when we have completed our investigation.
 

JoeRambo

Golden Member
Jun 13, 2013
How is AMD supposed to fix something that is fundamentally broken: a reference cooler with disastrous noise and thermal characteristics at the wattage it needs to dissipate?

AMD and NV fans can go back and forth all day long, but for AMD the only possible fix to reduce variance is an unbounded rise in fan speed instead of dropping clock speed. And they can't do that with the current cooler, already operating at 95C, because of the ridiculous noise that would result. Overly loud operation is very noticeable and "measurable", while detecting variance requires large samples.

So once you realise and accept this simple fact, everything else is crystal clear.
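Purely as an illustration of the tradeoff JoeRambo is describing, here is a minimal, hypothetical sketch (invented names and thresholds, not AMD's actual PowerTune logic): a governor that holds a temperature target by raising fan speed first and only cuts the clock once the fan hits its ceiling. The lower that ceiling (quiet mode), the more often the clock gets cut, and the more the final clock ends up depending on ambient temperature and cooler quality.

```python
# Hypothetical thermal-governor sketch -- illustrative only, not AMD's PowerTune.
def govern_step(temp_c, clock_mhz, fan_pct,
                temp_target=95, fan_ceiling=40, clock_min=727, clock_max=1000):
    """One control step: prefer more fan, then less clock, to hold temp_target."""
    if temp_c > temp_target:
        if fan_pct < fan_ceiling:              # headroom left: spin the fan up first
            fan_pct = min(fan_ceiling, fan_pct + 5)
        else:                                  # fan capped (quiet mode): drop a ~13 MHz bin
            clock_mhz = max(clock_min, clock_mhz - 13)
    elif temp_c < temp_target - 3:             # comfortably cool: recover clocks
        clock_mhz = min(clock_max, clock_mhz + 13)
    return clock_mhz, fan_pct

print(govern_step(96, 1000, 40))   # fan already at its 40% ceiling -> (987, 40)
```

With a low fan ceiling, a hotter room or a worse cooler mount pushes more steps into the throttle branch, so the average clock (and FPS) drops; raising the ceiling trades that variance for noise instead.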
 
Feb 19, 2009
No, once you realize the fact that the card is noisy and can deal with it, you can then make a decision to purchase it. If you cannot deal with the noise, you will get more throttling, period.

There's an entire massive R290/X review thread which says exactly what you all have already said, many many times already. If noise is an issue, buy NV or other cards (including the AIB designs coming soon). Stop beating a dead horse.
 
Feb 19, 2009
And you simply underline the statement: that you as a user need to actively change its behaviour. Why isn't "uber mode" the default then, and quiet mode optional?

But even in uber mode it can't keep the clocks. The only benefit AMD has right now with these cards is that it's winter.

Why? You can ask AMD. Do I, as an owner of 2 of these cards, care that quiet mode is the default? Nope. Took me a second to change it.

Suddenly having 2 CHOICES (more with overdrive settings) for users is bad? Hey, that's news.

ps. It's summer here in Australia (and the entire other half of the globe).
 

Imouto

Golden Member
Jul 6, 2011
Nowhere in this press release do they state that they intend to fix it. The 15% performance variance at factory defaults found by Tom's and SKYMTL between press and retail cards? It's cool, apparently. They admit the problem and admit that there is too much variance. Yet won't fix it.

Tom's said the same as TR: with the driver update the variance is down to 6%. And yet again, SKYMTL isn't ready to publish his data, but you're more than willing to make a mistake that fits your agenda.

This, all this huge fuss you're making, is about quiet mode, something that you won't experience in uber mode. Quiet mode has wild variations based on temps because it is supposed to keep the noise down. A room temp going from 23C to 24C yields a 5.4% difference just from the environment variable.

When you're in quiet mode you give up performance for acoustics, based on temps and a blower speed ceiling. You don't seem to understand that variance is to be expected, just as it was with the Nvidia implementation that you weren't able to bypass without hacks. You can remove all the variance by just flipping a switch.

So all this comes down to you being obnoxious about 6% at most and calling it again and again 15%. What you can't do is call these cards unreliable and loud at the same time. Pick one or the other, because you can remove either the variance or the loudness with a single switch and aid it with software.
 

Skurge

Diamond Member
Aug 17, 2009
Why? You can ask AMD. Do I, as an owner of 2 of these cards, care that quiet mode is the default? Nope. Took me a second to change it.

Suddenly having 2 CHOICES (more with overdrive settings) for users is bad? Hey, that's news.

ps. It's summer here in Australia (and the entire other half of the globe).

I'm pretty sure flipping a switch is easier than even installing the card, so I don't know what the big deal is. Also, where the heck is this 15% coming from? I don't see that anywhere. SKYMTL hasn't published anything, so that's not evidence. All the other sites that have published something after AMD fixed it with a driver update have shown about a 5% difference, which is an OK variance to have. It's about the same as what Nvidia has.
 

JoeRambo

Golden Member
Jun 13, 2013
No, once you realize the fact that the card is noisy and can deal with it, you can then make a decision to purchase it. If you cannot deal with the noise, you will get more throttling, period.

The problem is not the card being noisy, 'cause that was already established long ago: rubbish cooler is rubbish.

The problem is that for the end user it is impossible to estimate the perf/noise ratio and how it compares with other cards on the market. They look at some review, "hey, this card is only X dB louder than that other card, no big deal", only to realize that they get substantially lower performance.

It is this wide gap between the cherry-picked card and a potentially unlucky draw in the silicon lottery for the end user that is either fraud (Nvidia fan version) or not "clamping performance to a least-common-denominator type of capability level" (AMD PR version).
 

Skurge

Diamond Member
Aug 17, 2009
The problem is not the card being noisy, 'cause that was already established long ago: rubbish cooler is rubbish.

The problem is that for the end user it is impossible to estimate the perf/noise ratio and how it compares with other cards on the market. They look at some review, "hey, this card is only X dB louder than that other card, no big deal", only to realize that they get substantially lower performance.

It is this wide gap between the cherry-picked card and a potentially unlucky draw in the silicon lottery for the end user that is either fraud (Nvidia fan version) or not "clamping performance to a least-common-denominator type of capability level" (AMD PR version).

It's the same situation with Nvidia cards. This whole thing sucks. We all wish all the cards performed as fast as possible at all times, but competition is so fierce that AMD and Nvidia try to gain any edge possible. Intel and AMD CPUs would be the same if it were this close.

Right now, it's not as bad as people are making it out to be. Not to me anyway. I can live with losing 2-3fps if I get a bad card. It's no big deal really.
 

iiiankiii

Senior member
Apr 4, 2008
I really wish GPUs would go back to static clock speeds and get rid of this boost nonsense. If you want more performance, OC the card. Boost is a truly gimped OC mechanic. It's more headaches for both overclockers and consumers when you want to get a more precise performance measurement. Boost skews the results between cards.
 

KCfromNC

Senior member
Mar 17, 2007
It's because the chip variance is

+/-3% or so, based on the review sites I've seen.

How does this compare against similar nV cards? We don't know, because nV only paid to have AMD cards tested by TechReport - and presumably the other review sites as well.
 

Lonyo

Lifer
Aug 10, 2002
I really wish GPUs would go back to static clock speeds and get rid of this boost nonsense. If you want more performance, OC the card. Boost is a truly gimped OC mechanic. It's more headaches for both overclockers and consumers when you want to get a more precise performance measurement. Boost skews the results between cards.

Everything has boost clocks, not just GPUs. It's not something which is going away, and they make sense because of the nature of workloads.
With high-end GPUs you are typically thermally and power limited, but if you can increase the clock speed because the workload isn't using all of your resources, you can do that task faster. At the same time, you want to make sure you don't fry the card if you run a different workload which stresses different parts of the card.

Boost speeds are about maximising the performance of a card without having to overclock. Do you think Intel should remove Turbo Boost on their CPUs when you are using fewer than all of your cores?
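As a rough sketch of that idea (hypothetical limits and clocks, not any vendor's real algorithm): an opportunistic-boost step raises the clock while the measured board power and temperature are both under their limits, and steps back toward the guaranteed base clock as soon as either budget runs out.

```python
# Toy opportunistic-boost step -- illustrative only; limits and clocks are invented.
def boost_step(power_w, temp_c, clock_mhz,
               power_limit=250, temp_limit=95, base_mhz=900, boost_max=1000):
    """Spend power/thermal headroom on clock speed; otherwise back off toward base."""
    if power_w < power_limit and temp_c < temp_limit:
        return min(boost_max, clock_mhz + 13)   # light workload: clock creeps up
    return max(base_mhz, clock_mhz - 13)        # a budget is exhausted: step down

print(boost_step(180, 70, 950))   # headroom on both budgets -> 963
print(boost_step(260, 80, 950))   # over the power limit     -> 937
```

A shader-light game leaves power headroom, so the clock creeps up toward the cap; a power-hungry load pushes it back toward the base clock.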
 

blackened23

Diamond Member
Jul 26, 2011
It's the same situation with Nvidia cards. This whole thing sucks. We all wish all the cards performed as fast as possible at all times, but competition is so fierce that AMD and Nvidia try to gain any edge possible. Intel and AMD CPUs would be the same if it were this close.

Right now, it's not as bad as people are making it out to be. Not to me anyway. I can live with losing 2-3fps if I get a bad card. It's no big deal really.

The situation is not the same, it's not even close to being the same. With the Kepler GPUs:

1) There is a guaranteed base clock, which is typically 100MHz under the boost clock. Conversely, the 290 is throttling up to 300MHz under the boost and DOES NOT have a guaranteed base clock - the boost isn't guaranteed either. It's an "up to" boost.

2) Kepler boost is exceeded out of the box. Every Kepler GPU will boost WELL PAST the specified boost on the box - for instance, the GTX 780 advertises a 914MHz boost, and every card boosts WELL PAST that out of the box.

3) Kepler GPUs, because of #2, will generally ALWAYS be higher than the advertised boost in a demanding game.

4) Kepler GPUs generally throttle 1-2 bins at most, which is 13-26MHz. Conversely, you have the 290 throttling by 300MHz at factory defaults. Not even close to the same situation.

Because of these reasons, Kepler GPUs perform consistently. The 290 and 290X do not. What's worse is that "quiet" mode 290 and 290X cards are in fact louder than Kepler GPUs at 100% GPU load, and while Kepler can maintain 100% performance, the 290 cannot at the factory default settings. You can lose around 18% performance, and this is ignoring the fact that every sample performs differently at retail.
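For a back-of-the-envelope sense of scale (my arithmetic, assuming a roughly 1000MHz nominal boost clock for both cards; that nominal figure is an assumption, not from the post):

```python
# Rough clock-drop percentages implied by the numbers above (assumed 1000 MHz nominal).
nominal = 1000.0                 # MHz, assumed nominal boost clock
kepler_bins = [13, 26]           # 1-2 bins of ~13 MHz each
r290_drop = 300                  # worst-case throttle cited above
print([f"{d / nominal:.1%}" for d in kepler_bins])   # ['1.3%', '2.6%']
print(f"{r290_drop / nominal:.1%}")                  # 30.0%
```

A ~30% clock cut translating into a roughly 18% frame-rate loss is plausible, since frame rates rarely scale perfectly linearly with core clock.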

[Image: clockspeed1.png]


That is the 290 throttling situation.

Here's Kepler:

[Image: clockspeed3.png]


[Image: clockspeed4.png]


Multiple review websites noted that Kepler GPUs perform consistently within 1% of each other due to the fact that Kepler boosts higher than advertised out of box, including Toms, Hardwarecanucks, and PCper. So don't deflect and try to paint GK110 in the same light: Don't try to say that Kepler GPUs have 290 levels of variance, because they don't. And that's the problem. The competing product performs consistently at factory defaults while still being better acoustically than the 290X, while the 290X at factory defaults has wild variances and throttles WAY more.

IMO, this entire press vs retail performance variance is a mess. Apparently AMD cannot fix the issue, so how can anyone look at quiet mode benchmarks seriously? Any benchmarks on the web aren't indicative of what someone may get at retail. So AMD's answer is to flip the BIOS switch. Well, that's great, but then does AMD put a disclaimer on all of their GPUs indicating that there is no guaranteed performance at factory defaults? Do they change the defaults?
There's no easy fix here, and this is ignoring the fact that for SOME REASON no two cards perform alike at retail. I still don't understand how the press firmware caused retail cards to crash.

In fact, can someone explain how the cherry-picked press cards somehow have a BIOS with higher clock speeds and lower voltages, AND how that same BIOS causes retail cards to CRASH? How is that not AMD cherry picking and cheating? It seems pretty obvious at this point. They didn't want people reading reviews to see the real story on factory default performance levels at launch....
 

JoeRambo

Golden Member
Jun 13, 2013
Everything has boost clocks, not just GPUs. It's not something which is going away.

Yeah, but the reviewing and benchmark(et?)ing community is only now starting to react. Right now there is a big incentive to cheat (higher bench scores and sales) and low risk when getting caught.

But things are changing; just recently mobile devices got removed from some benchmark database for boosting to clocks unseen in real usage. It's the same with video cards really: having to issue an nth statement is not healthy and leaves you vulnerable if the competition decides to really invest in a bad-mouth campaign.
 

SiliconWars

Platinum Member
Dec 29, 2012
We've gone over this multiple times on multiple threads. The Nvidia boost is no better -

http://translate.googleusercontent....s.html&usg=ALkJrhiJAK2jHXEthT5kzRZIeIdQdXn2JQ

The Kepler GPUs DO have the same level of performance variance depending on temperature: 19% in Anno 2070 after a warm-up.

Here are 2 examples with Anno 2070 and Battlefield 3: a quick test, a test with the temperature stabilized after 5 minutes, and the same test as the latter but with 2 120mm fans positioned around the card:

Anno 2070: 75 fps -> 63 fps -> 68 fps *** 19% PERFORMANCE DIFFERENCE AFTER 5 MINUTES WARM-UP ***
Battlefield 3: 115 fps -> 107 fps -> 114 fps
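For context on how that 19% is derived (my arithmetic on the Anno 2070 numbers above): the cold run is about 19% faster than the warmed-up run, which is the same gap as the warmed-up run being about 16% slower than the cold run.

```python
# Two ways to express the Anno 2070 gap quoted above.
cold_fps, warm_fps = 75.0, 63.0
print(f"{cold_fps / warm_fps - 1:.1%}")   # 19.0%  (cold run vs. warmed-up run)
print(f"{1 - warm_fps / cold_fps:.1%}")   # 16.0%  (performance lost after warm-up)
```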

3DMark Fire Strike: 889/1006 MHz
Unigine Heaven 4.0: 876/954 MHz
Unigine Valley 1.0: 876/954 MHz
Alan Wake Max 1920: 876/993 MHz
Alan Wake 2560 Very High: 837/928 MHz
Alan Wake Max 2560: 876/954 MHz
Anno 2070: 837/902 MHz
Assassin's Creed MSAA4x March 1920: 876/967 MHz
Assassin's Creed March 2560 FXAA HQ: 850/941 MHz
Assassin's Creed MSAA4x March 2560: 850/928 MHz
Batman Arkham City AA8x: 876/993 MHz
Batman Arkham City 2560 AA4x: 876/954 MHz
Battlefield March 1920 MSAA 4x: 876/993 MHz
Battlefield March 2560: 850/928 MHz
Battlefield March 2560 MSAA 4x: 876/954 MHz
Civilization V AA8x 1920: 850/928 MHz
Civilization V AA4x 2560: 837/902 MHz
Civilization V AA8x 2560: 863/954 MHz
Crysis February 1920 Ultra: 876/993 MHz
Crysis February 2560 Extreme: 876/954 MHz
Crysis February 2560 Ultra: 863/967 MHz
DiRT Showdown 1920 Ultra: 876/954 MHz
DiRT Showdown 2560 Ultra AL OFF: 837/928 MHz
DiRT Showdown 2560 Ultra: 863/941 MHz
Far Cry AA4x March 1920: 876/954 MHz
Far Cry March 2560: 850/928 MHz
Hitman Absolution: 876/993 MHz
Max Payne 3 4x AA: 876/954 MHz
Max Payne March 2560 NOAA: 850/941 MHz
Sleeping Dogs: 837/902 MHz
The Witcher 2 Enhanced Edition: 876/954 MHz
Total War Shogun February 1920 MSAA4x: 837/928 MHz
Total War Shogun February 2560 MLAA: 837/928 MHz
Total War Shogun February 2560 MSAA 4x: 837/901 MHz
Not ONE game maintains its boost after a 5-minute warm-up period. The only difference is that it's not being reported by the English-speaking press.

Sure, you can claim Nvidia always reaches its base clock, but the fact remains that in reviews the cards are shown to be faster than they are under normal operation, which is the REAL problem, not some pointless "hardly ever reaches max boost in QUIET MODE" issue.
 

Genx87

Lifer
Apr 8, 2002
I found one of Anand's reviews of this generation of AMD cards fascinating: AMD got more of a performance boost out of upping the fan speed than out of increasing clocks.
 

Imouto

Golden Member
Jul 6, 2011
1) There is a guaranteed base clock, which is typically 100MHz under the boost clock. Conversely, the 290 is throttling up to 300MHz under the boost and DOES NOT have a guaranteed base clock - the boost isn't guaranteed either. It's an "up to" boost.

Do reviewers test the cards with "base clocks"? Thought so.

I found one of Anand's reviews of this generation of AMD cards fascinating: AMD got more of a performance boost out of upping the fan speed than out of increasing clocks.

The same was said about Kepler at launch. Fan slider OC when not TDP limited.
 

Teizo

Golden Member
Oct 28, 2010
Why can't AMD just take a page from Nvidia and release a card without a god-awful reference cooler for once?

After this, I am pretty certain they will. In the end, AMD will come out the better for it.
 

Skurge

Diamond Member
Aug 17, 2009
The situation is not the same, it's not even close to being the same. With the Kepler GPUs:

1) There is a guaranteed base clock, which is typically 100MHz under the boost clock. Conversely, the 290 is throttling up to 300MHz under the boost and DOES NOT have a guaranteed base clock - the boost isn't guaranteed either. It's an "up to" boost.

2) Kepler boost is exceeded out of the box. Every Kepler GPU will boost WELL PAST the specified boost on the box - for instance, the GTX 780 advertises a 914MHz boost, and every card boosts WELL PAST that out of the box.

3) Kepler GPUs, because of #2, will generally ALWAYS be higher than the advertised boost in a demanding game.

4) Kepler GPUs generally throttle 1-2 bins at most, which is 13-26MHz. Conversely, you have the 290 throttling by 300MHz at factory defaults. Not even close to the same situation.

Because of these reasons, Kepler GPUs perform consistently. The 290 and 290X do not. What's worse is that "quiet" mode 290 and 290X cards are in fact louder than Kepler GPUs at 100% GPU load, and while Kepler can maintain 100% performance, the 290 cannot at the factory default settings. You can lose around 18% performance, and this is ignoring the fact that every sample performs differently at retail.

[Image: clockspeed1.png]


That is the 290 throttling situation.

Here's Kepler:

[Image: clockspeed3.png]


[Image: clockspeed4.png]


Multiple review websites noted that Kepler GPUs perform consistently within 1% of each other due to the fact that Kepler boosts higher than advertised out of box, including Toms, Hardwarecanucks, and PCper. So don't deflect and try to paint GK110 in the same light: Don't try to say that Kepler GPUs have 290 levels of variance, because they don't. And that's the problem. The competing product performs consistently at factory defaults while still being better acoustically than the 290X, while the 290X at factory defaults has wild variances and throttles WAY more.

IMO, this entire press vs retail performance variance is a mess. Apparently AMD cannot fix the issue, so how can anyone look at quiet mode benchmarks seriously? Any benchmarks on the web aren't indicative of what someone may get at retail. So AMD's answer is to flip the BIOS switch. Well, that's great, but then does AMD put a disclaimer on all of their GPUs indicating that there is no guaranteed performance at factory defaults? Do they change the defaults?
There's no easy fix here, and this is ignoring the fact that for SOME REASON no two cards perform alike at retail. I still don't understand how the press firmware caused retail cards to crash.

In fact, can someone explain how the cherry-picked press cards somehow have a BIOS with higher clock speeds and lower voltages, AND how that same BIOS causes retail cards to CRASH? How is that not AMD cherry picking and cheating? It seems pretty obvious at this point. They didn't want people reading reviews to see the real story on factory default performance levels at launch....

This was before the driver fix, and why are you ignoring reviews of Kepler cards dropping up to 20% in performance from one run to the next? Within 1% of each other? I'm sorry, but I don't see any reviews testing 3-4 retail reference Kepler cards that only drop 1%. 1%? Really? So the GTX 680 review cards were boosting over 1200MHz for reviewers, but no end users got their reference cards to boost that high. There was a GTX 690 in our office for testing a few days ago. On the first run of Metro it was running at 1200MHz; after a few more runs, it was down to 1080MHz. This is with an open case in an air-conditioned office. The previous GTX 690 we had would not go below a 1150MHz boost in the same conditions, no matter how long it ran Metro for.
 

Teizo

Golden Member
Oct 28, 2010
No, once you realize the fact that the card is noisy and can deal with it, you can then make a decision to purchase it. If you cannot deal with the noise, you will get more throttling, period.

There's an entire massive R290/X review thread which says exactly what you all have already said, many many times already. If noise is an issue, buy NV or other cards (including the AIB designs coming soon). Stop beating a dead horse.

A lot of people bought the card before it was even reviewed, though. You can't just point the finger at them and say you got what you deserved. Loyal AMD customers like those are the ones who deserve better... in that light.

Everyone else who waited? Yeah. You know now. But that still doesn't mean that AMD should just cop out, point the finger at their customers, and say this is how it is, yada yada yada. Only companies with crappy management do that. Good companies listen to their customers, ask what they want, and then deliver it the best they can.
 

Slomo4shO

Senior member
Nov 17, 2008
The 15% performance variance at factory defaults found by Tom's and SKYMTL between press and retail cards?


Are you going to continue with the same talking points in every AMD thread or actually contribute something novel? :whistle:

Warning issued for thread crapping.
-- stahlhart
 