
Are retail 290x cards slower than press samples? (Tech Report)

Carfax83

Diamond Member
Tech Report takes a look at some retail samples and compares them to the press samples in terms of overall performance, clock speeds, etc.

Beyond that, I think we've collected enough data to say with confidence that our initial R9 290X review unit, sample 1, is superior to the two retail cards we tested, regardless of the driver or firmware revision. Even with the blower speed fix in place, our first review unit runs at 5-10% higher clock speeds than the retail cards, depending on the workload. That deficit translates into a 5-10% advantage in frame rates, though usually toward the lower end of that range at 4K resolutions. Sample 1 appears to achieve these clock speeds at lower voltages than the retail cards, too.
 
Wow.

[Image: skyrim-clk.gif]


I'm sorry, but if anyone tells us that this situation is acceptable, they are in the wrong. This level of variance at factory defaults is pretty absurd.
 
This again? Nice sample size too.

We know the coolers suck, and the variance could be due to TIM application compounded by the bad cooler.

Bad TIM on a great cooler will not have nearly the same effect.
 
This again? Nice sample size too.

We know the coolers suck, and the variance could be due to TIM application compounded by the bad cooler.

Bad TIM on a great cooler will not have nearly the same effect.

So we're supposed to give AMD a free pass because their QC is so bad that they "might" have put bad TIM on the card? Give me a break, man. SKYMTL at HardwareCanucks is going to have a 10-sample 290X review, and he has already stated that not a SINGLE one of his 290X retail cards performs the same. He has noticed a 15% performance delta between his cards. I don't see what the cooler has to do with it; all of these cards are running at the same fan speed % in CCC. Yet at the same fan speed (as specified in CCC) there is a 15% performance variance. Care to explain that?

Yet you would tell us that 15% is fine and that AMD "might" have applied TIM improperly on the GPU. How does this matter to the consumer? Is the consumer supposed to apply the TIM themselves? Do you realize how absolutely absurd that sounds?

The fact of the matter is that AMD did something wrong in the design of PowerTune or something along those lines, because factory defaults shouldn't produce such wildly varying performance results. That this is happening is anti-consumer, to say the least; people are buying these cards and not getting the performance they should be getting.
 
It's probably something dumb and preventable like applying too much thermal paste.

AMD's too cheap to pay the 5 cents for paste application experts.
 
I thought this was a BIOS issue where the fan curve wasn't set properly on the actual retail units? In the other thread it was said that the sample and retail units in fact had two different BIOS versions and that AMD was going to release an update to fix the throttling issues.
 
Yeah, I'm sure AMD is real "puzzled" over the BIOS differences between press and retail cards. They know what's up.
 
I thought this was a BIOS issue where the fan curve wasn't set properly on the actual retail units? In the other thread it was said that the sample and retail units in fact had two different BIOS versions and that AMD was going to release an update to fix the throttling issues.

Tech Report tried this, and this was the result:

Hmm. With the firmware change, the HIS card's clock speeds look to be up by about 20MHz in this test scenario, but they're still about 20MHz lower than the clocks of 290X sample 1. Could the change be due to some difference in cooler RPM?

Not that I can tell. Heck, the HIS card could simply be faster because of variances in ambient temperature, although frankly, I doubt it. I let the room heat up a bit during the final test run with the press sample's firmware, and the HIS card was still faster than in our first round of tests.

For what it's worth, the alternative firmware didn't alter the HIS 290X's performance much at all. The card averaged about 76 FPS with either firmware revision.

The HIS card did seem to be a little unstable with the press firmware, though. Our test rig locked up several times during Skyrim testing. Could it be that the press sample firmware applies a lower GPU voltage over time? Slightly lower GPU voltages would explain both the higher clock speeds—due to added thermal headroom—and the instability, if the Hawaii GPU on the HIS card isn't quite up to the task of stable operation at those voltage levels.

TL;DR: the press firmware caused instability on the retail-bought card.
 
"Could it be that the press sample firmware applies a lower GPU voltage over time?"

What's with the detective speak? Really, just get monitoring software and check the voltage. For a review site, that's pretty sloppy.
 
What's with the detective speak? Really, just get monitoring software and check the voltage. For a review site, that's pretty sloppy.



I failed to include:

Turns out the press sample video BIOS runs the HIS card's GPU at about 10-20 fewer mV than the retail firmware. The GPU in the press sample 1 card is obviously a higher quality piece of silicon; it runs at higher frequencies with lower average and median voltages without instability.

What should we make of this seemingly minor voltage delta? Honestly, I don't know. PowerTune is a dynamic algorithm, and it will supply more voltage in order to reach higher clocks if the thermal headroom allows. This 10-20 mV variance could be caused by ambient temperature differences rather than firmware changes. Still, the fact that the HIS card isn't quite stable with the firmware from the press sample makes me wonder.

So they did precisely that. They tested the voltage, and the press BIOS applied different voltages, so it isn't a "fix" for this problem. The press cards, at least the ones Tech Report had, ran higher clock speeds at lower voltages per the firmware. Applying that same firmware to retail-bought cards produced crashing.
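TR's description of PowerTune, a dynamic algorithm that supplies more voltage to reach higher clocks when thermal headroom allows, can be sketched as a toy control loop. Everything here is illustrative: the target temperature, step size, and V/F curve are invented numbers, not AMD's actual firmware behavior.

```python
# Toy sketch of a PowerTune-style thermal governor. Every constant is
# invented for illustration; AMD's real firmware is far more complex.

TEMP_TARGET_C = 95                  # assumed throttle temperature
CLOCK_MIN, CLOCK_MAX = 727, 1000    # MHz, the 290X's advertised clock range
CLOCK_STEP = 13                     # MHz adjusted per control tick (made up)

def voltage_for(clock_mhz, silicon_offset_mv=0.0):
    # Assumed linear V/F curve. A better-binned chip needs a lower offset,
    # which is one way to read TR's press-vs-retail 10-20 mV delta.
    return 900.0 + 0.3 * (clock_mhz - CLOCK_MIN) + silicon_offset_mv

def next_clock(temp_c, clock_mhz):
    # Below the target temperature, ramp toward the boost clock;
    # at or above it, back off one step.
    if temp_c < TEMP_TARGET_C:
        return min(CLOCK_MAX, clock_mhz + CLOCK_STEP)
    return max(CLOCK_MIN, clock_mhz - CLOCK_STEP)

# A cooler-running (or better-binned) card spends more ticks near CLOCK_MAX:
clock = 900
for temp_c in [80, 85, 90, 96, 97, 90]:
    clock = next_clock(temp_c, clock)
vdd_mv = voltage_for(clock)         # voltage tracks the resulting clock
```

Under a model like this, two cards with identical settings but different silicon quality or TIM application land at different steady-state clocks, which is consistent with the variance TR measured.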
 
Well, the same thing again. After looking at the graphs, my conclusions:
- The Hawaii reference cooler sucks, as it's noisier than the 780 Ti reference cooler even at 2100 RPM vs. the 780 Ti's 2500 RPM (which is almost the same as what Uber mode needs).
- I don't understand the complaints, nor the reviews. If you don't want the GPU to throttle, set it to Uber mode.
- Hawaii is almost on par clock-for-clock with the 780 Ti, which is amazing; custom-cooled Hawaiis will rule the world. They will beat everything, even the 780 Ti, while consuming less power.
- Gonna get that non-reference 290X CF as soon as they hit the market.
 
Wow.

[Image: skyrim-clk.gif]


I'm sorry, but if anyone tells us that this situation is acceptable, they are in the wrong. This level of variance at factory defaults is pretty absurd.

This can be solved by increasing the blower speed up to the 780 Ti's level, can't it?

[Image: skyrim-rpm.gif]


Did you notice the HIS blower's speeds?
 
This can be solved by increasing the blower speed up to the 780 Ti's level, can't it?

Did you notice the HIS blower's speeds?

It was still significantly louder than the 780 Ti...

Per the review:

Also, in the "stuff you didn't expect" department, notice that the blower RPM for the GeForce GTX 780 Ti is higher than for any of the 290X cards, even though the 780 Ti is much quieter under load than the R9 290X. Nvidia's blower appears to have a slightly smaller diameter, but I'm impressed that it runs at substantially higher RPM and produces less noise.
 
Man, AMD has really dropped the ball with after-market cards. What are they thinking??

Pretty sure they were hoping nobody would notice. LR reported that AMD still denies a problem exists... so...
The fact that the press firmware had lower voltages and higher clock speeds seems fishy to me. And that same firmware caused crashing on retail-purchased cards?

*shrug* I'd like to think that AMD wouldn't do that, but taken at face value it seems odd, to say the least.
 
Wait... this is just stupid. "(All of these tests were conducted in the 290X's default fan speed profile, not in "uber" mode.)"

Another useless beat-up. Quiet Mode = throttle mode based on temps. Everybody knows it: lower the temp, less throttle.

Run Uber Mode if you don't want throttling or large variances. Yes, it's loud. But damn, it's fast. Again, what's the news here?

What exactly did you expect when you put the card in a temperature-based throttle mode and run a low fan profile? It's going to vary a lot based on vcore, ambient temps, and TIM application variation. Even the R9 290, with pretty much identical power draw to the R9 290X, rarely throttles in games at 47% fan speed.

At 47% fan speed, I cannot say it's loud at all.

Your retail card throttles more in quiet mode? Go to Overdrive, drag the fan speed slider to 47%, enjoy. Such a ridiculous waste of time, all these review sites testing in quiet mode and shouting "OOH, it varies!!" No duh.
 
Yeah, running the card in Uber mode, which is around 2700 RPM.

2400 should be enough; that's about 10% higher RPM and theoretically 10% more airflow, which should allow roughly 25 W more thermal margin and translate to about 90 MHz, enough for the card to stick to 1 GHz in this game.
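That RPM-to-MHz estimate can be sanity-checked with a quick back-of-envelope script. The baseline figures here (2200 RPM quiet-mode blower speed, 250 W board power, 1000 MHz boost clock) are assumptions for illustration, and both scaling laws are crude linear approximations:

```python
# Back-of-envelope check of the fan-speed arithmetic above. The baseline
# numbers are assumptions, not measured values, and the scaling is crude.

BASE_RPM = 2200        # assumed quiet-mode blower speed
BASE_POWER_W = 250.0   # assumed board power at the baseline
BASE_CLOCK_MHZ = 1000  # the 290X's advertised boost clock

def airflow_ratio(rpm_new, rpm_old):
    # Fan affinity laws: volumetric airflow scales roughly linearly with RPM.
    return rpm_new / rpm_old

def extra_margin_w(flow_ratio):
    # Crude assumption: heat removed at a fixed delta-T scales with airflow,
    # so ~10% more airflow buys ~10% more power headroom.
    return BASE_POWER_W * (flow_ratio - 1.0)

def clock_bump_mhz(extra_watts):
    # Crude assumption: dynamic power scales ~linearly with clock at a fixed
    # voltage, so extra watts buy a proportional clock increase.
    return BASE_CLOCK_MHZ * extra_watts / BASE_POWER_W

flow = airflow_ratio(2400, BASE_RPM)   # ~1.09, i.e. ~9% more airflow
margin = extra_margin_w(flow)          # ~23 W of extra headroom
bump = clock_bump_mhz(margin)          # ~91 MHz, near the 90 MHz estimate
```

Under these assumed baselines the numbers land close to the poster's figures, though real cooling and power behavior is nowhere near this linear.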
 
Everybody by now knows the R9 290/290X is a loud card; don't buy and use the reference cooler if you can't stand noise. But to complain about loss of performance when running in Quiet Mode is idiotic, because that mode is exactly as advertised: lower fan speed, more throttling due to temps.
 
Man, AMD has really dropped the ball with after-market cards. What are they thinking??

Indeed, why they insist on selling full cards to their partners when they assemble them so shoddily is puzzling. Look at the heatsink THG took off their retail card: two very visible scratches on there D:
Going back to TR's investigation, you get a 2 FPS max difference (excluding the HIS card using 13.11b8); isn't that margin-of-error stuff?
 