[techreport] AMD issues statement on R9 290X speed variability, press samples


blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
This was before the driver fix, and why are you ignoring reviews of Kepler cards dropping up to 20% in performance from one run to the next? Within 1% of each other? I'm sorry, but I don't see any reviews testing 3-4 retail reference Kepler cards that only drop 1%. 1%? Really? So the GTX 680 review cards were boosting over 1200 MHz for reviewers, but no end users got their reference cards to boost that high. There was a GTX 690 in our office for testing a few days ago. On the first run of Metro it was running at 1200 MHz; after a few more runs, it was down to 1080 MHz. This is with an open case in an air-conditioned office. The previous GTX 690 we had would not go below a 1150 MHz boost in the same conditions, no matter how long it ran Metro.

All of those supposed GTX 690 clockspeeds you mention are impossible, because boost states come in specific 13 MHz increments. 1200 is not a possible clockspeed. Neither is 1080. You're estimating, though, right? Any valid explanation for clockspeeds that aren't possible on Kepler?

Here's a fun fact: what's the advertised boost speed of the GTX 690? I'll let you figure this out and get back to me (here's a hint: far lower than any clockspeed you mention). The 290X struggles to even hit its boost at factory defaults, while Kepler GPUs far exceed their advertised boost at factory defaults. You're backing up that point with the GTX 690, even if you're throwing out clockspeeds that are not possible on Kepler.
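
For what it's worth, the bin arithmetic is easy to check yourself. Below is a minimal Python sketch of it; the 915 MHz base clock is the GTX 690's listed spec and the 13 MHz step is the increment claimed above, both taken as assumptions for illustration rather than gospel.

```python
# Sketch of the Kepler boost-bin arithmetic discussed above.
# ASSUMPTIONS: 915 MHz base clock (GTX 690 listed spec) and a 13 MHz
# boost-bin step, as claimed in the post; illustrative only.

BASE_CLOCK_MHZ = 915
BOOST_STEP_MHZ = 13

def nearest_boost_bin(reported_mhz: float) -> int:
    """Return the boost bin (base + n * step) closest to a reported clock."""
    steps = round((reported_mhz - BASE_CLOCK_MHZ) / BOOST_STEP_MHZ)
    return BASE_CLOCK_MHZ + steps * BOOST_STEP_MHZ

for reported in (1200, 1080, 1150):  # clocks mentioned in the thread
    bin_mhz = nearest_boost_bin(reported)
    print(f"reported {reported} MHz -> nearest bin {bin_mhz} MHz "
          f"(off by {abs(reported - bin_mhz)} MHz)")
```

Under those assumptions, 1200 and 1080 land a few MHz off the nearest bins (1201 and 1084), which is consistent with the reported clocks simply being rounded estimates.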
 
Last edited:

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Your "advertised boost" point has been noted...many times. It's just a smokescreen for the real issue which is these cards are not performing at their review performance.

The sad thing is, the 290X "uber" mode is probably the closest any card gets to its actual review performance. It maintains its clock speeds and performance far better than Titan does, or than any reference GeForce that easily hits 80C in any normal case.
 

realibrad

Lifer
Oct 18, 2013
12,337
898
126
This all seems silly.

If I have a card that goes from 300 MHz to 1 GHz every 2 seconds but the gameplay is great, versus a card that does 9 GHz but plays worse, then I would get the first card. If you pay for a 290X and you think that another 290X is better, return the former. AMD did not force you to buy the card.

This idea that it would be better to cap all the cards because it's unfair that someone might get a card that is worse is stupid. If you get a card that you feel is not up to snuff, then return it. If you feel like the value of the card is not worth the money, don't buy it.

NV has been using lower clocks on their cards than AMD, and for a while they have been better clock for clock. If you are buying a card for e-peen, then worry about clocks. If you want a card to play games, then find one that fits your budget.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
This all seems silly.

If I have a card that goes from 300 MHz to 1 GHz every 2 seconds but the gameplay is great, versus a card that does 9 GHz but plays worse, then I would get the first card. If you pay for a 290X and you think that another 290X is better, return the former. AMD did not force you to buy the card.

This idea that it would be better to cap all the cards because it's unfair that someone might get a card that is worse is stupid. If you get a card that you feel is not up to snuff, then return it. If you feel like the value of the card is not worth the money, don't buy it.

NV has been using lower clocks on their cards than AMD, and for a while they have been better clock for clock. If you are buying a card for e-peen, then worry about clocks. If you want a card to play games, then find one that fits your budget.

This isn't remotely close to what the discussion is about.

This is about review cards having (most of the time) better performance compared to retail cards. People read reviews and decide on purchases based on those reviews, and if they get a product that doesn't perform per the reviews, that is a problem.

Which is why I suggested benchmarking cards at their base clocks for reviews (in the other thread). If a card can't perform at its base clock, that is cause for an RMA. It looks like the 290X doesn't have a base clock specified, although numbers are being thrown around such as 676 MHz and 727 MHz.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
All of those supposed GTX 690 clockspeeds you mention are impossible, because boost states come in specific 13 MHz increments. 1200 is not a possible clockspeed. Neither is 1080. You're estimating, though, right? Any valid explanation for clockspeeds that aren't possible on Kepler?

Here's a fun fact: what's the advertised boost speed of the GTX 690? I'll let you figure this out and get back to me (here's a hint: far lower than any clockspeed you mention). The 290X struggles to even hit its boost at factory defaults, while Kepler GPUs far exceed their advertised boost at factory defaults. You're backing up that point with the GTX 690, even if you're throwing out clockspeeds that are not possible on Kepler.

Well, I'm sorry I didn't write down the exact clock speeds when I had the cards. Go ahead and be pedantic; I rounded the numbers off. Clock speed is not the issue here. It's how much variation there is between different cards. You are claiming Kepler cards have a variation of 1%. That's just not true, and I don't know how you believe that.

So if the boost clock is 1000 MHz and the review cards are boosting to 1.5 GHz, but retail cards are boosting to 1100 MHz, that's fine in your book because they are boosting higher than the number on the box, even though the card is a lot slower than the review cards?

But if the AMD card has a boost clock of 1000 MHz, the review cards run at 900 MHz, and the retail cards run at 850 MHz, that's not OK, because they are below the max boost clock?
Is that the problem here?

Not that review cards are faster than retail ones, which is possible with both Hawaii and Kepler, but because the numbers on the screen don't match the ones on the box?

I still don't see this 15% performance drop. You keep bringing up graphs that show the 290 dropping 300 MHz, but I don't see any actual gameplay performance detriment, because it looks to me like all the reviewers are warming up the cards.

While you were fussing over the clock speeds I listed, you missed my point, where I stated that Kepler has JUST as much variation as Hawaii, and there are reviews that back this up. That variation is about 5%. But you keep pulling this 15% from somewhere. If you are going to say it is from one run to the next, Kepler behaves exactly the same way and you know it. It is mentioned everywhere. It starts at max boost and drops as the card heats up, by as much as 20% as shown. GK110 is just as bad, as seen by the fact that adding fans blowing cool air over the cards allows them to sustain boost longer and higher. Same as Hawaii.
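
Part of the disagreement is that two different percentages keep getting conflated: card-to-card spread across retail samples, and the run-to-run drop on a single card as it warms up. A quick Python sketch of both, with made-up FPS numbers purely for illustration:

```python
# Two different "variation" percentages that keep getting mixed up in this
# argument. The FPS numbers below are made up purely for illustration.

def spread_percent(results_fps):
    """Card-to-card spread: fastest vs slowest sample, as a % of the fastest."""
    best, worst = max(results_fps), min(results_fps)
    return (best - worst) / best * 100

def run_to_run_drop(cold_run_fps, warmed_up_fps):
    """Drop from a cold first run to a fully warmed-up run, as a %."""
    return (cold_run_fps - warmed_up_fps) / cold_run_fps * 100

# Hypothetical roundup: four retail cards of the same model.
cards = [62.0, 61.3, 59.8, 58.9]
print(f"card-to-card spread: {spread_percent(cards):.1f}%")

# Hypothetical single card: cold run vs warmed-up run.
print(f"run-to-run drop: {run_to_run_drop(64.0, 57.5):.1f}%")
```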

http://www.hardwarecanucks.com/foru...roundup-asus-evga-gigabyte-galaxy-msi-21.html

I urge you to read this review. The Asus card should be beating the MSI and Gigabyte cards, but it isn't. Why is that? Because of variation. That is not 1%.

There are plenty of other European sites that have done this type of testing on Kepler. I don't know if you've just missed them all or avoided them.

The only thing you can blame AMD for here is the terrible cooler.

EDIT: Wait a minute, I don't even know what exactly the problem is. Do you have a problem with card-to-card variance, with the cards not running at 1 GHz all the time, or with press cards being faster than retail cards?
 
Last edited:

DiogoDX

Senior member
Oct 11, 2012
746
277
136
AMD is so naive. It was obvious that many sites would use quiet mode to compare performance and then use uber mode to compare noise, following Nvidia's guidance.

They just make it too easy to increase the bad press. If they had launched with uber mode only, the heat and noise would be the complaints, but now, in addition to those factors, we have the throttling of quiet mode. We shall see in the next generation whether they have learned from their mistakes.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
My QuadFire 290Xs don't make any noise at all and they reach 46°C max at stock. I bought reference 290Xs to watercool them, the same thing I did with the reference GTX 780s when they released. I didn't even test the GTX 780s on the stock cooler; I slapped waterblocks on right away.

Is it not redundant to talk about the stock cooler?

Are you not aware that custom coolers and custom PCBs are coming?

I thought this topic had been discussed for over a month?

I feel like I'm reading an October thread.
 

el etro

Golden Member
Jul 21, 2013
1,581
14
81
AMD is so naive. It was obvious that many sites would use quiet mode to compare performance and then use uber mode to compare noise, following Nvidia's guidance.

They just make it too easy to increase the bad press. If they had launched with uber mode only, the heat and noise would be the complaints, but now, in addition to those factors, we have the throttling of quiet mode. We shall see in the next generation whether they have learned from their mistakes.

And how well we know about this "partiality"! We've been seeing this for so many years...

Well, there are three weeks left until the end of this whole story...

AMD could have delayed the card's release by another month to avoid this negative repercussion.

My QuadFire 290Xs don't make any noise at all and they reach 46°C max at stock....

Carlitos, they are talking about the hypothesis of AMD sending golden samples to reviewers.
 

ams23

Senior member
Feb 18, 2013
907
0
0
@ Karlitos: Good for you that you invested >$2k in video cards for a custom rig, but what did you expect? Only reference designs are currently available; custom AIB designs are not out yet. And many people are not comfortable going with more heavily customized water-cooled setups.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
My QuadFire 290Xs don't make any noise at all and they reach 46°C max at stock. I bought reference 290Xs to watercool them

I thought this topic has been discussed for over one month?

I feel like I'm reading an October thread.

No. It says December.

AMD issues statement on R9 290X speed variability, press samples
December 5, 2013


It's good to see a company acknowledging an issue/investigating, instead of playing deaf, dumb & blind.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
@ Karlitos: Good for you that you invested >$2k in video cards for a custom rig, but what did you expect? Only reference designs are currently available; custom AIB designs are not out yet. And many people are not comfortable going with more heavily customized water-cooled setups.

All I'm saying is that everyone has been informed that custom coolers are coming. Just wait for them if you want less noise and lower temperatures. It's simple, no?

Or just get a GTX 780 Ti or non-Ti... awesome cards with good coolers.


No. It says December.

AMD issues statement on R9 290X speed variability, press samples
December 5, 2013


It's good to see a company acknowledging an issue/investigating, instead of playing deaf, dumb & blind.

I agree with you mate. :thumbsup:


Karlitos, they are talking about the hypothesis of AMD sending golden samples to reviewers.

I don't really believe this. I have a hard time believing that a company could do this in 2013.

How can sending golden samples to reviewers before release be good for their future sales?
 
Last edited:

Atreidin

Senior member
Mar 31, 2011
464
27
86
I wouldn't be surprised if it was just QC issues. Maybe the press samples had someone making sure they were assembled well, with no huge scratches on the heatsink surface that touches the die and a good application of thermal grease. Then, when the companies went into mass-production mode, they just targeted "good enough" quality levels to lower costs, for however they defined "good enough". That would explain it easily without any conspiracy or ill intent: just basic economic factors driving production quality. I usually favor the less dramatic explanations; I get enough drama elsewhere.
 
Last edited:

ams23

Senior member
Feb 18, 2013
907
0
0
@ Karlitos: Maybe AMD thought that no one would notice if some retail ref. cards performed significantly worse than press sample ref. cards?

Note that AMD keeps changing its tune on this retail vs press sample variance issue. First they insisted that the retail cards were faulty or broken. That was proven incorrect when variance was found by multiple sources in cards that were working fine but just not performing at the same level as press samples. Then they said that the issue was fan-related and would be fixed by a driver. That proved incorrect because there is still significant variance after the driver update (and in fact, noise levels have gone up with the new driver because the fan is spinning faster). Now they are back to investigating again.
 
Last edited:

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
AMD is coming clean.

As with any overclocked silicon, your mileage will vary.

AMD should have sent least-common-denominator performance cards for the initial press release.

Then we would all be praising AMD for how well the 290X overclocks instead of complaining about the loud cooler.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
How can sending golden samples to reviewers before release be good for their future sales?

So what's the alternative explanation? Two websites received press cards with a BIOS that resulted in higher clockspeeds and lower voltages. Techreport flashed a retail sample with this BIOS, and... the retail card began to demonstrate instability.

So what's the rational explanation of what's going on? As far as bad press being good for future sales, this is only a drop in the bucket compared to the full bad press the 290 and 290X have been receiving. I guess AMD is lucky that there is a large contingent of users buying their cards who don't play PC games and never will (i.e. miners); otherwise the damage could be far worse. Personally, most of my buds fall into the category of using only Nvidia for actual PC gaming, but using AMD on out-of-sight rigs for mining. I'm sure there are some PC gamers using AMD (obviously), but the bad press related to not only this issue but others has, I'm sure, held a large contingent of actual gamers back from buying. Yet the miners won't really care, so there's that.

Like I said, though, I can't think of any possible explanation for why the press BIOS apparently performs better than retail, and that BIOS caused a retail card to crash. I can't think of any alternative theory other than that the press cards were somehow fundamentally different from what is found in etailer stock...
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
@ Karlitos: Maybe AMD thought that no one would notice if some retail ref. cards performed significantly worse than press sample ref. cards?

Note that AMD keeps changing its tune on this retail vs press sample variance issue. First they insisted that the retail cards were faulty or broken. That was proven incorrect when variance was found by multiple sources in cards that were working fine but just not performing at the same level as press samples. Then they said that the issue was fan-related and would be fixed by a driver. That proved incorrect because there is still significant variance after the driver update (and in fact, noise levels have gone up with the new driver because the fan is spinning faster). Now they are back to investigating again.

There wasn't significant variance after the fan update. It was 5%, which is what AMD told everyone to expect.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
@ Karlitos: Maybe AMD thought that no one would notice if some retail ref. cards performed significantly worse than press sample ref. cards?

Hahahaha, you are so funny, bringing up such a silly hypothesis.
"Maybe AMD thought that no one would notice"

Just wow. :thumbsdown:

Note that AMD keeps changing its tune on this retail vs press sample variance issue. First they insisted that the retail cards were faulty or broken. That was proven incorrect when variance was found by multiple sources in cards that were working fine but just not performing at the same level as press samples. Then they said that the issue was fan-related and would be fixed by a driver. That proved incorrect because there is still significant variance after the driver update (and in fact, noise levels have gone up with the new driver because the fan is spinning faster). Now they are back to investigating again.

Do you have some straight facts that press samples were better?
What drivers are you referring to?
What is your experience with the 290X?
Are you basing everything you are saying on crap you read on the net?
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
So, Karlitos. Aside from your attack on AMS there, what are you saying? That all of these websites are lying? Do we have straight facts that press samples were better? *blinks* Really?

Apparently, we do have straight facts that press samples are performing better than retail. Toms, PCPer, Hardwarecanucks (4 samples tested thus far), Techreport. I dunno, 4 websites all reporting press cards performing better than retail-bought cards. I guess they're all lying? Is that what you're suggesting? So despite all of these websites finding the same thing, they're all being dishonest? Not being facetious; serious question: I don't even know where you're headed with that post. That does seem to be what you're suggesting, if I'm understanding your post, so please clarify. Is everyone lying?
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
So, Karlitos. Aside from your attack on AMS there, what are you saying? That all of these websites are lying? Do we have straight facts that press samples were better? *blinks* Really?

Apparently, we do have straight facts that press samples are performing better than retail. Toms, PCPer, Hardwarecanucks (4 samples tested thus far), Techreport. I dunno, 4 websites all reporting press cards performing better than retail-bought cards. I guess they're all lying? Is that what you're suggesting? So despite all of these websites finding the same thing, they're all being dishonest? Not being facetious; serious question: I don't even know where you're headed with that post. That does seem to be what you're suggesting, if I'm understanding your post, so please clarify. Is everyone lying?


What attacks? Are you serious?

Please, instead of mentioning 4 websites, can you link them?
 

ams23

Senior member
Feb 18, 2013
907
0
0
@ Karlitos: this is the same company that doesn't even have the decency to list a guaranteed base clock operating frequency on R9 290[X] cards! If they lack transparency there, then they surely won't have qualms with lack of transparency in other areas.

The fault here lies with AMD, period. If they had advertised or even specified a guaranteed base clock operating frequency, and both retail and press sample cards could reach that frequency, then we wouldn't be having this discussion.
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
So, Karlitos. Aside from your attack on AMS there, what are you saying? That all of these websites are lying? Do we have straight facts that press samples were better? *blinks* Really?

Apparently, we do have straight facts that press samples are performing better than retail. Toms, PCPer, Hardwarecanucks (4 samples tested thus far), Techreport. I dunno, 4 websites all reporting press cards performing better than retail-bought cards. I guess they're all lying? Is that what you're suggesting? So despite all of these websites finding the same thing, they're all being dishonest? Not being facetious; serious question: I don't even know where you're headed with that post. That does seem to be what you're suggesting, if I'm understanding your post, so please clarify. Is everyone lying?

These being the same sites that don't test Nvidia retail cards against press samples? How come you aren't asking them to do the same for the 780 Ti?
 