
Are retail 290x cards slower than press samples? (Tech Report)

Let's benchmark all cards at their stock clocks without boost.

Problem solved.

:thumbsup:

Boost is simply being abused for cheating now, as we have seen especially with the 290/290X press cards. Samsung and HTC also got delisted not long ago due to...boost. And I bet there will be many more.
 
Let's benchmark all cards at their stock clocks without boost.

Problem solved.

I'm all for that.
That is, including benchmarks done at guaranteed clocks ("base" per NV), so we get the baseline performance.

But AMD does not even have a base clock on Hawaii. WHY?
They would have to lower the clock considerably and come up with a better PowerTune implementation,
OR take a hit with the already limited Hawaii supply, amirite, i.e. YIELDS.
Can you guys think of any other reason?
 
Mine requires 64-70% fan to maintain stock boost levels @ 70-72 F ambient.

They obviously lied insanely hard.

I'll leave this here for posterity. Before it gets buried instantly by astroturfing.

Note: No one was gracious enough to pay me for either my testing or the cards.
Wish they did though, AMD/Nvidia/Intel give me moneh plix :O
 
Samsung and HTC also got delisted not long ago due to...boost.

It wasn't because of boost per se, it was because they had a boost feature enabled to work on only a small whitelist of benchmark apps. This is considered cheating and rightly so, since the performance seen under these circumstances won't reflect actual performance in any real-world application. If they had implemented boost the way the GPU vendors do (based on TDP and thermals) there would be no problem.
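For what it's worth, the TDP/thermal boost described above boils down to a simple feedback loop. Here's a rough Python sketch of the general idea - every constant and sensor reading here is made up for illustration, not any vendor's actual algorithm or parameters:

```python
# Toy model of TDP/thermal boost: step the clock one bin up when there is
# power and thermal headroom, one bin down when over budget.
# All constants are illustrative, not any real card's values.

BASE_MHZ = 850        # guaranteed clock, never throttle below this
MAX_BOOST_MHZ = 1000  # opportunistic ceiling
BIN_MHZ = 13          # size of one boost bin
TDP_W = 250           # power budget
TEMP_TARGET_C = 80    # thermal budget

def boost_clock(power_w, temp_c, current_mhz):
    """Return the next clock, one bin up or down from current_mhz."""
    if power_w > TDP_W or temp_c > TEMP_TARGET_C:
        return max(BASE_MHZ, current_mhz - BIN_MHZ)   # over budget: throttle
    return min(MAX_BOOST_MHZ, current_mhz + BIN_MHZ)  # headroom: boost

# Simulate a few sensor readings: cool at first, then a hot spike.
clk = BASE_MHZ
for power, temp in [(200, 70), (220, 75), (260, 82), (240, 79)]:
    clk = boost_clock(power, temp, clk)
    print(clk)  # 863, 876, 863, 876
```

Note how a cooler, better-fed card stays on the "boost" branch more often, which is exactly why this kind of boost isn't benchmark-specific: any workload under budget gets the extra clocks.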
 
It wasn't because of boost per se, it was because they had a boost feature enabled to work on only a small whitelist of benchmark apps. This is considered cheating and rightly so, since the performance seen under these circumstances won't reflect actual performance in any real-world application. If they had implemented boost the way the GPU vendors do (based on TDP and thermals) there would be no problem.

Then they could still cheat in press samples 😉

The main problem is, the bigger the base/boost delta, the more room you open for variance and/or cheating. And we know that companies do exploit this.
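To put a number on that: the base/boost delta bounds how far a retail card can fall short of a golden press sample that sustains full boost. A quick sketch with hypothetical cards and clocks:

```python
# Hypothetical (base, boost) MHz pairs, illustration only: the delta is
# the window in which per-sample variance (or cherry-picking) can hide.
cards = {"Card A": (1006, 1150), "Card B": (850, 1000)}

for name, (base_mhz, boost_mhz) in cards.items():
    delta = boost_mhz - base_mhz
    print(f"{name}: {delta}MHz window, up to {delta / boost_mhz:.0%} shortfall")
```

The wider that window, the less a single review number tells you about the card you will actually receive.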
 
Then they could still cheat in press samples 😉

The main problem is, the bigger the base/boost delta, the more room you open for variance and/or cheating. And we know that companies do exploit this.

Neither Nvidia, nor AMD, nor Intel is cheating. I welcome boost. It means GPUs have the potential to perform better. GK110 would be a lot slower if it were capped at 850MHz and wasn't allowed to go over that. You could cool it as much as you wanted, but the only way to get better performance would be to void your warranty and OC. With boost, you have a case with good airflow? Hey, here is some extra performance to reward you. You don't mind noise? Here is some more performance to reward you, and your warranty is still intact.

Your application only uses 1 thread? Well here, let's make that 1 thread go as fast as possible.

Why would that be bad? What's better is that you can customize boost yourself with PowerTune.

Yes, I do wish it were more like Intel's, where you could disable boost altogether for people who don't want it. But other than that, why would you think it's bad?

The companies aren't cheating. Reviewers need to make their benchmarking more like real-world usage. Put the card in a standard case, a HAF or an Obsidian or whatever. Close it up. Test games for more than 5 minutes; an hour per game would be good. So what if it takes them longer? AT is usually last with their phone reviews, but people read them because they are usually the best and most in-depth reviews around.
 
Cherry picked press cards with a better performing custom BIOS. A BIOS never intended for use in retail cards. I'm not sure how that isn't cheating.
 
Cherry picked press cards with a better performing custom BIOS. A BIOS never intended for use in retail cards. I'm not sure how that isn't cheating.

There is still no acceptable explanation for this. AMD stated they "can't replicate the problem" even though the press BIOS caused retail cards to crash, and the press BIOS used lower voltages/higher clock speeds.
 
There is still no acceptable explanation for this. AMD stated they "can't replicate the problem" even though the press BIOS caused retail cards to crash, and the press BIOS used lower voltages/higher clock speeds.

Any links to all these retail cards crashing when flashed with the press BIOS?
I've only seen TR's.
 
Any links to all these retail cards crashing when flashed with the press BIOS?
I've only seen TR's.

Legit Reviews' card didn't crash when flashed with the press BIOS.

Retail cards probably run higher voltage overall than press cards, because there is probably more variation in chip quality among retail samples; the higher voltage keeps all cards stable without having to bin every single chip that comes out of the fab.
 
Legit Reviews' card didn't crash when flashed with the press BIOS.

Retail cards probably run higher voltage overall than press cards, because there is probably more variation in chip quality among retail samples; the higher voltage keeps all cards stable without having to bin every single chip that comes out of the fab.

Review cards should not differ from retail cards at all. Not one bit. Otherwise, what is the point of reviews?
 
Legit Reviews' card didn't crash when flashed with the press BIOS.

Retail cards probably run higher voltage overall than press cards, because there is probably more variation in chip quality among retail samples; the higher voltage keeps all cards stable without having to bin every single chip that comes out of the fab.

The bottom line is that there is a difference between press cards and retail cards when none should exist. That applies to any product being reviewed. What AMD did deserves deeper analysis. I'd have the same position if NVidia did the same thing. I don't want to be buying a product under false pretenses. No matter how small a variance it may be.
 
I'd have the same position if NVidia did the same thing. I don't want to be buying a product under false pretenses. No matter how small a variance it may be.

Nvidia has been doing this since the Kepler launch. Retail cards didn't boost as much as press ones.
 
Except there is variance with cards from both AMD and Nvidia. Kepler review cards have also been seen boosting quite a bit higher than retail cards. Some retail Kepler cards are slower than other retail cards that have a lower boost clock, and all of that is from variance. So you are in the same boat with either Nvidia or AMD.

Either you look at the low end of the variance and use that to buy your product, or game on Intel's GPUs.

Not to mention TR's 1st review sample was faster than the second. The 2nd review sample actually performed the same as the retail cards. Take that how you will.
 
This is old, old news.
When you heavily overclock (aka boost) anything, your mileage will vary. As proven here, some silicon is simply better than others.

Firebird hit the nail on the head, hard:
benchmark all cards at their lowest stable clocks (stock) without boost.
Performance should be level across the board.


Anyone else want to continue to play this 290X silicon lottery?
 
To conclude:
- Kepler GPUs throttle as badly as 290s after warmup.

That is not really true. A GPU such as the reference GTX 780 Ti is able to maintain much more stable operating frequencies than the reference R9 290X, especially in the default "Quiet" mode (see PCPer's R9 290X review, and the retail vs. press sample investigations of the R9 290X).

The main issue most reviewers have with the R9 290X (other than noise and heat) is that the GPU is almost never able to maintain advertised clock speeds in the default [Quiet] mode in particular, and in most cases the clock speeds fall well short of advertised in this mode (up to 20-30% lower). Kepler GPUs, on the other hand, are GUARANTEED to hit their advertised base clock speeds. So if there is anyone to fault for this, it is AMD's marketing team for the lack of transparency and failure (and even downright refusal) to list a guaranteed base clock speed.
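For concreteness, here is that 20-30% shortfall worked through against the 290X's advertised "up to" 1000MHz; this is just the arithmetic from the paragraph above, not measured data:

```python
# Illustrative arithmetic only: advertised "up to" clock vs. a
# 20-30% sustained shortfall, the range quoted above.
ADVERTISED_MHZ = 1000

for shortfall in (0.20, 0.30):
    sustained = ADVERTISED_MHZ * (1 - shortfall)
    print(f"{shortfall:.0%} shortfall -> {sustained:.0f}MHz sustained")
# 20% shortfall -> 800MHz sustained
# 30% shortfall -> 700MHz sustained
```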
 
Kepler GPUs, on the other hand, are GUARANTEED to hit their advertised base clock speeds.

That's great. Where can I find reviews of Kepler GPUs running at their guaranteed advertised base clock rate? Everything I've seen has them running at their max variable boost clock speed, just like AMD cards.
 
That's great. Where can I find reviews of Kepler GPUs running at their guaranteed advertised base clock rate? Everything I've seen has them running at their max variable boost clock speed, just like AMD cards.

Just look at the Kepler specs on any website, whether that be Amazon, Newegg, TechPowerUp, AnandTech, or wherever. It will bluntly tell you the guaranteed base clock.

For example, the 680 is guaranteed at 1006MHz and will OC to 1150MHz, or, if you're really lucky with the silicon lottery, to 1225MHz.

Any GPU that does not meet that guaranteed base clock simply gets RMA'd as defective.

Any more questions?
 
Just look at the Kepler specs on any website, whether that be Amazon, Newegg, TechPowerUp, AnandTech, or wherever. It will bluntly tell you the guaranteed base clock.

For example, the 680 is guaranteed at 1006MHz and will OC to 1150MHz, or, if you're really lucky with the silicon lottery, to 1225MHz.

Any GPU that does not meet that guaranteed base clock simply gets RMA'd as defective.

Any more questions?

I believe the point he is making is that advertised clock speeds matter very little in the whole scheme of things. Most people base their purchases on review results. The vast majority (if not all) of reviews show results from 680s that are not running at base clock (1006MHz) or the advertised boost (1058MHz).

When people purchase a 680 they are basing their expectations on reviews and not advertised speeds; after all, how would they even know what performance looked like at 1006-1058MHz when none of the reviews run at those speeds?

Same with Hawaii. People base their decisions to buy on reviews and not an undisclosed base clock. For the review sites that properly warm their cards up first, it gives buyers a correct view of performance in Quiet and Uber modes. Some reviews saw clocks drop into the 750-850MHz range in Quiet mode, and that was reflected in the performance.

In regards to the thread title, if AMD really sent golden review samples out then I hope it comes back to bite them. That's a poor practice that doesn't give consumers an accurate picture of real-life performance. Hopefully it turns out to be something else and gets resolved quickly.
 
With all due respect, I would strongly disagree with anyone trying to paint Hawaii and GK110 in the same light in terms of performance over time. There are two very different situations here: Hawaii is throttling by 200MHz in Quiet mode even in systems that aren't excessively hot (really, 33C isn't unreasonable IMO), while Kepler throttles by 1-2 bins on average with reference cards. Aftermarket Kepler cards generally DON'T throttle unless you use over-voltage. I know that in my personal use of the EVGA SC ACX 780, at factory defaults it does not throttle on auto fan settings. My prior MSI Lightning GTX 680 cards also did not throttle, and my prior 7970s didn't throttle either (I really liked the 7970s..)

Case in point:

[Image: GTX 660 round-up clock-over-time chart]


Here's the GK104, which throttles over a period of 6 minutes by 2 bins at maximum. 2 bins = 26MHz, and this matches my experience of using both GK104 and GK110 - never in my personal use have any of these cards throttled by more than 2 bins on average with a reference shroud @ auto fan settings. With aftermarket cards, throttling is eliminated altogether. It should be noted that this chart contains AFTERMARKET cards, which of course will be clocked higher, but generally speaking reference cards will have similar clocks to other reference cards. Aftermarket cards will vary in terms of boost depending on how aggressively binned they are.

So here's my point - GK104 and GK110 are offering consistent performance over time by throttling very little even at factory defaults using auto fan. In this chart, they are throttling by 13-26MHz, which matches my own experience with Kepler GPUs. Do note: these are all using factory default auto fan settings.
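The bin arithmetic above is simple enough to spell out. The 13MHz bin size is the one from the post; the 1150MHz boost clock is just a hypothetical example card:

```python
# Kepler-style boost bins: each bin is a 13MHz step (per the post above).
BIN_MHZ = 13

def clock_after_throttle(boost_mhz, bins_dropped):
    """Clock after dropping a given number of boost bins."""
    return boost_mhz - bins_dropped * BIN_MHZ

# Hypothetical card boosting to 1150MHz, throttling 1-2 bins:
print(clock_after_throttle(1150, 1))  # 1137
print(clock_after_throttle(1150, 2))  # 1124
```

A 13-26MHz drop on a ~1150MHz card is on the order of 1-2%, which is why this kind of throttling barely registers in benchmarks.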

Conversely, the situation with Hawaii is very different. These cards are throttling by 100-200MHz in Quiet mode, which is, of course, absolutely absurd - and that leads to the inconsistent performance that many websites have taken note of. This also accounts for some of the press vs. retail variance - for some reason the retail cards seem to require higher voltages (per TechReport), and that of course leads them to throttle earlier.

[Image: Kombustor clock-over-time chart]


I also hope that AMD didn't cherry-pick cards. But really, the press BIOSes had lower voltages and higher clock speeds. I don't know what to make of that, but I know what the logical conclusion is, even though I'll keep that to myself.

Of course, some will say that Uber mode fixes it. For most people, it will. Yet here's the problem: people who are gamers (i.e. non-miners) buy the cards based on Quiet mode benchmarks appearing in web reviews. So when they see the press samples performing pretty well in Quiet mode, they might get a card that throttles by 100 more MHz than the press sample. That will of course lead to lower performance than what AMD represented to them with a press BIOS that had lower voltages and higher clock speeds. Their retail card? Lower clock speeds and higher voltages. That's the problem. Consistency.
 
Aftermarket cards have always had higher clock speeds than reference designs, so what you're stating is not exactly a revelation. There were aftermarket 7970 cards with 1100MHz clock speeds while the reference vanilla design was 925MHz. Aftermarket cards outperform reference cards - they are generally more aggressively binned and specifically designed to support higher clock speeds. So of COURSE they will perform better and clock higher.

The point here is that GK104/110 are offering consistent performance over time at factory defaults and auto fan settings, with minimal throttling of 1-2 bins. What that translates into is that press cards and retail cards perform similarly in terms of consistency. Hawaii should offer the same, IMO. How can any justification be made for retail cards underperforming and throttling more than what a web review (with press cards) indicated? I don't really understand it.
 
Reference was OK for the 7000-series Radeons, which weren't pushed as hard out of the box, but let me tell you about my experience with Radeon reference coolers: out of the box, my reference 7970 was dust-free, but after half a year it was dusty and ran 7 degrees C hotter at load. This is with a Corsair 500R case with custom-fitted filters and regular cleaning, except that I didn't open the 7970's reference cooler because I didn't want to void my warranty.

If a R9 290/290X reference cooler follows a similar path, then over time a reference 290/290X may throttle due to the dust causing temperatures to rise and thus reducing the headroom available to boost into. Brand new reference 290/290X's are already close to throttling in some cases (no pun intended).

The bottom line is: do not buy the reference 290 or 290X, wait for aftermarket coolers or else do it yourself with aftermarket coolers/water cooling. Just imho.
 
I've been a computer gamer for almost six years now, and there have been allegations of "company X gave review sites golden samples!" for as long as I can remember.

The companies could solve this very easily if they wanted to.

When my dad has to buy things for work, he submits an expense report and they reimburse him.

The companies like AMD and NVIDIA could reimburse review sites for purchases of cards, and we could wait a couple weeks for reviews of cards. If the sites were using normal channels to obtain cards there would be no possibility of doctored reviews.

In the meantime, it's good sites are doing these comparisons.

I'm waiting on an aftermarket-cooled vanilla 290; I would like to see how they "really" perform before I buy. $400 is a lot of money to me.
 