[ht4u.net] 55 DirectX 11 graphics cards in the test


SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Cheating? An innovative way for consumers to receive more performance based on TDP and thermals, while having nice acoustics.
 

el etro

Golden Member
Jul 21, 2013
1,581
14
81
The results are hard to believe, given that most of today's AAA games favor Nvidia architectures on the majority of tech sites.
 

The Alias

Senior member
Aug 22, 2012
647
58
91
Cheating? An innovative way for consumers to receive more performance based on TDP and thermals, while having nice acoustics.

For high-end cards with insufficient cooling (e.g. Titan, 680, etc.) it was cheating, because the results you saw in benchmarks were not indicative of real-world performance: in real-world gaming the cards have a long time to heat up under strenuous conditions and therefore throttle way below the clocks benchmark-site samples were hitting in their short tests. So yes, it is cheating, because without those five-minute jaunts at 1200 MHz core I doubt the 6xx series would have looked nearly as good in benchmarks, and HT4U shows that.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
Yet it still depends on the cooling configuration. In a properly vented case one should be able to achieve those results. And if not, that's what the temp target slider is for. Cheating would imply that there are only benefits and no downsides. The downside of the higher performance is higher power consumption and noise, though. Those three go together - you cannot just look at one and claim it's cheating. There is always a trade-off.
 

The Alias

Senior member
Aug 22, 2012
647
58
91
Yet it still depends on the cooling configuration. In a properly vented case one should be able to achieve those results. And if not, that's what the temp target slider is for. Cheating would imply that there are only benefits and no downsides. The downside of the higher performance is higher power consumption and noise, though. Those three go together - you cannot just look at one and claim it's cheating. There is always a trade-off.
Then there should be two Kepler results: poorly ventilated and well ventilated, because those two conditions give very different experiences.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Then there should be two Kepler results: poorly ventilated and well ventilated, because those two conditions give very different experiences.
I think the problem is the time it takes to do benches with the cards warmed up. Never mind doing two sets of benches with cards warmed up in two different environments.

This is from Hardware.fr when they tested Titan. The first figure is the performance of the card when cool, the second is after the card has warmed up, and the third is with 2x 120 mm fans blowing directly onto the card after it has warmed up. The first figure is what we would typically see in a review, not the second, which is what users would typically see. The third is what I would consider a modified or custom cooling setup. Pretty big difference! Especially in Anno 2070, where it's 19% faster than in real-world conditions.
Hardware.fr said:
Anno 2070: 75 fps -> 63 fps -> 68 fps
Battlefield 3: 115 fps -> 107 fps -> 114 fps

Here are the boost clocks in different games with either the stock setup warmed up or with 2 additional fans blowing on them. I don't see any justification for the sites to have to run additional cooling beyond what is stock for reviews. They don't have to do that with other cards.
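
For reference, that 19% figure can be checked straight from the numbers quoted above. A minimal sketch in Python, using only the fps values from the Hardware.fr quote (the three columns are the cool, warmed-up, and extra-fans conditions described in the post):

```python
# fps values copied from the Hardware.fr quote above: cool card, warmed-up card,
# and warmed-up card with two extra 120 mm fans blowing on it.
results = {
    "Anno 2070": (75, 63, 68),
    "Battlefield 3": (115, 107, 114),
}

for game, (cool, warm, extra_fans) in results.items():
    review_vs_real = (cool - warm) / warm * 100      # open-bench (review) result vs. warmed-up card
    fans_vs_real = (extra_fans - warm) / warm * 100  # extra cooling vs. warmed-up card
    print(f"{game}: review conditions +{review_vs_real:.0f}%, "
          f"extra fans +{fans_vs_real:.0f}% over the warmed-up card")
```

For Anno 2070 this works out to roughly +19% for the cool card over the warmed-up one, which is the gap being discussed.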
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
For high-end cards with insufficient cooling (e.g. Titan, 680, etc.) it was cheating, because the results you saw in benchmarks were not indicative of real-world performance: in real-world gaming the cards have a long time to heat up under strenuous conditions and therefore throttle way below the clocks benchmark-site samples were hitting in their short tests. So yes, it is cheating, because without those five-minute jaunts at 1200 MHz core I doubt the 6xx series would have looked nearly as good in benchmarks, and HT4U shows that.

The 6XX series didn't have GPU Boost 2.0! One could make the point that nVidia offered a default GPU Boost 2.0 as a cautious, more balanced trade-off, with 80 °C as the point for scaling back.

Gamers have the choice to tweak with tools, enthusiast cooling flexibility, and AIB differentiation -- I just don't see cheating!
I agree with Damien of HardwareFR:

The 80 °C temperature chosen by Nvidia is relatively low for a GPU, and it might have been wiser to opt for 85 °C, for example, and calibrate the fan accordingly.
 

_Rick_

Diamond Member
Apr 20, 2012
3,935
68
91
Gamers have the choice to tweak with tools, enthusiast cooling flexibility, and AIB differentiation

And a reviewer has a product to review.
Much like overclocking, you cannot expect a reviewer to do "overcooling" (in a large-scale comparison test).
A good review has to feature a standard environment that ideally has some similarity to an "average" user setup. When doing a standalone review you can deviate from that to show specific limitations, but when doing a cross-product comparison you cannot.

If the manufacturer decides to give you a certain product, then he needs to expect it to be tested in the way he gave it to you. Some people may then go beyond stock, and add cooling, volts or clocks to gain extra performance, but stock performance is the benchmark.

Also note that this test was done with the latest drivers; it's not just launch results thrown together. Almost every card was retested to create a level playing field.
I'm sure HT4U will test cards with alternative cooling in the future, and record mean boost clocks for each card they test before benchmarking, so the horrible blower cooler on the reference card will not ruin results for the entire class of cards with that GPU.

And if you don't see some cheating when a product is released that performs well for two to three minutes and then worse from there on, when benchmarks typically run a few minutes but games usually last at least half an hour, then you may want to look a bit closer. Boost may not be intended as a benchmark optimization (but instead as smarter TDP and thermal management), but in the end, that's what it quite blatantly is.
 
Feb 19, 2009
10,457
10
76
I don't consider temperature-based boost to be "cheating", AS LONG as reviewers point this out. Reviewers often test in an optimal setup; many even have open-top bench rigs to quickly switch GPUs, and that is the PERFECT scenario for NV's GPU Boost. A real gaming scenario is completely different: a) it's in a case, often not optimally ventilated; b) the constant gaming load exhausts CPU and GPU heat into the case, raising surrounding temperatures.

As a result, good review sites like Hardware.fr, who monitor things beyond what other sites often do, show that Kepler is affected a lot. Hardware sites that DO NOT mention this fact are actually "cheating" their readers by giving them false impressions of performance. It's not funny when the average Joe buys a GPU he read about in a review, recommended or given a gold award, and comes home and gets 10-20% less performance.

Again, the onus is on review sites to be more aware and inform their readers. It has nothing to do with NV's implementation, which I think is a good thing: auto-OC out of the box.

ps. Before any NV fanboy comes in here claiming 10% performance variation is not much, I would like to remind them that the difference between a 780 and a Titan is within that margin or less, but it's $$$...
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Many, many gamers have their video cards in actual cases, and yet do third-party reviews actually take the time to place them in cases with good airflow in their investigations?

Why would an enthusiast card be considered an average set-up?
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
As a result, good review sites like Hardware.fr, who monitor things beyond what other sites often do, show that Kepler is affected a lot. Hardware sites that DO NOT mention this fact are actually "cheating" their readers by giving them false impressions of performance. It's not funny when the average Joe buys a GPU he read about in a review, recommended or given a gold award, and comes home and gets 10-20% less performance.

The average Joe gets more performance because he is able to adjust the temperature target slider.
 
Feb 19, 2009
10,457
10
76
Considering this generation on 28 nm is the first time a GPU vendor has implemented aggressive OC based on thermals...

All review sites should investigate/explain this thoroughly. All they need to include is a few lines to explain the potential for less performance than what the reviewers present. Otherwise they are giving readers false impressions of performance.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Many, many gamers have their video cards in actual cases, and yet do third-party reviews actually take the time to place them in cases with good airflow in their investigations?

Why would an enthusiast card be considered an average set-up?

I'm not sure what you are getting at. It's been shown conclusively that the cards clock down once warmed up. Typical review conditions don't allow this to occur and give us falsely optimistic performance. Therefore, something has to be done differently to give us a reasonably accurate representation of what the card's performance is really going to be.

It's not just nVidia's cards either. Sorry, I don't have the review at my fingertips, but the 7990 throttles when warmed up as well. It also gets noisy when used inside of a case. Yet, if you read the reviews the card had excellent thermal and noise characteristics.

We aren't getting an accurate picture of these cards' performance with typical open-bench setups only run for a couple of minutes at a time. We need to congratulate the sites that are trying to get around these issues, though, and give us an accurate review.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Imho,

GPU Boost clouds this because some cores are better than others, so a third-party review may only offer a gauge -- heck, even nVidia's boost clock is an estimate!

More innovation with boost may translate into smarter TDP and thermal management, which will affect benches depending on the core and its gaming environment. It's moving forward with efficiency! How this is cheating is beyond me!

More investigations are welcome into how boost reacts in differing cases, platforms, air flow, cores and time-tables, but having the ability to get more performance with a balance of TDP, acoustics and thermals is very welcome and very smart. More potential work for third parties, one may imagine!
 
Feb 19, 2009
10,457
10
76
Again, it's not NV cheating; it's a great feature. Kudos to NV for implementing a highly functional and effective feature that auto-OCs for the masses.

It's the lazy reviewers who never mention this; very few sites even bother to look into it. They are the ones "cheating" their readers. As a reviewer, the onus is on them to provide the most informative review to the reading public; failing to tell readers that they could be losing 10-20% performance (compared to their results!) just by using the card in a case is a pretty big fail.
 

_Rick_

Diamond Member
Apr 20, 2012
3,935
68
91
It's the lazy reviewers who never mention this; very few sites even bother to look into it. They are the ones "cheating" their readers. As a reviewer, the onus is on them to provide the most informative review to the reading public; failing to tell readers that they could be losing 10-20% performance (compared to their results!) just by using the card in a case is a pretty big fail.

So you would agree that in a comparison test, to simulate the performance of the card in gaming circumstances, disabling boost and setting the frequency to the average boost clock obtained in dedicated testing is the way to go, if you don't have the time to run every benchmark for 30 minutes until the card is properly warmed up?

Because otherwise you won't get a decent comparison of real-world performance.

It's as much cheating, in my opinion, as was/is the deliberate downclocking that happens during Furmark runs. The goal was to stick to TDP, we can assume, but for reviewers who had discovered Furmark as a means to determine actual maximum power use and used it on the crippled cards, it was benchmark optimization. And any kind of benchmark optimization is despicable, because it means reviewers have to look for new benchmarks, all the old benchmark results lose their meaning compared to the new generation, and we get to the point where manufacturers waste resources that could be spent on making a better product on making a product that produces better benchmark results. Now, as I was saying, this may not have been the intent, but it has that effect. Maybe they are trying to trick the less capable reviewers, maybe they see it as a way to make sure their cards are silent...

Also, I posit that boost as it is currently implemented is not ideal. It would be more interesting to limit frame rates intelligently, by downclocking, to save some thermal buffer, so that when a particularly intense scene happens you can briefly exceed TDP to increase min-FPS. But in benchmarks that focus on average FPS this would probably reduce the score, and thus is not in the interest of the GPU makers.

Currently, from what I understand, it's more of the opposite. But then, we have little analysis of what kind of scene actually uses "more" of a GPU than average and hence reduces boost by increasing power use. All we do know is that if you keep it cool (or ignore the temperature recommendation), it will run faster, unless it's TDP limited.


For SSDs, AnandTech has finally started to do a bit of differential analysis for access times, and for FPS at least a Tukey plot would allow a much better characterization. We can only hope that with frame-time analysis, statistical results that offer more than a single value will become more commonplace.
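
To make the two suggestions concrete (lock the card to its mean sustained boost clock, and report frame times as more than a single average), here is a minimal sketch in Python. The clock log and frame-time values are made-up illustrations of the kind of data a reviewer would capture, not real measurements:

```python
import statistics

# Hypothetical logs: core-clock samples (MHz) taken during a long warmed-up run, and
# per-frame render times (ms) from the same run. Both are illustrative, not measured.
clock_log_mhz = [1006, 993, 980, 967, 967, 954, 954, 941, 941, 941]
frame_times_ms = [15.2, 16.1, 15.8, 17.0, 24.5, 15.9, 16.3, 33.1, 16.0, 15.7]

# Mean boost clock under sustained load -- the fixed frequency suggested above for
# apples-to-apples comparison runs.
mean_boost = statistics.mean(clock_log_mhz)

# A five-number summary of frame times carries far more information than a single
# average-FPS figure (the idea behind a Tukey/box plot).
q1, median, q3 = statistics.quantiles(frame_times_ms, n=4)
summary = {
    "min": min(frame_times_ms),
    "p25": q1,
    "median": median,
    "p75": q3,
    "max": max(frame_times_ms),
}

print(f"Mean sustained boost clock: {mean_boost:.0f} MHz")
print("Frame-time summary (ms):", summary)
```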
 

The Alias

Senior member
Aug 22, 2012
647
58
91
The 6XX series didn't have GPU Boost 2.0! One could make the point that nVidia offered a default GPU Boost 2.0 as a cautious, more balanced trade-off, with 80 °C as the point for scaling back.

Gamers have the choice to tweak with tools, enthusiast cooling flexibility, and AIB differentiation -- I just don't see cheating!
I agree with Damien of HardwareFR:
I forget exactly what the original 6xx series' boost hinged on, but the same thing happens with them: they clock high initially but drop their clocks over time. A lot of forums recorded this phenomenon.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Also, I posit that boost as it is currently implemented is not ideal.

What truly is ideal with technology when there are many trade-offs to consider overall? Some allow the ideal to be the enemy of the good!

If one has a problem with its current iteration or its default setting, is there flexibility to tweak it for an individual's subjective needs, tastes, threshold and tolerance level?

Apples-to-apples is tougher given the differentiation between cores and gaming environments, but I would be happy to see more investigations!
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I believe GPU Boost 1.0 starts to scale down the clocks after 70 °C, and it does it in 12 MHz (or is it 11 MHz?) increments.

Unless you are staring at an FPS meter or a clock meter, I'd assume you wouldn't notice unless you hit a critical temp, but by then I'd be more concerned about your cooling setup than about Nvidia cheating on benches.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
I believe GPU Boost 1.0 starts to scale down the clocks after 70 °C, and it does it in 12 MHz (or is it 11 MHz?) increments.

Unless you are staring at an FPS meter or a clock meter, I'd assume you wouldn't notice unless you hit a critical temp, but by then I'd be more concerned about your cooling setup than about Nvidia cheating on benches.

Rail, I believe it's 75 °C and 13 MHz increments.
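
For illustration, here is a rough Python sketch of the stepped throttling behaviour being described. The threshold and step size are left as parameters because the exact values (70 vs. 75 °C, 12 vs. 13 MHz) are disputed above; this is just the general idea of dropping one clock bin per control interval above a temperature target, not Nvidia's actual GPU Boost algorithm, and the base/boost clocks are made-up:

```python
# Step the core clock down one fixed bin per interval while above the temperature
# threshold, and back up (toward the boost ceiling) when there is thermal headroom.
def next_clock(current_mhz, temp_c, base_mhz=1006, boost_mhz=1110,
               threshold_c=75, step_mhz=13):
    if temp_c > threshold_c:
        return max(base_mhz, current_mhz - step_mhz)   # too hot: step down one bin
    return min(boost_mhz, current_mhz + step_mhz)      # headroom: step back up

clock = 1110
for temp in [68, 72, 76, 79, 81, 82, 82, 80, 74]:      # made-up temperature trace
    clock = next_clock(clock, temp)
    print(f"{temp:>3} C -> {clock} MHz")
```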
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I think for reviews Kepler should be forced to run at the base clock, the min clock it will run at.

Yes, this won't help nVidia's case, and it won't represent what customers will get out of the box, but it will create a baseline that is easily seen.

That said, I think reviewers need to do 3 tests with Kepler. Base Clock performance -> Boost 2.0 performance increase -> Overclocked performance increase.

None of this really changes anything for users here who overclock; Kepler (GK110) is still nearly a generation ahead of AMD once both are overclocked.
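
A quick sketch of how the proposed three-step presentation could look, with entirely hypothetical average-fps numbers standing in for the base-clock, Boost 2.0 and overclocked runs:

```python
# Hypothetical average-fps results for the three runs proposed above.
runs = {"base clock": 78.0, "Boost 2.0": 86.0, "overclocked": 95.0}

base = runs["base clock"]
for name, fps in runs.items():
    print(f"{name:>12}: {fps:5.1f} fps ({(fps - base) / base * 100:+.1f}% vs base clock)")
```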