TDP-based turbo and reviews

Arzachel

Senior member
Apr 7, 2011
903
76
91
Note: I'm not saying TDP-based turbo is bad; pushing more performance out of the intrinsic variability of semiconductors without overclocking is nice. That said:

I'm guessing most of you here remember the drama around the GTX 460 FTW, with people reporting worse temps/noise/power draw in the comment thread? I think we're inching towards something similar with TDP-based turbo.

For the GTX 680, anything above 1006MHz isn't guaranteed. Nvidia is boasting 1058MHz on average, and Anandtech got a sample doing 1110MHz, or a bin lower - 1093MHz - as far as I understood. Take the issues with overclocking benches and apply them to the whole product range: golden chips, YMMV, etc. I don't want to see GPU turbo downplayed, but the way it is now, reviews end up slightly deceiving and more prone to being influenced by cherry-picked cards.

Any ideas on how to improve the reviewing process for such cards? Benchmarking only at the stock speed would be throwing the baby out with the bathwater. Reporting base/turbo clockspeeds in the charts would be a start and relatively easy to do, and I'd suggest benchmarking several times at different turbo states, but Ryan probably wouldn't be exactly thrilled by that :D
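Something like this would be enough to get those numbers into a chart (a rough sketch, not a finished tool: it assumes a driver/GPU combo that exposes the current core clock through nvidia-smi; a GTX 680 in practice might need GPU-Z or Afterburner logging instead):

```python
import subprocess
import statistics
import time

def sample_core_clock():
    # Ask nvidia-smi for the current graphics clock in MHz.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=clocks.gr",
         "--format=csv,noheader,nounits"])
    return int(out.decode().strip())

# Poll once a second while the benchmark loops in the foreground.
samples = []
start = time.time()
while time.time() - start < 60:
    samples.append(sample_core_clock())
    time.sleep(1)

print(f"min {min(samples)} MHz / "
      f"avg {statistics.mean(samples):.0f} MHz / "
      f"max {max(samples)} MHz")
```

Even just a min/avg/max clock line per game next to each chart would tell readers whether the review sample was living at the top of its boost range.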
 

KompuKare

Golden Member
Jul 28, 2009
1,211
1,547
136
Reporting base/turbo clockspeeds in the charts would be a start and relatively easy to do, and I'd suggest benchmarking several times at different turbo states, but Ryan probably wouldn't be exactly thrilled by that :D

No, I imagine not. The thing is, the other alternative would be to buy ten retail cards and take the average, but that's not going to happen either.

Nvidia's Turbo is a good feature, but I for one do not trust PR people (golden samples to reviewers, etc.), and given the amount of fuss the Nvidia fans here make when anyone questions the way Turbo works, I'm beginning to question whether it's worth debating Turbo at all. Intel's Turbo is easy to understand and quantify. Nvidia's is not.
 

tincart

Senior member
Apr 15, 2010
630
1
0
I can't see a good way of accounting for this in benchmarks while still showing off a legitimate feature of the card. So long as reviewers put up the same YMMV warning they do for overclocking in general, I think consumers will at least be reasonably warned.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
If there were a way to disable the turbo (like there is on CPUs, for instance), then tests could be run so we could see an actual A-B comparison. The reviewer could then state, "This is the speed that you are guaranteed." Then they could run some with turbo and basically state, "Here is what you might get if the card you buy has the same TDP as our review unit."

The reason CPU benching does not have to be like this, though, is that AMD/Intel guarantee a given turbo level. Say it is 2.4GHz by default but can clock one core up to 3.7GHz; everybody is on the same playing field. That's not the case with the 680, which can be quite different from card to card.
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
If there were a way to disable the turbo (like there is on CPUs, for instance), then tests could be run so we could see an actual A-B comparison. The reviewer could then state, "This is the speed that you are guaranteed." Then they could run some with turbo and basically state, "Here is what you might get if the card you buy has the same TDP as our review unit."

The reason CPU benching does not have to be like this, though, is that AMD/Intel guarantee a given turbo level. Say it is 2.4GHz by default but can clock one core up to 3.7GHz; everybody is on the same playing field. That's not the case with the 680, which can be quite different from card to card.

I agree that this would be the best thing to do. The issue is that it doubles the workload on the reviewer if he wants to get both sets of benchmarks out on day one. I guess benching at stock and then treating the turbo as overclocking would work. The way most sites reviewed the GTX 680 is the worst way to go in my opinion.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
For all those asking for disabled turbo: as I stated in another thread, you can set a negative offset (you have to play with the values to find the correct one, though) and lock the card at 1006MHz under load if you want. It will downclock a bit when not loaded, but under load it will stay at 1006. I'm not sure what that would give you, though, since it's slower than every single stock 680 when you do that.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
If you only go to one or two sites to get a picture of the GTX 680's performance, then yes, this is arguably a problem. However, as long as you sample a decent number of sites, this variance should drown in the variance already there due to different test setups.
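Just to illustrate why averaging across sites helps (a toy simulation with invented numbers, not real review data): if the card-to-card boost lottery adds a ±3% scatter to any single review, the average over n independent reviews tightens roughly as 1/sqrt(n).

```python
import random
import statistics

random.seed(42)
TRUE_FPS = 100.0    # hypothetical "average card" result
SPREAD_PCT = 3.0    # hypothetical per-card boost scatter

def one_site_review():
    # Each site reviews one random card; its result deviates from
    # the true mean by the boost lottery.
    return TRUE_FPS * (1 + random.gauss(0, SPREAD_PCT / 100))

for n_sites in (1, 4, 16):
    means = [statistics.mean(one_site_review() for _ in range(n_sites))
             for _ in range(10_000)]
    spread = statistics.stdev(means) / TRUE_FPS * 100
    print(f"{n_sites:2d} sites -> spread of the average: ±{spread:.2f}%")
```

Of course that only works if review samples are drawn from the same bins as retail cards, which is exactly what the cherry-picking worry is about.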
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I think the argument is over on the GPU Boost situation. We don't ask CPU reviewers to disable turbo boost, so why start now?

It's unfortunate we can't get a "static" reading across the board, but disabling the feature is not using the card as intended, and underclocking (a negative offset) is gimping the card.

Not everyone will get a magical card, and not everyone will get a dud, but for what it's worth, you're getting a card with a feature that WILL improve your performance.

If this is about being fair in terms of comparing to other cards, you have to realize that removing the feature is no longer a fair comparison.
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
I think the argument is over on the GPU Boost situation. We don't ask CPU reviewers to disable turbo boost, so why start now?

It's unfortunate we can't get a "static" reading across the board, but disabling the feature is not using the card as intended, and underclocking (a negative offset) is gimping the card.

Not everyone will get a magical card, and not everyone will get a dud, but for what it's worth, you're getting a card with a feature that WILL improve your performance.

If this is about being fair in terms of comparing to other cards, you have to realize that removing the feature is no longer a fair comparison.

We do ask them to keep overclocking separate.

Because Turbo Boost is n MHz guaranteed on every chip; TDP turbo is not and is luck of the draw. As I said, I don't want to see this feature marginalized, but as of now it's more like overclocking and less like Turbo on CPUs. When a user can get worse performance out of a product than advertised (and keep in mind that benches on the larger hardware sites are a powerful advertising tool), there is a slight problem.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
We do ask them to keep overclocking separate.

Because Turbo Boost is n MHz guaranteed on every chip; TDP turbo is not and is luck of the draw. As I said, I don't want to see this feature marginalized, but as of now it's more like overclocking and less like Turbo on CPUs. When a user can get worse performance out of a product than advertised (and keep in mind that benches on the larger hardware sites are a powerful advertising tool), there is a slight problem.

I think you are confusing some things - where do nVidia or any of the partners advertise X performance? Unless you are implying that reviewers are advertising performance, in which case we've got a whole different issue - i.e., different test beds.

This is a feature of the product; you don't get "worse" from it, since there is no stated promise. You are just told it boosts. You could get less than someone else, but that doesn't affect the product or its claims.

Again, at this point the boat has sailed, in my opinion, and calling for it to be disabled or hindered is asking to pit one product inaccurately against another.
 

KompuKare

Golden Member
Jul 28, 2009
1,211
1,547
136
Aside from the reviewers having a lot more work to do, I don't think having results which look like this:

DiRT 3________________________130 ± 2D6
Metro: 2033____________________50 ± 2D6
Starcraft II____________________120 ± 2D6
Battlefield 3____________________60 ± 2D6
Civilization V___________________70 ± 2D6
Crysis: Warhead_________________40 ± 2D6
Total War: Shogun 2____________130 ± 2D6
Batman: Arkham City____________90 ± 2D6
DiRT 3_______________________110 ± 2D6
The Elder Scrolls V: Skyrim______90 ± 2D6
Crysis: Warhead_______________30 ± 2D6

would catch on. For a fair comparison, the best we can hope for is that AMD implements something similar. But I'm sure that if Intel had implemented their Turbo like this, they'd have gotten a lot of complaints.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Aside from the reviewers having a lot more work to do, I don't think having results which look like this:

DiRT 3________________________130 ± 2D6
Metro: 2033____________________50 ± 2D6
Starcraft II____________________120 ± 2D6
Battlefield 3____________________60 ± 2D6
Civilization V___________________70 ± 2D6
Crysis: Warhead_________________40 ± 2D6
Total War: Shogun 2____________130 ± 2D6
Batman: Arkham City____________90 ± 2D6
DiRT 3_______________________110 ± 2D6
The Elder Scrolls V: Skyrim______90 ± 2D6
Crysis: Warhead_______________30 ± 2D6

would catch on. For a fair comparison, the best we can hope for is that AMD implements something similar. But I'm sure that if Intel had implemented their Turbo like this, they'd have gotten a lot of complaints.

I beg to ask - from whom? This is essentially a free performance boost built into the product you bought. Why is anyone complaining? As an AMD user, I wish they'd implement it. I'd gladly take any feature that increases my card's performance.

As a CPU user, I also liked the Intel Boost system, and back then it didn't boost as robustly as it does now.

Who would complain about a performance increase that requires no input from the user and is guaranteed? Perhaps the guy who doesn't have it. Perhaps.
 

KompuKare

Golden Member
Jul 28, 2009
1,211
1,547
136
Who would complain about a performance increase that requires no input from the user and is guaranteed? Perhaps the guy who doesn't have it. Perhaps.

It's the guaranteed bit, though, isn't it? As I've pointed out lots of times, with Intel Turbo Boost the amount of Turbo is guaranteed (unless you're running a 90°C case, in which case you'll have plenty of other problems). Nvidia's Boost is different, and the amount of Turbo is not guaranteed.

That's why I had said:

But I'm sure that if Intel had implemented their Turbo like this, they'd have gotten a lot of complaints.

The complaints would have been that people thought they were getting X but only got Y (X minus something). Naturally, there would have been no complaints from those people who got Z (Y plus something)...

But anyway, this thread is meant to be about whether reviewers can fairly address this issue in future reviews, and IMO they can't.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
It's the guaranteed bit, though, isn't it? As I've pointed out lots of times, with Intel Turbo Boost the amount of Turbo is guaranteed (unless you're running a 90°C case, in which case you'll have plenty of other problems). Nvidia's Boost is different, and the amount of Turbo is not guaranteed.

Again, I think some posters here are confusing some things. The box for my EVGA GTX 680 states that it has GPU Boost. There is no number for where it lands; it just says "GPU BOOST." Reviews vary, but are we taking reviews as manufacturer advertisements? All the reviewed cards boosted, correct?

I was afraid my card wouldn't boost as well as others because I didn't understand the technology that well, but even if the boost was low compared to other people's, it still boosted. I was not deceived in any way.

That's why I had said:


The complaints would have been that people thought they were getting X but only got Y (X minus something). Naturally, there would have been no complaints from those people who got Z (Y plus something)...

But anyway, this thread is meant to be about whether reviewers can fairly address this issue in future reviews, and IMO they can't.

Again, this is where the confusion comes from. No one is telling you what you'll get; they are telling you what the feature does and how it works (on top of that, their own results and the limitations they found).

I think the OP is asking for something that really doesn't matter in the grand scheme of things. Users won't disable GPU Boost or set a negative offset, so why should reviewers investigate the card's performance in such situations? The range is already varied, and all these investigations would only prove that it does in fact vary.

The feature is new; maybe down the road, with more refinement, we'll get specified clocks. Until then, I've yet to see a negative of this feature or any need not to promote it, outside of it making my HD 7970 look bad.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
I could see a convoluted hypothetical where someone would avoid buying an Nvidia card based on uncertainty. Say you need X FPS in a particular application, and you know for sure that the base performance of the card cannot meet X FPS unless it uses some boost. The problem is that the amount of boost needed is, say, at the higher end, where perhaps only 25% of Nvidia cards will be able to meet the performance you need (again, this hypothetical assumes that anything less than X FPS simply won't work for your needs). So that buyer will have to consider whether to roll the dice and buy an Nvidia card with a 25% chance of winning.
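For fun, here's the lottery math on that hypothetical (the 25% figure is the made-up number from above, nothing measured):

```python
# Hypothetical: only 25% of cards boost high enough to hit X FPS.
p_good = 0.25

# Expected number of cards to try before winning (geometric distribution).
print(f"expected tries: {1 / p_good:.0f}")

# Chance of landing a good card within n purchases/returns.
for n in (1, 2, 4, 8):
    print(f"within {n} tries: {1 - (1 - p_good) ** n:.0%}")
```

So our hypothetical buyer is looking at four cards on average, and even after eight tries there's still a roughly 10% chance of striking out.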

So I guess such a buyer would instead look for a card where he knew with 100% certainty that it would deliver X FPS regardless of how much boost he ended up with. I'm thinking maybe someone placing a bulk order of cards for a lab or a group of computers.

Or that guy would just buy the 690 instead of the 680, heheh, everyone wins.

It might also lead to people returning their card and buying another one if the boost is not as good as they hoped, effectively getting another shot at the lottery.

But regardless, I understand the 680 as ideal for gamers, so they don't have such convoluted needs and would be happy to have any amount of boost, as the card would be great even with no boost at all, and any boost is like icing on the cake.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I could see a convoluted hypothetical where someone would avoid buying an Nvidia card based on uncertainty. Say you need X FPS in a particular application, and you know for sure that the base performance of the card cannot meet X FPS unless it uses some boost. The problem is that the amount of boost needed is, say, at the higher end, where perhaps only 25% of Nvidia cards will be able to meet the performance you need (again, this hypothetical assumes that anything less than X FPS simply won't work for your needs). So that buyer will have to consider whether to roll the dice and buy an Nvidia card with a 25% chance of winning.

So I guess such a buyer would instead look for a card where he knew with 100% certainty that it would deliver X FPS regardless of how much boost he ended up with. I'm thinking maybe someone placing a bulk order of cards for a lab or a group of computers.

Or that guy would just buy the 690 instead of the 680, heheh, everyone wins.

Buyer beware. But at the same time, I always see countless threads/posts along the lines of "buy X card, it OCs y% on air and destroys Z card."

It just seems like nVidia took what us enthusiasts post about and delivered a product with some form of it, haha.

In the end, if a buyer is unsatisfied with their card, I'd say the fault is on them. nVidia didn't promise them X FPS, and even the review test bed itself would harbor variables unless the user had the exact same setup.

I'd agree with people who say reviewers should do the best they can to slap a disclaimer on the reviews, but having them jump through hoops to try to gauge the boost (which will vary per card) is asking too much, in my opinion.

It might also lead to people returning their card and buying another one if the boost is not as good as they hoped, effectively getting another shot at the lottery.

But regardless, I understand the 680 as ideal for gamers, so they don't have such convoluted needs and would be happy to have any amount of boost, as the card would be great even with no boost at all, and any boost is like icing on the cake.

I can tell you right now, I felt my card was a dud when I wasn't getting insane boosts in games that didn't really stress the card. I put in BF3 and got about what everyone was reporting. Had I not tried BF3 on it, I might have been one of those guys trying to RMA it to play the lottery game.

In the end, if partners start seeing a lot of this, it could be a black eye on the product, but to me that is more about selfish buyers than about bad or misleading product descriptions. [Again, note that I admit to being one of those selfish buyers.]
 

Elfear

Diamond Member
May 30, 2004
7,128
741
126
I think you are confusing some things - where do nVidia or any of the partners advertise X performance? Unless you are implying that reviewers are advertising performance, in which case we've got a whole different issue - i.e., different test beds.

This is a feature of the product; you don't get "worse" from it, since there is no stated promise. You are just told it boosts. You could get less than someone else, but that doesn't affect the product or its claims.

Again, at this point the boat has sailed, in my opinion, and calling for it to be disabled or hindered is asking to pit one product inaccurately against another.

I don't think the problem lies with promises made by Nvidia or their partners; it's the fact that it makes card shopping tougher. A buyer looks at reviews on sites A, B, and C. Sites A, B, and C happened to get really good cards (either intentionally or by chance). All three sites show a 680 that is 10% faster than a 7970. The buyer purchases a 680 that happens to have a low GPU Boost (unbeknownst to him). In his excitement to show off his new card, he invites over a friend who has a 7970. Performance is 1% different, and the buyer is pissed that he could have had a card with a girl on the box for the same performance.

I don't see an easy solution to the problem except to look at all the reviews out there to get a better idea of what an average card would do (hoping the cards sent to reviewers weren't ringers) and/or have reviewers post the average GPU Boost speed for each game they test. This could easily be done with some sort of log from Afterburner or FRAPS.
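A rough sketch of what that could look like (assuming the monitoring log has been exported to a plain CSV with a core-clock column; Afterburner's native .hml format would need a bit of massaging first, and the file names here are made up):

```python
import csv
import statistics

def average_boost_clock(log_path, column="Core clock"):
    # Average the GPU core clock over one benchmark run, given one
    # sample per row and the clock in MHz.
    with open(log_path, newline="") as f:
        clocks = [float(row[column])
                  for row in csv.DictReader(f) if row.get(column)]
    return statistics.mean(clocks)

# One log captured per game while the benchmark loops:
for game in ("bf3", "metro2033", "skyrim"):
    avg = average_boost_clock(f"{game}_clocks.csv")
    print(f"{game}: {avg:.0f} MHz average boost")
```

One extra number per chart, and readers could immediately compare their own card's boost against the review sample's.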

The proverbial ship has pretty much sailed at this point, but I would hope some follow-up reviews will better show what the cards are doing in each game.
 
Silverforce

Feb 19, 2009
10,457
10
76
If there were a way to disable the turbo (like there is on CPUs, for instance), then tests could be run so we could see an actual A-B comparison. The reviewer could then state, "This is the speed that you are guaranteed." Then they could run some with turbo and basically state, "Here is what you might get if the card you buy has the same TDP as our review unit."

The reason CPU benching does not have to be like this, though, is that AMD/Intel guarantee a given turbo level. Say it is 2.4GHz by default but can clock one core up to 3.7GHz; everybody is on the same playing field. That's not the case with the 680, which can be quite different from card to card.

This is the point a lot of people miss about the way turbo is implemented on Kepler: it's different for every card.

It would be fine if review sites stated the card's turbo speed; that's all that's required, really.
 

superjim

Senior member
Jan 3, 2012
293
3
81
My 680 is a dog, good for a +50MHz core OC and that's it. Running at stock, I've seen 1084MHz as the highest clock, so I suppose I should be thankful it's running above the "standard" boost clock of 1058.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
My 680 is a dog, good for a +50MHz core OC and that's it. Running at stock, I've seen 1084MHz as the highest clock, so I suppose I should be thankful it's running above the "standard" boost clock of 1058.

Double woof! Well, glad I got my card and not yours. I think we have it set to +75 on the offset, and it hasn't given us any issues (then again, she uses it, and I doubt she watches the GPU clock like a hawk).
 

mak360

Member
Jan 23, 2012
130
0
0
Think we can all agree 100% that the cards sent to reviewers were cherry-picked, but what I find most astonishing is that nVidia couldn't find 20-25 cards with the same performance (just look at all the review sites). Just imagine what buyers' cards must be like (the discrepancies between them).
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
I think you are confusing some things - where do nVidia or any of the partners advertise X performance? Unless you are implying that reviewers are advertising performance, in which case we've got a whole different issue - i.e., different test beds.

This is a feature of the product; you don't get "worse" from it, since there is no stated promise. You are just told it boosts. You could get less than someone else, but that doesn't affect the product or its claims.

Again, at this point the boat has sailed, in my opinion, and calling for it to be disabled or hindered is asking to pit one product inaccurately against another.

I used the term "advertised" loosely; I'm guessing that most of us base our purchasing decisions on benchmarks. While this feature is great for gamers, it also allows reviews to be influenced by the hardware manufacturers, thereby making them less objective, which is bad for gamers, or at least the savvy ones.

I guess the djinn is out of the bottle, but stating the clocks should be something the reviewers do regardless.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
This is the point a lot of people miss about the way turbo is implemented on kepler, its different for every card.

It would be fine if review sites state the card's turbo speed, thats all thats required really.

No, the Turbo is always the same. Even the voltage for every step is no different.
 

Elfear

Diamond Member
May 30, 2004
7,128
741
126
No, the Turbo is always the same. Even the voltage for every step is no different.

I think what Silverforce meant was where individual cards fall on the scale. Each speed and voltage step will be based on the same table for all cards, but individual cards' ability to reach certain steps will vary a lot.
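To put it another way (an illustrative sketch; the step values below are invented, not Nvidia's actual table): every card walks the same clock/voltage ladder, but leakage and the TDP cap decide how far up a given sample gets to sit.

```python
# One boost ladder shared by all cards (MHz, volts). These numbers
# are made up purely to illustrate the mechanism.
BOOST_STEPS = [(1006, 1.062), (1019, 1.075), (1032, 1.087),
               (1045, 1.100), (1058, 1.112), (1071, 1.125),
               (1084, 1.137), (1097, 1.150), (1110, 1.162)]

def sustained_step(watts_at_step, tdp_watts):
    # Highest ladder step whose estimated power draw fits under the
    # TDP cap; watts_at_step is a per-card estimate (leakage varies).
    best = BOOST_STEPS[0]
    for step in BOOST_STEPS:
        if watts_at_step(step) <= tdp_watts:
            best = step
    return best

# A leakier card burns more watts at every step, so it tops out lower:
low_leak = sustained_step(lambda s: 120 + (s[0] - 1006) * 0.6, 170)
high_leak = sustained_step(lambda s: 135 + (s[0] - 1006) * 0.7, 170)
print(f"low-leakage card sustains {low_leak[0]} MHz, "
      f"leaky card only {high_leak[0]} MHz")
```

Same table, different ceiling per card, which is how both sontin and Silverforce can be right.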
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I think what Silverforce meant was where individual cards fall on the scale. Each speed and voltage step will be based on the same table for all cards, but individual cards' ability to reach certain steps will vary a lot.

When you say "vary a lot", do you mean you consider a few MHz "a lot"?

There is about a 2 FPS difference between stock and -32% TDP according to Anand; is that "a lot" of an FPS difference, or not "a lot"?