I have a problem with Nvidia GPU Boost.

Cythreill

Member
Apr 6, 2011
31
0
0
Please correct me if I am wrong.

Nvidia GPU Boost seems to give you a dynamic overclock when you LEAST need it.

For example: I'm playing a game where the average FPS is 60. The card will struggle in some scenes (30 FPS) and thrive in others (90 FPS), and it seems the dynamic overclock will activate in those less stressful scenes, right?

The thing is, I'm not really going to benefit from an increase from 90 to 95 FPS. I need that performance increase when I notice the game is playing less smoothly, NOT when it is far beyond an acceptable frame rate.

All it seems GPU Boost will do is mislead me when sites publish benchmarks, as GPU Boost artificially increases the average frame rate by raising the upper bound while doing nothing to fix scenes where the game is at ~35 FPS. The GPU with the lower average FPS could in fact be smoother, as it could be playing more consistently around its average.

E.g. I could think one GPU is better than another just because the average FPS is 3-10 FPS higher, but really, the average is being pushed up by a higher maximum FPS, which does nothing for smooth gameplay.
 

aaksheytalwar

Diamond Member
Feb 17, 2012
3,389
0
76
I have no idea whether this is truly the case, but if it is, then it might be an Nvidia strategy to make the launch happen earlier than scheduled. But even as far as minimum FPS is concerned, the benchmark numbers are not very different from a 7970's. So I don't know whether this should be the case or not.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Please correct me if I am wrong.

Nvidia GPU Boost seems to give you a dynamic overclock when you LEAST need it.

For example: I'm playing a game where the average FPS is 60. The card will struggle in some scenes (30 FPS) and thrive in others (90 FPS), and it seems the dynamic overclock will activate in those less stressful scenes, right?

The thing is, I'm not really going to benefit from an increase from 90 to 95 FPS. I need that performance increase when I notice the game is playing less smoothly, NOT when it is far beyond an acceptable frame rate.

All it seems GPU Boost will do is mislead me when sites publish benchmarks, as GPU Boost artificially increases the average frame rate by raising the upper bound while doing nothing to fix scenes where the game is at ~35 FPS. The GPU with the lower average FPS could in fact be smoother, as it could be playing more consistently around its average.

E.g. I could think one GPU is better than another just because the average FPS is 3-10 FPS higher, but really, the average is being pushed up by a higher maximum FPS, which does nothing for smooth gameplay.

I see no such thing mentioned in any review. Did you theorize it yourself? I believe it will boost the GPU clock when it has more TDP headroom and the temperature is within defined thresholds.
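To make that rule concrete, here is a toy Python sketch of it. The 1006 MHz base clock and the 70C threshold come from this thread; the wattage limit and the boosted clock are assumed numbers for illustration, not Nvidia specs:

```python
# Toy model of the rule described above: the card boosts only when it has
# TDP headroom AND the temperature is within threshold. The 1006 MHz base
# clock and 70C threshold come from this thread; the wattage and boost
# clock below are invented for illustration.

BASE_MHZ = 1006        # GTX 680 base clock
BOOST_MHZ = 1058       # assumed boosted clock (one of the higher bins)
TDP_LIMIT_W = 195      # assumed board power limit
TEMP_LIMIT_C = 70      # throttle threshold discussed later in the thread

def boost_allowed(power_w, temp_c):
    """Both gates must pass for the card to run above base clock."""
    return power_w < TDP_LIMIT_W and temp_c < TEMP_LIMIT_C

def current_clock(power_w, temp_c):
    return BOOST_MHZ if boost_allowed(power_w, temp_c) else BASE_MHZ

print(current_clock(150, 60))  # 1058 - headroom on both axes, boost engages
print(current_clock(200, 60))  # 1006 - no power headroom, stays at base
```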
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Someone needs to test GPU boost on/"off" (by lowering the power setting) and display the fps in a graph, not just a number. Then one could see where the boost kicks in.

Btw when the card struggles at 30fps, the boost might still be active. High GPU load != TDP reached. It might...but it also might not. Depends on the situation.
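The "high GPU load != TDP reached" point can be put in a toy Python sketch: two scenes can both report 99% load while drawing very different power, so boost can stay active even at 30 fps. All clocks, wattages, and scene names here are invented for illustration:

```python
# Load alone doesn't decide whether boost is active: two scenes at the
# same reported GPU load can sit at very different power draws. Numbers
# below are invented for illustration, not measurements.

TDP_W = 195
BASE_MHZ = 1006
MAX_BOOST_MHZ = 1110

def clock_for(load_pct, power_w):
    # Boost needs load AND power headroom, not just high load.
    if load_pct > 90 and power_w < TDP_W:
        return MAX_BOOST_MHZ
    return BASE_MHZ

scenes = [
    ("scene A: 99% load, 198 W", 99, 198),  # at the power limit -> base clock
    ("scene B: 99% load, 170 W", 99, 170),  # same load, headroom -> boosted
]
for name, load, power in scenes:
    print(name, "->", clock_for(load, power), "MHz")
```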
 

dust

Golden Member
Oct 13, 2008
1,328
2
71
Someone needs to test GPU boost on/"off" (by lowering the power setting) and display the fps in a graph, not just a number. Then one could see where the boost kicks in.

Btw when the card struggles at 30fps, the boost might still be active. High GPU load != TDP reached. It might...but it also might not. Depends on the situation.

I think it would render the feature useless if so! Something else must be at work here:hmm:
 

Cythreill

Member
Apr 6, 2011
31
0
0
I see no such thing mentioned in any review. Did you theorize it yourself? I believe it will boost the GPU clock when it has more TDP headroom and the temperature is within defined thresholds.

I haven't read reviews, most of this is from hearing an Nvidia representative talk about it on a video. From what he was saying, it seemed like GPU Boost would activate mostly in less demanding scenes.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
I see no such thing mentioned in any review. Did you theorize it yourself? I believe it will boost the GPU clock when it has more TDP headroom and the temperature is within defined thresholds.

It's pretty much this, along with the quality of the card itself. Like the 7900 ASIC quality we all started talking about, each 680 has its own "known" potential as well. If you set max TDP, the card will boost off the base clock you set (using the offset) up to its limit. That's why a +100 offset will bring me to the same base clock as another user (1006-1106MHz), but not necessarily the same boost clock.

My card pretty much stays pegged at max boost during usage.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Please correct me if I am wrong.


Ok, you're so very, very wrong.

In practice, "gpu boost" is *always* functioning. You probably want to up the power target to the 132% max. In that case, if it needs it, it will stay at max speed as long as you keep it under 70C. If it is above 70C, it will clock down 12MHz or so. I think it has further slowdowns at higher temperatures, but I never see it above 74C or so.

I'm not sure where you got this information, but it in no way works like you assume. In fact, it doesn't bother boosting unless it needs it. If you have vsync or adaptive vsync on and it can do 60 fps at 1006 Mhz, it will stay 1006 Mhz...

I OC mine to 1228MHz, and it stays at 1215MHz all the time, and never goes over 120% power at the most in games. Even in things like FurMark, it stays at 1215MHz.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,205
126
Ok, you're so very, very wrong.

In practice, "gpu boost" is *always* functioning. You probably want to up the power target to the 132% max. In that case, if it needs it, it will stay at max speed as long as you keep it under 70C. If it is above 70C, it will clock down 12MHz or so. I think it has further slowdowns at higher temperatures, but I never see it above 74C or so.

I'm not sure where you got this information, but it in no way works like you assume. In fact, it doesn't bother boosting unless it needs it. If you have vsync or adaptive vsync on and it can do 60 fps at 1006 Mhz, it will stay 1006 Mhz...

I OC mine to 1228MHz, and it stays at 1215MHz all the time, and never goes over 120% power at the most in games. Even in things like FurMark, it stays at 1215MHz.

My GTX460 1GB OC cards (Gigabyte Windforce) hit 81-85C (with two cards installed) when running CUDA apps. If gaming stressed the 680 as much as it stresses my 460s, then it would never be boosted for me.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
GPU Boost really just boils down to a fancy label for something we've had in video cards for ages: downclocking at idle states. It just extends that to cases where you are in-game and frame-capped, with no more need for extra GPU power.

In almost every game I play, my cards are pegged at their max boosted clocks, with the exception of some older games where I frame cap (TF2 etc.). In games like Skyrim, BF3, SC2, heck even Diablo 3 - which is a joke of a game graphically - my cards are maxed at 1235.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Okay, so GPU Boost activates based completely on temperature readings?


Temperature, power usage, and load.

If load is low, it doesn't do it.

If temp is too high, it does it less.

There is a cap on power usage, so it will boost less so that it never crosses that power threshold, but the power threshold can be adjusted to 132% of the default (and I'd suggest you do so). The stock fan profile is *very* conservative for noise. A bit too much so, I think. I prefer the stock EVGA Precision fan curve, but I've tweaked it a bit. The card does not run very hot at all.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
My GTX460 1GB OC cards (Gigabyte Windforce) hit 81-85C (with two cards installed) when running CUDA apps. If gaming stressed the 680 as much as it stresses my 460s, then it would never be boosted for me.



With a software fan curve, even FurMark doesn't pass the mid-70s C for mine. And that's much quieter than my 480 was with the stock fan curve (which got into the 90s in FurMark).
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
I OC mine to 1228MHz, and it stays at 1215MHz all the time, and never goes over 120% power at the most in games. Even in things like FurMark, it stays at 1215MHz.

I don't know if that is a typo or what. Is this really a feature where you can OC to some value and the card actually runs 13MHz slower than the value you set? You consider this a positive thing?

I'd like to know that the card is going to actually run at the speed I set, instead of some arbitrarily slower speed...
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
It's not arbitrary at all. I don't have the fan so aggressive that it stays under 70C, so it clocks down one notch. There is nothing arbitrary about it. I could trade noise for 12/13MHz (things seem to be off by one due to rounding somewhere), but to me it isn't worth the negligible performance difference.

I recognize that it is not an AMD product, so you have an irrational need to try to say bad things about it, but isn't that game getting a little old?
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
..

I OC mine to 1228Mhz



It's not arbitrary at all. I don't have the fan so aggressive that it stays under 70C, so it clocks down one notch.

Okay. So you OC yours to 1228, and it "clocks down one notch". Sounds to me like you do not actually clock yours at 1228; you actually clock yours one notch below 1228.

I'm just curious how this works. If you set your card to 1215MHz, does it actually run at 1215, or does it clock down a notch again? What if you set it to 1200, does it "clock up one notch" and run at 1215? It just sounds to me like your card doesn't actually overclock at all; it lets you set a speed and then runs at a different, slower speed, ignoring the speed you set.

That is not overclocking, that is the card running at the speed it wants to run at.
 

TidusZ

Golden Member
Nov 13, 2007
1,765
2
81
I had the same thoughts when I was reading reviews; it sounds like it runs as fast as it can, but when it's stressed it has to reduce clocks. In actuality, having had a card since release, it works sort of the opposite. In games like StarCraft 2 and the Diablo 3 beta, where the card isn't stressed, it often runs well below stock speed (SC2 usually at around 700 MHz core, spiking up when 100 banelings are on the loose). In games like BF3 it will stay at the fastest offset the entire time, though. I think the best part of GPU Boost is that your video card runs silent/cool during games like StarCraft 2 while still pushing all the frames you need (it runs at ~700 MHz at about 120 fps in SC2).
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
I don't know if that is a typo or what. Is this really a feature where you can OC to some value and the card actually runs 13mhz slower than the value you set? You consider this a positive thing?

I'd like to know the card is going to actually run at the speed I set, instead of some arbitrarily slower speed...

It's not that hard to grasp. The GTX 680 automatically (doesn't matter what you do) downclocks whatever your boost is by ~13MHz once the card hits 70C. No way around it.

Also, there is another 13MHz downclock at 75C, another at 80C, and probably another at 85C. It's every 5 degrees Celsius.
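That per-5C bin behaviour is simple enough to write down as arithmetic. A toy Python sketch, using the 1215MHz max boost mentioned earlier in the thread as the example clock; the formula is just this description restated, not an Nvidia spec:

```python
# The bin rule as arithmetic: one ~13 MHz step down at 70C and another for
# every further 5C. The 1215 MHz max boost is the example clock used
# earlier in the thread; everything else is this post restated.

def temp_throttled_clock(max_boost_mhz, temp_c, step_mhz=13):
    """Clock after temperature throttling, per the every-5C description."""
    if temp_c < 70:
        return max_boost_mhz
    bins_down = 1 + (temp_c - 70) // 5   # one bin at 70C, one more per 5C
    return max_boost_mhz - bins_down * step_mhz

print(temp_throttled_clock(1215, 65))  # 1215 - under 70C, full boost
print(temp_throttled_clock(1215, 72))  # 1202 - first bin down
print(temp_throttled_clock(1215, 76))  # 1189 - second bin down
```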
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
I had the same thoughts when I was reading reviews; it sounds like it runs as fast as it can, but when it's stressed it has to reduce clocks. In actuality, having had a card since release, it works sort of the opposite. In games like StarCraft 2 and the Diablo 3 beta, where the card isn't stressed, it often runs well below stock speed (SC2 usually at around 700 MHz core, spiking up when 100 banelings are on the loose). In games like BF3 it will stay at the fastest offset the entire time, though. I think the best part of GPU Boost is that your video card runs silent/cool during games like StarCraft 2 while still pushing all the frames you need (it runs at ~700 MHz at about 120 fps in SC2).

That has nothing to do with GPU Boost; it is controlled by the Adaptive vs. Maximum Performance power setting in the control panel, in combination with an fps cap and/or vsync. Older cards work the same way as well.

GPU Boost is snake oil for the consumer; it is designed to increase Nvidia's average fps in benchmarks. It does nothing to address minimum frame rates (stressful rendering = high power consumption and temperatures). If you make your card work hard enough, for example rendering a very difficult scene that, say, only runs at 10 fps and pegs your GPU usage at 99%, your card will stay pegged at 1006MHz because of high power and temperature. If on the other hand you render an easy scene that runs at 60 fps capped and 50% GPU usage, Boost will raise the clocks as long as it has the power and temperature headroom to do so. There you have an example of GPU Boost doing absolutely nothing useful for you.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I see a lot of purposely obtuse comments. By the way, Anand has shown AMD cards throttle during gaming at stock clocks, with PowerTune in its default position.

With a power equation established, AMD can then adjust GPU performance on the fly to keep power consumption under the TDP. This is accomplished by dynamically adjusting just the core clock based on GPU usage a few times a second. So long as power consumption stays under 250W the 6970 stays at 880MHz, and if power consumption exceeds 250W then the core clock will be brought down to keep power usage in check.
Anand almost guessed Turbo was coming for GPUs:
Ultimately this is a negative feedback mechanism, unlike Turbo which is a positive feedback mechanism. Without overclocking the best a 6970 will run at is 880MHz, whereas Turbo would increase clockspeeds when conditions allow. Neither one is absolutely the right way to do things, but there’s a very different perception when performance is taken away, versus when performance is “added” for free. I absolutely like where this is going – both as a hardware reviewer and as a gamer – but I’d be surprised if this didn’t generate at least some level of controversy.
[attached image: throttled.png]
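The contrast in that quote - PowerTune as negative feedback, Turbo as positive feedback - can be sketched side by side in a toy Python model. The 880MHz/250W figures are from the quote; the 1006MHz/195W figures and the headroom-to-bins mapping are assumptions for illustration:

```python
# PowerTune as negative feedback (clocks only come DOWN from the rated
# clock when power exceeds TDP) vs Turbo/GPU Boost as positive feedback
# (clocks go UP from base when there is headroom). The 880 MHz / 250 W
# numbers are from the quote; everything else is assumed for illustration.

def powertune_clock(rated_mhz, power_w, tdp_w):
    """Never above the rated clock; scaled down while over TDP."""
    if power_w <= tdp_w:
        return rated_mhz
    return int(rated_mhz * tdp_w / power_w)  # rough power-proportional cut

def turbo_clock(base_mhz, power_w, tdp_w, step_mhz=13, max_bins=8):
    """Never below the base clock; bins added while under TDP."""
    if power_w >= tdp_w:
        return base_mhz
    headroom_bins = min(max_bins, int((tdp_w - power_w) / 10))  # assumed mapping
    return base_mhz + step_mhz * headroom_bins

print(powertune_clock(880, 260, 250))  # 846  - over TDP, pulled below 880
print(turbo_clock(1006, 150, 195))     # 1058 - headroom, pushed above 1006
```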
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
GPU Boost is snake oil for the consumer; it is designed to increase Nvidia's average fps in benchmarks. It does nothing to address minimum frame rates (stressful rendering = high power consumption and temperatures). If you make your card work hard enough, for example rendering a very difficult scene that, say, only runs at 10 fps and pegs your GPU usage at 99%, your card will stay pegged at 1006MHz because of high power and temperature. If on the other hand you render an easy scene that runs at 60 fps capped and 50% GPU usage, Boost will raise the clocks as long as it has the power and temperature headroom to do so. There you have an example of GPU Boost doing absolutely nothing useful for you.


I see in your sig you have one of these. It's obvious you haven't bothered actually watching the clocks because everything you've posted is pure fabrication with no basis in how the function actually works in practice.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Yes, AMD cards throttle due to PowerTune to stay under TDP if the slider is left at stock. We're talking about the 680 throttling at 70C, whether it is at the TDP limit or not.

Yes, I understand that. I was addressing the comments that alluded to Nvidia's turbo implementation being inferior or self-defeating, and that any change in GPU settings (clocks/voltages) is somehow bad, when both companies are implementing some form of override regardless of what people want to believe.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
These power-saving features may very well make the cards work where they otherwise wouldn't, but dynamic clocking, and its subsequent impact on frame times, is quite noticeable. It's a bit of a shame they don't have another solution, really, because inconsistent performance isn't good when it comes to graphics.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
I think it would render the feature useless if so! Something else must be at work here:hmm:

Why? If there is TDP headroom, the card clocks higher. This might happen at 30 fps, at 60, or at 100. Two applications might load the GPU to 100% (according to Afterburner, GPU-Z, etc.), but their actual power consumption is different. So in app A it might clock higher than in app B, giving additional performance.

What is this displayed load percentage based on, anyway?
 