I'm completely dumbfounded - lower temperatures at a given clock speed when OCing!

hsjj3

Member
May 22, 2016
First off, Power limit, Temperature limit and Fan curve are identical.

Stock: 1216/1279MHz, with a Boost 2.0 max of 1418MHz.

I had never overclocked in my life before, but with MSI Afterburner it's just too tempting, and it got the better of me today.

To my knowledge, overclocking causes a rise in temperature. Given that I was already hitting 80C at stock settings, I absolutely didn't think it would actually benefit me.

But lo and behold, I first added a +45MHz OC, then bumped it up to +80MHz. This is on the core; memory is unchanged.

And to my great surprise, at a given temperature I was now running a much higher clock!

After 20 minutes of stress testing at 99% GPU load, the stock clock drops from 1418 to 1354/1367MHz.

But after the OC, I've now let the stress test run for 30 minutes, and it hasn't gone below 1418MHz! It's oscillating between 1418 and 1430MHz.

Fan profile is 45%, temp limit 80C, and power limit 100%, both before and after the OC.

Somebody please explain this sorcery to me!

Edit: I am also using the exact same voltage and power. Before the OC, at 1206mV, I ran at 1418MHz. After the OC, at the same 1206mV level, I'm running at 1497MHz. I could go even higher, but at the moment I don't even understand what's happening, so I'd rather get a handle on how this OC business works first.
 

Piroko

Senior member
Jan 10, 2013
An 80 MHz overclock is only 5~6%. That difference can be offset by a 1°C drop in room temperature, for example.
I can OC my own card from 985 MHz to 1100 MHz and it only adds around 20W to the power meter at the wall; the big power increases come when you add voltage to get that last bit of overclock out of your chip.
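
To put rough numbers on that, here is a back-of-the-envelope sketch using the usual first-order dynamic power model P ≈ C·V²·f. The voltages are made up for illustration, not readings from any real card:

```python
# First-order dynamic power model: P ~ C * V^2 * f.
# The voltages below are illustrative assumptions, not measured values.

def rel_power_change(f0, v0, f1, v1):
    """Relative power change under the P ~ V^2 * f model (C cancels out)."""
    return (v1 ** 2 * f1) / (v0 ** 2 * f0) - 1.0

# Frequency-only bump at an unchanged (hypothetical) 1.000 V:
print(f"985 -> 1100 MHz, same voltage: {rel_power_change(985, 1.000, 1100, 1.000):+.1%}")

# Same bump plus a modest (hypothetical) voltage increase to 1.075 V:
print(f"985 -> 1100 MHz, +75 mV:       {rel_power_change(985, 1.000, 1100, 1.075):+.1%}")
```

The frequency-only case comes out around +12%, while the voltage bump pushes it to roughly +29% thanks to the V² term. That's why the last bit of overclock costs so much more power than the first.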
 

hsjj3

Member
May 22, 2016
Piroko said:
An 80 MHz overclock is only 5~6%. That difference can be offset by a 1°C drop in room temperature, for example.
I can OC my own card from 985 MHz to 1100 MHz and it only adds around 20W to the power meter at the wall; the big power increases come when you add voltage to get that last bit of overclock out of your chip.

Thing is, power usage is exactly the same as well. And I've only tried 80MHz so far; I may not have hit my ceiling yet, since I'm not seeing any negative effects.

My power usage averaged 92% before, and it averages 92% after as well. Given that heat is directly related to power consumption, and that power consumption hasn't increased at all, it's fair to say I'm not seeing any rise in temps.

This is what I would like some expert input on. Everywhere I read on the Internet, it says that power and temperature rise with an overclock. I am NOT seeing that, and I would like an explanation of how this is possible.
 

Flapdrol1337

Golden Member
May 21, 2014
Nvidia cards have a table of clock speeds and corresponding voltages. If the power limit and temperature allow it, the card will go to a higher clock speed.

When you're overclocking you're increasing the clock speed values but not the voltages, so your voltage, and thus your power use and temperature, will be lower for a given clock speed.
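
Here's a minimal sketch of that table idea (the entries are invented for illustration; the real table lives in the GPU's firmware and differs per card, and only the 1206mV/1418MHz point comes from your post):

```python
# Toy DVFS table: (voltage in mV, stock clock in MHz), lowest step first.
# Entries are invented for illustration; only 1206 mV / 1418 MHz is from
# the OP's readings.
VF_TABLE = [(1050, 1304), (1100, 1342), (1150, 1380), (1206, 1418)]

def voltage_for_clock(target_mhz, offset_mhz=0):
    """Lowest voltage step whose (offset-shifted) clock reaches the target."""
    for mv, mhz in VF_TABLE:
        if mhz + offset_mhz >= target_mhz:
            return mv
    return None  # target clock not reachable at any voltage step

print(voltage_for_clock(1418))                 # stock: 1206 mV for 1418 MHz
print(voltage_for_clock(1418, offset_mhz=80))  # OC: 1100 mV (1342 + 80 = 1422)
```

Same clock, lower voltage step, so less power and heat at that clock; conversely, at the old top voltage the card now runs a higher clock.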
 

hsjj3

Member
May 22, 2016
Flapdrol1337 said:
Nvidia cards have a table of clock speeds and corresponding voltages. If the power limit and temperature allow it, the card will go to a higher clock speed.

When you're overclocking you're increasing the clock speed values but not the voltages, so your voltage, and thus your power use and temperature, will be lower for a given clock speed.

This is very sensible. I have mapped out a table of voltage and clock speed, and after applying the +80MHz core offset, at each voltage step I get about 80MHz more speed.
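
Just to sanity-check it, the two readings from my edit line up with a uniform shift of the curve:

```python
# Check the measured point against a uniform +80 MHz shift of the curve.
stock_clock_at_1206mv = 1418  # MHz, measured before applying the offset
offset_mhz = 80               # MSI Afterburner core offset
predicted = stock_clock_at_1206mv + offset_mhz
measured = 1497               # MHz, observed after applying the offset
print(predicted, measured)    # 1498 1497 -- within a single clock step
```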

So my question is: why is the Internet filled with overclocking articles and guides for Nvidia GPUs saying an OC will contribute to higher power draw and hence higher temps? Because in my case, it's not true.

And that raises the question: what's the downside of what I've done? Is this truly a "free" performance boost, or is there a catch?

Also, why don't GPU manufacturers exploit the full extent of the GPU, if it doesn't lead to increased power consumption or heat production?
 

coercitiv

Diamond Member
Jan 24, 2014
hsjj3 said:
Is this truly a "free" performance boost, or is there a catch?
All chips are powered in such a way as to provide a safety margin against errors. What you're doing right now is increasing clocks at the expense of that safety margin. Depending on the quality of your chip and how large the margin is, it may or may not be able to clock higher still.

To put it bluntly, just because you change one parameter in a system and see no (measurable) temperature increase from a 5% overclock does not mean you're on to something.

Make it a stable 10-15% overclock, see the temperatures and power readings stay the same, and then wonder whether you've beaten modern science & the internet :)
 

Flapdrol1337

Golden Member
May 21, 2014
Well, usually overclockers set the power limit to the max and may increase the voltage too, although Nvidia cards have a very safe hard limit, AFAIK.

The catch is that it might not be stable, which would cause parts of a game to render incorrectly, or games could outright crash. Manufacturers don't have time to test the limits of stability on every chip, so there's often a bit of "free performance" left.

edit: beaten :)
 

Tumaras

Member
May 23, 2016
Overclocking alone isn't going to affect temps much, as opposed to bumping the voltage up, which it sounds like you didn't do. People associate temperature increases with overclocking because you're usually upping the voltage as part of it. As you push closer to the maximum of what your GPU will do, you'll need to raise the voltage (or choose not to push further), and your temps will rise. That's pretty much just thermodynamics.
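
For a rough sense of scale, a sketch with assumed values (the thermal resistance and power figures are illustrative, not measurements from the OP's card): steady-state core temperature above ambient scales roughly linearly with dissipated power.

```python
# Rough steady-state thermal model: core_temp = ambient + R_th * power.
# R_th and the power figures are assumptions for illustration only.
R_TH = 0.30      # deg C per watt, plausible for a GPU air cooler at a fixed fan speed
AMBIENT = 25.0   # deg C
P_STOCK = 183.0  # W, chosen so the stock case lands near the OP's ~80 C

def core_temp(power_w):
    return AMBIENT + R_TH * power_w

print(f"stock power:       {core_temp(P_STOCK):.1f} C")         # ~79.9 C
print(f"+6% (clock only):  {core_temp(P_STOCK * 1.06):.1f} C")  # ~83.2 C
print(f"+29% (with volts): {core_temp(P_STOCK * 1.29):.1f} C")  # ~95.8 C
```

A frequency-only bump moves the needle by a few degrees at most, easily masked by the fan curve or room temperature, while a voltage bump is what really sends temps climbing.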