
Power difference per MHz difference

Edrick

Golden Member
Is there a formula and/or easy way to tell the difference in wattage used during frequency scaling of a CPU?

For example, what is the power difference between an i5 750 and an i5 760, both running at stock speeds? 2.66GHz vs. 2.8GHz (assuming the same CPU stepping).
 
I will have to find a graph showing this, but the relationship is usually somewhat exponential as you start to hit the upper limits. That said, the differences between the 750 and 760 would be very small, likely a few watts at idle and a couple at load. You will not see a big difference with an i5/i7 build until you start to hit around 3.6-3.8GHz, where the power requirements really start to increase.
 
There is a frequency to power calculation, but I can't remember it. IDC probably knows it off the top of his head however.
 
The formula is something like

power = freq. * voltage^2 * capacitance

(Don't quote me on this formula. I'm probably missing a thing or two.)

I believe capacitance is a fixed value for a given processor.

It should be quite apparent that when we increase frequency, power consumption increases linearly (or close to it, who knows what other factors it affects).

Increasing voltage increases power consumption quadratically (with the square of the voltage), which would explain why increasing voltage is dangerous - with the increased power consumption, we also increase the heat the chip has to dissipate.
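To make the formula above concrete, here is a small sketch of P = C * V^2 * f. The capacitance and voltage values are made-up placeholders for illustration, not measured figures for these chips:

```python
def dynamic_power(capacitance_f, voltage_v, freq_hz):
    """Dynamic CPU power in watts: P = C * V^2 * f."""
    return capacitance_f * voltage_v ** 2 * freq_hz

# Assumed effective capacitance of 1e-8 F and 1.2 V for both chips:
p_750 = dynamic_power(1e-8, 1.2, 2.66e9)  # roughly 38 W
p_760 = dynamic_power(1e-8, 1.2, 2.80e9)  # roughly 40 W

# At fixed voltage, power scales linearly with frequency,
# so the ratio of the two is just the frequency ratio (2.80/2.66).
print(p_760 / p_750)
```

Note how the 140MHz bump only moves power by about 5% when the voltage stays put.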
 
Delta Power = Delta Clockspeed * (Delta Voltage)^2

I apologize for the poor format, but that's the formula.
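The delta formula above is easiest to use in ratio form: P2/P1 = (f2/f1) * (V2/V1)^2. A quick sketch (the shared 1.2 V is an assumed value, just to show the frequency term in isolation):

```python
def power_ratio(f1, v1, f2, v2):
    """Relative power change: P2/P1 = (f2/f1) * (v2/v1)^2."""
    return (f2 / f1) * (v2 / v1) ** 2

# i5 750 (2.66 GHz) -> i5 760 (2.80 GHz), same assumed voltage:
print(power_ratio(2.66, 1.2, 2.80, 1.2))  # about 1.053, i.e. ~5% more power
```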


EDIT: Beaten to the answer by the above post.
 
The power difference between "an" i5 750 and "an" i5 760 will be effectively nothing, since it depends mainly on the voltage a specific chip is running at (which can vary). They are typically binned for the same power envelope, though, and you can get energy-efficient chips (not necessarily i5s) that run at the same speed with lower voltages, because voltage is really what matters most (although frequency does have an impact).

Trying to distinguish between two similarly clocked processors from the same family with the same sort of binning process isn't worth it.

The best way is to look at overclocking guides/examples: they often show the maximum overclock for a single CPU at stock voltage, which demonstrates how power scales with frequency.

[graph1.jpg: power consumption vs. CPU frequency, stock voltage in one color, raised voltage in yellow]


Here you can see a >10% increase in CPU frequency (at stock volts) resulting in a less than 10% increase in power use on a CPU. If you take into account different voltages between CPUs, such a power discrepancy could easily be masked by a slower clocked CPU having a slightly higher stock voltage. When the voltage gets increased (the yellow part) it starts increasing power use more significantly.

http://www.anandtech.com/show/3742/intels-core-i5655k-core-i7875k-overclocked-and-analysed-/3
 
The formula is something like

power = freq. * voltage^2 * capacitance

(Don't quote me on this formula. I'm probably missing a thing or two.)

I believe capacitance is a fixed value for a given processor.

But variable depending on the workload. 🙂
 
Is there a formula and/or easy way to tell the difference in wattage used during frequency scaling of a CPU?

For example, what is the power difference between an i5 750 and an i5 760, both running at stock speeds? 2.66GHz vs. 2.8GHz (assuming the same CPU stepping).

Provided the operating temperatures are reasonably similar, the power-consumption will scale linearly with frequency unless you are also changing the voltage.

Old solid-state (legacy CMOS, pre-65nm) device physics held that dynamic power consumption scaled with the square of the voltage.

However, in practice, ever since 65nm we tend to observe that a CMOS device's power consumption scales with the cube of the voltage.

See this post and the thread it is in: http://forums.anandtech.com/showpost.php?p=27459192&postcount=20
[VccversusPowerConsumption.gif: measured power consumption vs. Vcc with fitted curve]


(Notice the pre-multipliers for both the square and linear terms are infinitesimally small; basically the general formula is y = m*x^3 + b.)
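A quick sketch of that cubic model, power ≈ m*Vcc^3 + b. The coefficients here are invented placeholders, not fitted values from the linked thread; the point is only how strongly the cubic term dominates:

```python
def power_from_vcc(vcc, m=60.0, b=5.0):
    """Approximate package power (W) as a cubic function of core voltage."""
    return m * vcc ** 3 + b

# Doubling the voltage multiplies the voltage-dependent term by 2^3 = 8:
low = power_from_vcc(1.0) - 5.0   # subtract the constant term b
high = power_from_vcc(2.0) - 5.0
print(high / low)  # 8.0
```

Compare that with the square law: under the old model, doubling voltage would only have quadrupled the voltage-dependent power.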
 
Adding to that: every individual chip will use a slightly different amount of power due to manufacturing imperfections, regardless of voltage or frequency.
 
Since they're binned according to the frequency achieved at TDP, both an i5 750 and i5 760 will have similar power usage at their native frequencies. Of course, an i5 760 will use less power at a given workload and speed than the i5 750.
 