how much power are you saving by not overclocking?


dflynchimp

Senior member
Apr 11, 2007
468
0
71
Believe me, if I had the wallet to upgrade every year or two I would. But after burning out a Pentium 4 805 within a year by running it at 3.8GHz 24/7, I've decided to take it easy on my Q6600, which has to last me a while, probably until Intel's next major architecture after Nehalem.
 

Gillbot

Lifer
Jan 11, 2001
28,830
17
81
Believe me, if I had the wallet to upgrade every year or two I would. But after burning out a Pentium 4 805 within a year by running it at 3.8GHz 24/7, I've decided to take it easy on my Q6600, which has to last me a while, probably until Intel's next major architecture after Nehalem.

I ran my Q6600 at its limit and it lasted quite a while. IMHO, if you are destroying hardware that easily by OC'ing, you are doing something wrong. Likely not enough prep on the board and inadequate cooling for the components.
 

mindless1

Diamond Member
Aug 11, 2001
8,756
1,761
136
Let me ask this: if, when you bought the CPU, they had offered the next higher model for the same price, would you have bought the higher-performance CPU instead? Most would, but in doing so you also increase power consumption, all else being equal. So it is a hard call whether doing so at no additional (product) cost is any worse than what others simply pay more for.

There's modest, average, and extreme overclocking. Many people sensitive to the power cost will simply refrain from much of a voltage increase and only push the clock speed within that power envelope, keeping in mind that when you don't need that performance level, the CPU can use power management and HALT/idle states to drop to a mere fraction of the previous power level - and it reaches idle sooner when it's running at higher performance from the overclock.

There is no simple answer to the question. Some people, by overclocking, will get demanding jobs done sooner, so the system goes to sleep or off sooner, or at least idles sooner as mentioned above. Others, by overclocking, will reach higher FPS in games, which makes their video card consume more power even if the video card isn't overclocked - it depends on the game and monitor used.

Some disable modern power management features under a true, or false, impression that it helps their overclock. Others will leave power management enabled to downclock and undervolt.

Then there is the question of why you are trying to save power. Because it's greener, because it costs less on the electric bill, because you can use a cheaper heatsink and lower fan RPM? Sometimes these things are true, but if you instead replace parts more frequently to keep a high-performance system, your cost rises - and making, advertising, distributing, selling, delivering, and testing that new set of parts isn't exactly an energy-free proposition either.

The simple answer is that there isn't one to a generic question. Get a power meter like a Kill-A-Watt and measure each individual system in any configuration you might consider running... though that will use power too. If you really want to save power, don't leave a mid-range-or-better system running any longer than it needs to.
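To put a wall-meter reading like that in dollar terms, here is a minimal sketch. The wattage figures and electricity rate are placeholder assumptions, not measurements from this thread:

```python
# Rough annual-cost estimate from a wall-meter (e.g. Kill-A-Watt) reading.
# All wattage figures and the $/kWh rate are made-up placeholders.

def annual_cost(watts, hours_per_day, rate_per_kwh=0.12):
    """Electricity cost per year for a load drawn `hours_per_day` daily."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

stock = annual_cost(150, 8)        # hypothetical stock system, 8 h/day
overclocked = annual_cost(180, 8)  # same system overclocked, 8 h/day

print(f"stock: ${stock:.2f}/yr, overclocked: ${overclocked:.2f}/yr, "
      f"difference: ${overclocked - stock:.2f}/yr")
```

Even a 30W difference at the wall works out to only about ten dollars a year under these assumptions, which is why measuring your own system and rates matters more than any generic answer.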
 

dflynchimp

Senior member
Apr 11, 2007
468
0
71
Well, if we were focused on power savings and nothing else, we'd all be running netbooks or not using computers at all.

The ideal balance is to have a system powerful enough for whatever you need from it (e.g. games at high quality) while not spending a penny more than necessary on either hardware or overclocking power costs. Longevity also comes into it: how long you can stretch the system until the next upgrade cycle.


That balance however is different for everyone.

Overclocking saves you money in the short term, since you get more performance than you pay for, but eventually, given enough time, the difference in cost will be cancelled out by the extra power consumption.
 

lothar

Diamond Member
Jan 5, 2000
6,674
7
76
That article seems a bit moronic.
I've never seen anyone require a Vcore of 1.5V to reach 4GHz; most people need 1.20-1.30V at most.
Hell, not even Lynnfield requires anywhere near that to reach 4GHz.

The question I have is...What the heck are they measuring?

I'd guess they were testing using a C0 chip.
I'd like to see the article re-done on a D0 one.

Yes, I'm quoting myself.
 

mindless1

Diamond Member
Aug 11, 2001
8,756
1,761
136
Well, if we were focused on power savings and nothing else, we'd all be running netbooks or not using computers at all.

The ideal balance is to have a system powerful enough for whatever you need from it (e.g. games at high quality) while not spending a penny more than necessary on either hardware or overclocking power costs. Longevity also comes into it: how long you can stretch the system until the next upgrade cycle.


That balance however is different for everyone.

Overclocking saves you money in the short term, since you get more performance than you pay for, but eventually, given enough time, the difference in cost will be cancelled out by the extra power consumption.

This is what I thought your agenda was, and as I briefly mentioned, it is not true in many, many cases.

It is not overclocking that wastes power; it is individual choices. As mentioned, an overclocked system can also throttle (downclock and undervolt), so the calculations for the additional voltage are only valid when the system runs at full throttle, and it only runs at full throttle until it finishes whatever task demanded it.

In that case, the system is done sooner and sleeps sooner.

In fact, anyone thinking about power cost as well as power usage will be far ahead with conservative overclocking than without it. The reason is simple: power isn't THAT expensive, and any money saved earns interest.

I have to suspect this was either a troll post or you are doing what green-heads do: ignoring all the facts and focusing on only the simplest of ideas. Funny how you so easily and deliberately ignored my question about buying a higher-tiered non-overclocked product and tried to counter with netbooks; by ignoring it you showed your agenda.

The truth is, contrary to your claim that it cancels out "given enough time," it is the exact opposite. It starts out as an equal tradeoff if you set your usage and power management properly, and over time the savings in cost and power only grow.

Remember, overclocking a CPU so it gets 30% more performance doesn't come close to causing 30% higher total system energy usage in almost all cases. If all you do is low-performance things, by all means buy a low-performance CPU like an Atom, but that doesn't cover the cases where the rest of us do need more performance, nor the fact that even an overclocked CPU can downclock and undervolt to where the difference is not TDP-sized but barely perceptible.

Please study the science more before you come here pretending to ask in order to preach, because most of us have been there and seen that before.

Bottom line, as I already mentioned: if you really want to save power while still having the performance level you need, turn off the system when you are finished.

Now a hint: if your overclocking causes a job to finish 30% sooner, at which point your system shuts off, you have SAVED power versus running the whole system longer, because it isn't just the CPU that consumes power; on the contrary, the CPU is usually less than half of total system power.
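That race-to-idle arithmetic can be sketched with placeholder numbers (none of these wattages are measurements from this thread):

```python
# Race-to-idle sketch with made-up wattages: an overclocked CPU draws more
# while working but finishes sooner, while the rest of the system (board,
# drives, GPU, fans) draws roughly the same the whole time it's on.

def job_energy_wh(cpu_watts, other_watts, hours):
    """Total system energy in watt-hours for a job of the given length."""
    return (cpu_watts + other_watts) * hours

stock_wh = job_energy_wh(cpu_watts=95, other_watts=120, hours=1.0)
# Overclocked: ~30% faster, so the job takes 1/1.3 of the time,
# at a higher CPU draw while it runs.
oc_wh = job_energy_wh(cpu_watts=125, other_watts=120, hours=1.0 / 1.3)

print(f"stock: {stock_wh:.0f} Wh, overclocked: {oc_wh:.0f} Wh")
```

Under these assumed numbers the overclocked run uses less total energy, because the fixed draw of everything besides the CPU is on for 30% less time; the comparison flips only if the extra CPU draw outweighs that saving.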