
Electric bill savings on i5 vs i7

jinduy

Diamond Member
Has anyone done the math on how much one saves each year assuming you keep the machine up 24x6 or so?

i think the i7 runs at 130w and the i5 is 90w.

i recall reading one place where they said the average savings on the i5 vs i7 is like $30... which is piddly over the course of the year.
 
Has anyone done the math on how much one saves each year assuming you keep the machine up 24x6 or so?

i think the i7 runs at 130w and the i5 is 90w.

How much energy does the computer use when in sleep mode? Is there a difference in that when comparing Core i5 and Core i7?
 
TDPs of 130W and 90W are for full load. I doubt that you will keep your machine fully loaded 24/7 (unless you fold, in which case power consumption isn't important any more). So what you need to compare are the idle power consumptions, plus the TDPs for when you actually use your machine for heavy-duty stuff.
 
okay i suck at math but here's what i found from http://michaelbluejay.com/electricity/cost.html

i didn't read too closely so i probably made silly mistakes:

avg kWh (kilowatt-hour) cost in california is 12 cents

so 30 W × (1 kW / 1000 W) × ($0.12/kWh) × 24 hours × 365 days = $31.54 per year

i suppose if you keep your computer for over 4 or 5 years, then it becomes worth your time
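For anyone who wants to sanity-check that figure, here's the same arithmetic as a short Python script (the 30 W difference and 12 cents/kWh rate are the figures assumed in the post above):

```python
# Annual cost of a constant 30 W draw at the post's assumed rate.
watts = 30
rate_per_kwh = 0.12          # dollars per kilowatt-hour
hours_per_year = 24 * 365

annual_cost = watts / 1000 * hours_per_year * rate_per_kwh
print(f"${annual_cost:.2f} per year")  # → $31.54 per year
```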
 
okay i suck at math but here's what i found from http://michaelbluejay.com/electricity/cost.html

i didn't read too closely so i probably made silly mistakes:

avg kWh (kilowatt-hour) cost in california is 12 cents

so 30 W × (1 kW / 1000 W) × ($0.12/kWh) × 24 hours × 365 days = $31.54 per year

i suppose if you keep your computer for over 4 or 5 years, then it becomes worth your time

this isn't accurate: the 30W figure is the difference under full load, but most people keep the system idle most of the time. I don't think the idle difference is as great. but even with this overestimate it's clear the power saving might not be that great unless you use the computer quite a lot every day. plus you've got other factors; i7 920s can OC a little further on lower voltages, etc.
 
this isn't accurate: the 30W figure is the difference under full load, but most people keep the system idle most of the time. I don't think the idle difference is as great.

No, 30 watts is the idle difference between Core i5 750 and Core i7 920 (according to Anandtech tests)
 
24/6? is this a typo or do you turn off your computer one day a week?

1 W × (1 kW / 1000 W) × 24 hours/day × 365 days/year × $0.07/kWh = $0.6132/year @ 1 watt @ 7 cents per kWh.
@ 6 days a week you want to multiply it by 6/7 to get $0.5256/year @ 1 watt @ 7 cents per kWh.

many people pay more for their electricity though... at a more realistic 11 cents per kWh you do:
1 W × (1 kW / 1000 W) × 24 hours/day × 365 days/year × $0.11/kWh = $0.9636/year @ 1 watt @ 11 cents per kWh.
$1.0512/year @ 1 watt @ 12 cents per kWh.

A good rough estimate for those unwilling to do the math is $1 per watt per year for an always-on (24/7) device... I do the math before every purchase.
actually... 1.0512 / 12 = 0.0876. That's the ratio you can use to build the following equation:

For an always-on device (24/7): yearly power cost in $/year = 0.0876 × E × P
where E = power in watts
and P = cost per kilowatt-hour in cents

so 30 watts in a place that costs 14 cents per kWh will give you: $/year = 0.0876 × 30 × 14 = 36.792 $/year
to convert this to a machine that is on 6 out of 7 days, multiply this value by 6/7
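The rule of thumb above can be wrapped in a small function; the 0.0876 constant is just 8760 hours/year divided by 1000 W/kW and 100 cents/$:

```python
def yearly_cost(watts, cents_per_kwh, days_per_week=7):
    """Rough yearly electricity cost in dollars for an always-on device.

    Uses the shortcut from the post: $/year = 0.0876 * watts * cents/kWh,
    scaled by how many days a week the machine is on.
    """
    return 0.0876 * watts * cents_per_kwh * days_per_week / 7

# 30 W at 14 cents/kWh, on 24/7 (matches the 36.792 figure above):
print(f"{yearly_cost(30, 14):.3f}")     # → 36.792
# Same machine on 6 of 7 days:
print(f"{yearly_cost(30, 14, 6):.3f}")
```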
 
Actually, TDP is a relatively good measure of power consumption because TDP itself was derived from application power usage. Intel, for example, used to show power usage for each speed grade, but moved to a TDP system.

I figure this is how TDP works.

130W TDP means that the highest speed of the "130W family" will use close to 130W. Those that aren't top clocks but still rated 130W are likely 130W top end parts binned to a lower clock.

If you guys ever took a look at the datasheets, you'll notice that the CPU with lower "Load TDP" will have lower power usage in all other power states as well. It is probably because within the same family(ie. Nehalem vs Nehalem, or Desktop CPU vs Desktop CPU), the power usage will have to be similarly lower everywhere to achieve lower TDP.
 
Actually, TDP is a relatively good measure of power consumption because TDP itself was derived from application power usage. Intel, for example, used to show power usage for each speed grade, but moved to a TDP system.

this is why I use consumption measured at the wall by independent third parties (aka anandtech and the like, or myself) to compare the wattage of various things.

TDP = Thermal Dissipate Power
It is a measurement of the size of the recommended heatsink / custom cooling for the device for manufacturers. Not a measurement of how much power it consumes.
 
need to also take into consideration power usage when overclocking. some CPUs have an overclocking threshold where they'll just start sucking tons of power.
 
Actually it's "Thermal Design Power"

😉

BTW, if you read the datasheets or look up more about TDP, it specifically says "maximum power an application can realistically dissipate". This point is actually why AMD marketing came up with ACP, claiming TDP isn't really max power; but since the Core 2, a CPU can only reach TDP with unrealistic code like a power virus, so TDP is max power.

Of course, you can look at it as "dissipated power" too, because power is related to heat dissipation.
 
Has anyone done the math on how much one saves each year assuming you keep the machine up 24x6 or so?

i think the i7 runs at 130w and the i5 is 90w.

i recall reading one place where they said the average savings on the i5 vs i7 is like $30... which is piddly over the course of the year.

it depends a whole lot on your marginal electricity rate: whether you're paying 30 cents a kilowatt-hour, vs. 10 cents, vs. 6 cents (e.g., in Canada).

for individual users with single computers, it does seem piddly.

if you're involved with computers for business ... it mounts up.
 
even for an individual user, saving $10 on a product only to pay $30 more in electricity per year seems silly.

For a single user this is probably moot; however, there is more to it than just the wattage difference. There is also a difference in heat output, and it's the heat output that drives the majority of the HVAC costs.
 
For a single user this is probably moot; however, there is more to it than just the wattage difference. There is also a difference in heat output, and it's the heat output that drives the majority of the HVAC costs.

a single person at home needs to do the same... typically it costs 3x the power to cool something... so extra 30 watt hours used probably means an additional ~90 watt hours on air conditioning.

20$ a year might be peanuts, but I am cheap... and many of the people making buying decisions are as cheap as me.
 
i'll turn off an extra light bulb instead of going down to an i5, thank you.

Or i'll have my TV off instead of both at the same time when i'm on my computer.

Although i'm a power hog, and i admit it..
 
Firstly, why are you comparing the 750 to the 920? Those aren't even in the same league: different chipset, different memory system, different target audience. Compare the 750 to something like an 860, and there's only a few watts' difference in idle power consumption.

Comparing the load difference, it depends on how well the load scales with HT. If the load scales well, then for a given amount of work the 860 and 870 are actually more power efficient than the 750 because they will finish the work unit faster.

http://www.techreport.com/articles.x/17545/13
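The energy-per-task point is worth making concrete. A sketch with made-up wattages and times (not measured figures): if HT lets a chip finish the same job sooner, it can use less total energy despite a higher load draw.

```python
def energy_wh(load_watts, seconds):
    """Energy in watt-hours for a task running at a given load draw."""
    return load_watts * seconds / 3600

# Illustrative numbers only: the HT chip draws 20 W more under load
# but finishes the same work unit 30% sooner.
without_ht = energy_wh(120, 100)  # 120 W for 100 s
with_ht = energy_wh(140, 70)      # 140 W for 70 s
print(f"{without_ht:.2f} Wh vs {with_ht:.2f} Wh")  # → 3.33 Wh vs 2.72 Wh
```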
 
or you can do BOTH.
this is like "smoking isn't bad for me because there is air pollution". no, it is still bad.

that's like telling me to stop breathing but don't pass out or die.
😱
 
a single person at home needs to do the same... typically it costs 3x the power to cool something... so extra 30 watt hours used probably means an additional ~90 watt hours on air conditioning.

20$ a year might be peanuts, but I am cheap... and many of the people making buying decisions are as cheap as me.

I understand that it happens for one PC vs a whole office.

You have to be pretty nuts to base a CPU purchase on saving $20 or so a year.
 
I understand that it happens for one PC vs a whole office.

You have to be pretty nuts to base a CPU purchase on saving $20 or so a year.
no kidding. that makes very little difference in the whole scheme of things for an individual user. if someone is going to over-analyze such a minuscule power saving between CPUs, I would hate to see how they are in other aspects of their life.
 