Electric bill savings on i5 vs i7

jinduy

Diamond Member
Jan 24, 2002
4,781
1
81
Has anyone done the math on how much one saves each year, assuming you keep the machine up 24x6 or so?

I think the i7 runs at 130 W and the i5 at 90 W.

I recall reading somewhere that the average savings of the i5 vs. the i7 is something like $30... which is piddly over the course of a year.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
Has anyone done the math on how much one saves each year, assuming you keep the machine up 24x6 or so?

I think the i7 runs at 130 W and the i5 at 90 W.

How much energy does the computer use when in sleep mode? Is there a difference in that when comparing Core i5 and Core i7?
 

Jd007

Senior member
Jan 1, 2010
207
0
0
TDPs of 130 W and 90 W are for full load. I doubt you will keep your machine fully loaded 24/7 (unless you fold, in which case power consumption isn't important anymore). So what you need to compare are the idle power consumptions, plus the TDPs for the hours when you actually use your machine for heavy-duty stuff.
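For a rough idea of what that mixed idle/load estimate looks like, here's a quick Python sketch (every wattage, hour count and rate below is a made-up placeholder, not a measured number):

# rough duty-cycle estimate: idle most of the day, loaded a few hours
idle_watts, load_watts = 80, 200      # hypothetical whole-system numbers
idle_hours, load_hours = 20, 4        # hours per day in each state
cents_per_kwh = 12

daily_kwh = (idle_watts * idle_hours + load_watts * load_hours) / 1000
yearly_cost = daily_kwh * 365 * cents_per_kwh / 100
print(f"~${yearly_cost:.0f}/year")    # ~$105 with these made-up numbers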
 

jinduy

Diamond Member
Jan 24, 2002
4,781
1
81
Okay, I suck at math, but here's what I found from http://michaelbluejay.com/electricity/cost.html

I didn't read too closely, so I probably made silly mistakes:

avg kWh (kilowatt-hour) cost in California is 12 cents

so 30 W * (1 kW / 1000 W) * ($0.12/kWh) * 24 hours * 365 days = $31.54 per year

I suppose if you keep your computer for over 4 or 5 years, then it becomes worth your while.
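If anyone wants to sanity-check that arithmetic, here's the same calculation as a quick Python sketch (the 30 W delta and the 12 cents/kWh rate are just the assumptions above, not measured numbers):

# rough sketch: yearly cost of a constant extra power draw
watts_difference = 30                # assumed i7 vs. i5 difference, in watts
cents_per_kwh = 12                   # assumed California average rate
hours_per_year = 24 * 365

kwh_per_year = watts_difference / 1000 * hours_per_year    # 262.8 kWh
dollars_per_year = kwh_per_year * cents_per_kwh / 100      # ~$31.54

print(f"${dollars_per_year:.2f}/year")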
 

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
Okay, I suck at math, but here's what I found from http://michaelbluejay.com/electricity/cost.html

I didn't read too closely, so I probably made silly mistakes:

avg kWh (kilowatt-hour) cost in California is 12 cents

so 30 W * (1 kW / 1000 W) * ($0.12/kWh) * 24 hours * 365 days = $31.54 per year

I suppose if you keep your computer for over 4 or 5 years, then it becomes worth your while.

This isn't accurate. The 30 W difference is the difference at full load, but most people keep the system idle most of the time, and I don't think the idle difference is as great. Even with this overestimate, though, it's clear the power saving might not amount to much unless you use the computer quite a lot every day. Plus you've got other factors: i7 920s can OC a little further on lower voltages, etc.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
This isn't accurate. The 30 W difference is the difference at full load, but most people keep the system idle most of the time, and I don't think the idle difference is as great.

No, 30 watts is the idle difference between the Core i5 750 and the Core i7 920 (according to AnandTech tests).
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
24/6? is this a typo or do you turn off your computer one day a week?

1 watt * (1 kilowatt / 1000 watts) * 24 hours/day * 365 days/year * $0.07/kilowatt-hour = $0.6132/year @ 1 watt @ 7 cents per kWh.
At 6 days a week you multiply that by 6/7 to get $0.5256/year @ 1 watt @ 7 cents per kWh.

Many people pay more for their electricity, though... at a more realistic 11 cents per kWh you get:
1 watt * (1 kilowatt / 1000 watts) * 24 hours/day * 365 days/year * $0.11/kilowatt-hour = $0.9636/year @ 1 watt @ 11 cents per kWh.
$1.0512/year @ 1 watt @ 12 cents per kWh.

A good rough estimate for those unwilling to do the math is $1 per watt per year for an always-on (24/7) device... I do the math before every purchase.
Actually... 1.0512 / 12 = 0.0876. That's the ratio you can use to build the following equation:

For an always-on device (24/7): yearly power cost in $/year = 0.0876 * E * P
where E = power in watts
and P = cost per kilowatt-hour in cents

So 30 watts in a place that costs 14 cents per kWh gives you: $/year = 0.0876 * 30 * 14 = $36.79/year.
To convert this to a machine that is on 6 out of 7 days, multiply that value by 6/7.
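Here is that same rule of thumb as a small Python sketch (the function name and defaults are mine, nothing standard):

def yearly_power_cost(watts, cents_per_kwh, days_per_week=7):
    """Rough yearly electricity cost of a constant load,
    using the 0.0876 * watts * cents rule of thumb above."""
    return 0.0876 * watts * cents_per_kwh * days_per_week / 7

print(yearly_power_cost(30, 14))      # ~36.79 $/year for an always-on device
print(yearly_power_cost(30, 14, 6))   # ~31.54 $/year if it's off one day a week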
 
Last edited:

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Actually, TDP is a relatively good measure of power consumption, because TDP itself was derived from application power usage. Intel, for example, used to show power usage for each speed grade, but moved to a TDP system.

I figure this is how TDP works.

A 130 W TDP means that the highest-clocked part of the "130 W family" will use close to 130 W. Parts that aren't top clocks but are still rated 130 W are likely 130 W top-end parts binned to a lower clock.

If you ever take a look at the datasheets, you'll notice that a CPU with a lower load TDP will have lower power usage in all other power states as well. That is probably because, within the same family (i.e. Nehalem vs. Nehalem, or desktop CPU vs. desktop CPU), the power usage has to be similarly lower everywhere to achieve the lower TDP.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Actually, TDP is a relatively good measure of power consumption, because TDP itself was derived from application power usage. Intel, for example, used to show power usage for each speed grade, but moved to a TDP system.

This is why I use measured consumption at the wall done by independent third parties (AnandTech and the like, or myself) to compare the wattage of various things.

TDP = Thermal Dissipate Power
It is a measurement, for manufacturers, of the size of the recommended heatsink / custom cooling for the device, not a measurement of how much power it consumes.
 

Udgnim

Diamond Member
Apr 16, 2008
3,683
124
106
You also need to take power usage when overclocking into consideration. Some CPUs have an overclocking threshold past which they'll just start sucking down tons of power.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Actually it's "Thermal Design Power".

;)

BTW, if you read the datasheets more or look up more about TDP, it specifically says "maximum power an application can realistically dissipate". This point is actually why AMD marketing came up with ACP, saying TDP isn't really max power; but since the Core 2, a CPU can only reach TDP with unrealistic code like a power virus, so in practice TDP is max power.

Of course, you can look at it as "dissipated power" too, because the power consumed is related to the heat dissipated.
 

bradley

Diamond Member
Jan 9, 2000
3,671
2
81
You could also try undervolting the chip at idle and load for extra power savings.
 

wwswimming

Banned
Jan 21, 2006
3,695
1
0
Has anyone done the math on how much one saves each year, assuming you keep the machine up 24x6 or so?

I think the i7 runs at 130 W and the i5 at 90 W.

I recall reading somewhere that the average savings of the i5 vs. the i7 is something like $30... which is piddly over the course of a year.

It depends a whole lot on your marginal electricity rate: paying 30 cents a kilowatt-hour vs. 10 cents vs. 6 cents (e.g. in Canada) makes a big difference.

For individual users with single computers, it does seem piddly.

If you're involved with computers for business... it mounts up.
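As a quick illustration of how much the rate matters, here's the same 30 W difference plugged into taltamir's 0.0876 rule of thumb at a few example rates (the rates are just examples, not anyone's actual tariff):

# yearly cost of a constant 30 W difference at a few example electricity rates
for cents_per_kwh in (6, 10, 30):
    cost = 0.0876 * 30 * cents_per_kwh          # $/year for an always-on device
    print(f"{cents_per_kwh} cents/kWh -> ${cost:.2f}/year")
# 6 -> $15.77, 10 -> $26.28, 30 -> $78.84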
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Even for an individual user, saving $10 on a product only to pay $30 more in electricity per year seems silly.
 

alkemyst

No Lifer
Feb 13, 2001
83,769
19
81
Even for an individual user, saving $10 on a product only to pay $30 more in electricity per year seems silly.

For a single user this is probably fodder; however, there is more to it than just the wattage difference. There is also a difference in heat output, and it's the heat output that accounts for the majority of the money in HVAC costs.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
For a single user this is probably fodder; however, there is more to it than just the wattage difference. There is also a difference in heat output, and it's the heat output that accounts for the majority of the money in HVAC costs.

A single person at home needs to do the same math... a typical air conditioner removes roughly 3 watts of heat per watt of electricity it draws, so an extra 30 watt-hours of heat probably means roughly an additional ~10 watt-hours of air conditioning on top.

$20 a year might be peanuts, but I am cheap... and many of the people making buying decisions are as cheap as me.
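For what it's worth, here's that combined arithmetic as a rough Python sketch (the 30 W delta, the 12 cents/kWh rate and the A/C efficiency of ~3 are all assumptions for illustration, not measurements):

# rough sketch: extra yearly cost of a 30 W difference, including A/C overhead
extra_watts = 30         # assumed idle difference between the two CPUs
cents_per_kwh = 12       # assumed electricity rate
cooling_cop = 3.0        # assumed A/C efficiency: ~3 W of heat removed per W of electricity

ac_watts = extra_watts / cooling_cop        # ~10 W of extra draw at the air conditioner
total_watts = extra_watts + ac_watts        # ~40 W total
dollars_per_year = total_watts / 1000 * 24 * 365 * cents_per_kwh / 100

print(f"~${dollars_per_year:.0f}/year")     # ~$42, and only if the A/C runs whenever the PC does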
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,131
3,667
126
I'll turn off an extra light bulb instead of going down to an i5, thank you.

Or I'll have my TV off instead of both at the same time when I'm on my computer.

Although I'm a power hog, and I admit it...
 
Last edited:

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Firstly, why are you comparing the 750 to the 920? Those aren't even in the same league: different chipset, different memory system, different target audience. Compare the 750 to something like an 860, and there's only a few watts of idle power consumption difference.

Comparing the load difference, it depends on how well the workload scales with HT. If it scales well, then for a given amount of work the 860 and 870 are actually more power-efficient than the 750, because they finish the work unit faster.

http://www.techreport.com/articles.x/17545/13
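To make the energy-per-unit-of-work point concrete, here's a tiny Python sketch; the wattages and runtimes below are made-up illustrative numbers, not figures from the TechReport article:

# illustrative only: a faster, higher-wattage chip can still use less energy per work unit
def energy_wh(load_watts, seconds):
    """Energy in watt-hours to finish one work unit at a given load power."""
    return load_watts * seconds / 3600

no_ht   = energy_wh(load_watts=120, seconds=100)  # hypothetical: 120 W for 100 s -> ~3.33 Wh
with_ht = energy_wh(load_watts=150, seconds=70)   # hypothetical: 150 W, HT finishes in 70 s -> ~2.92 Wh

print(no_ht, with_ht)  # the "hotter" chip wins on energy if the workload scales with HT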
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,131
3,667
126
Or you can do BOTH.
This is like "smoking isn't bad for me because there is air pollution". No, it is still bad.

That's like telling me to stop breathing but don't pass out or die.
:eek:
 

alkemyst

No Lifer
Feb 13, 2001
83,769
19
81
A single person at home needs to do the same math... a typical air conditioner removes roughly 3 watts of heat per watt of electricity it draws, so an extra 30 watt-hours of heat probably means roughly an additional ~10 watt-hours of air conditioning on top.

$20 a year might be peanuts, but I am cheap... and many of the people making buying decisions are as cheap as me.

I understand that it happens for one PC as well as a whole office.

You have to be pretty nuts to base a CPU purchase on saving $20 or so a year.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I understand that it happens for one PC as well as a whole office.

You have to be pretty nuts to base a CPU purchase on saving $20 or so a year.
No kidding. That makes very little difference in the whole scheme of things for an individual user. If someone is going to over-analyze such a minuscule power saving between CPUs, I would hate to see how they are in other aspects of their life.
 

jinduy

Diamond Member
Jan 24, 2002
4,781
1
81
So I'm still wondering: why do people bother caring about power consumption on a CPU at all?
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
So I'm still wondering: why do people bother caring about power consumption on a CPU at all?

It saves money; you just wanted to know how much. You asked a good question.
 
Last edited: