how much energy does a computer use?

apac

Diamond Member
Apr 12, 2003
6,212
0
71
I've left mine on 24/7 for quite some time now but have never paid the utilities bill for it. Last month (May) I moved in around the 10th and have basically left the computer running since. That month's utilities bill was $46 each, split 5 ways; this month's was $70 EACH, split 6 ways!! We looked into it and apparently our energy use is through the roof.

So my question is: how much power does a computer use if it's left on (and working) all the time? I run DC++ so it never goes on standby; only the monitor turns off. It's an XP 2500+ with a 350W PSU.
 

Wallydraigle

Banned
Nov 27, 2000
10,754
1
0
Originally posted by: apac
I've left mine on 24/7 for quite some time now but have never paid the utilities bill for it. Last month (May) I moved in around the 10th and have basically left the computer running since. That month's utilities bill was $46 each, split 5 ways; this month's was $70 EACH, split 6 ways!! We looked into it and apparently our energy use is through the roof.

So my question is: how much power does a computer use if it's left on (and working) all the time? I run DC++ so it never goes on standby; only the monitor turns off. It's an XP 2500+ with a 350W PSU.



350W?
 

beer

Lifer
Jun 27, 2000
11,169
1
0
Originally posted by: lirion
350W?

No, switching power supplies only draw as much power as needed; they aren't linear. You probably draw 250W on average.

Power is about eleven cents per kilowatt-hour, on average. Run a 250W PC for four hours and you've consumed 1 kilowatt-hour. So it costs roughly 2.7 cents an hour to run.
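A quick sketch of that arithmetic in Python; the 250W average draw and eleven-cent rate are this thread's assumed figures, not measurements:

# Rough hourly running cost; 250 W and $0.11/kWh are the thread's example numbers.
avg_draw_watts = 250
rate_per_kwh = 0.11  # dollars per kilowatt-hour

kwh_per_hour = avg_draw_watts / 1000         # 0.25 kWh used each hour
cost_per_hour = kwh_per_hour * rate_per_kwh  # ~$0.0275
print(f"{cost_per_hour * 100:.2f} cents per hour")  # -> 2.75 cents per hour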
 

beer

Lifer
Jun 27, 2000
11,169
1
0
Originally posted by: brigden
I think the average PC uses about $11 worth of energy a year.

Not even close. Probably closer to $15 a month if your nominal draw is 250W.
 

apac

Diamond Member
Apr 12, 2003
6,212
0
71
Originally posted by: beer
Originally posted by: lirion
350W?

No, switching power supplies only draw as much power as needed; they aren't linear. You probably draw 250W on average.

Power is about eleven cents per kilowatt-hour, on average. Run a 250W PC for four hours and you've consumed 1 kilowatt-hour. So it costs roughly 2.7 cents an hour to run.

OK, so doing the math... at 250W that's 6 kWh a day, so roughly 70 cents a day to keep my computer on without power saving. Thanks, that means our ginormous electricity bill is from something else... any ideas?
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
OK, a quick calculation at full load: 350 W x 24 hours x 30 days / 1000 (watts -> kWh) x $0.06 (dollars per kWh) comes out to $15.12, which is the absolute maximum. Since a computer won't constantly use all 350 W, the actual amount would be lower. Replace the energy price in the equation with the cost of energy in your area to get a better estimate.
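The same worst-case calculation as a small Python sketch; the $0.06/kWh here is just the example rate from the post above, so substitute your own:

# Worst-case monthly cost if the PSU ran flat out at its full rating.
psu_rating_watts = 350
hours = 24 * 30         # one month of 24/7 uptime
rate_per_kwh = 0.06     # example rate from above; use your local price

kwh_per_month = psu_rating_watts * hours / 1000  # 252 kWh
max_cost = kwh_per_month * rate_per_kwh          # $15.12, the absolute ceiling
print(f"{kwh_per_month:.0f} kWh -> ${max_cost:.2f} at most")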
 

Wallydraigle

Banned
Nov 27, 2000
10,754
1
0
Originally posted by: apac
Originally posted by: beer
Originally posted by: lirion
350W?

No, switching power supplies only draw as much power as needed; they aren't linear. You probably draw 250W on average.

Power is about eleven cents per kilowatt-hour, on average. Run a 250W PC for four hours and you've consumed 1 kilowatt-hour. So it costs roughly 2.7 cents an hour to run.

OK, so doing the math... at 250W that's 6 kWh a day, so roughly 70 cents a day to keep my computer on without power saving. Thanks, that means our ginormous electricity bill is from something else... any ideas?



Are you running an air conditioner?
 

RIGorous1

Platinum Member
Oct 26, 2002
2,053
0
71
I'm gonna take a stab at it, but I might be partially wrong, so don't rely on this:

Well, since you have a 350W PSU, let's make the following assumptions:

Draw depends on usage, and output cannot exceed 350W (assuming that's a true rating). So let's assume your rig pulls a constant 300W if left on at near-full usage (this will vary by app).

So at a constant 300 watts:
24 h x 300 W = 7.2 kilowatt-hours per day
30 days x 7.2 kWh = 216 kWh per month

Multiply this by how much your utility charges per kWh (it should be on your bill) and you have your total per month. Again, this is a ballpark number resting on assumptions that may be untrue or imprecise. And it's for the computer only; peripherals such as the monitor, printer, etc. are not factored in.
 
Aug 23, 2000
15,509
1
81
Really depends on what is in it: how many HDDs, fans, etc. The 350W PSU means it maxes out putting out 350 watts; a watt is a rate of draw, not an amount per hour, so running at 350 W for an hour uses 0.35 kWh.
So if you average it at that extreme you are using 252 kWh a month. I pay $0.10 a kWh, so that would be $25.20 for the computer. If you want to save electricity, turn the temp down on your water heater, turn the minimum temp on your A/C up 2 degrees, and raise the temp in your freezer/fridge; the "warmest" setting is still enough to keep food cold. Also, turn off lights that don't need to be on.
 

beer

Lifer
Jun 27, 2000
11,169
1
0
Originally posted by: apac
OK, so doing the math... at 250W that's 6 kWh a day, so roughly 70 cents a day to keep my computer on without power saving. Thanks, that means our ginormous electricity bill is from something else... any ideas?

Your standard oven draws about 6 kW, but it isn't on much. In most households outside the north, your largest expense is going to be AC. While it varies greatly across regions, a 9000 BTU system (3/4 ton) draws something like 1 kW, and a 9000 BTU system will only be effective, if I recall correctly, at cooling two rooms in a hot climate. I think the system in my house is something like 3.5 kW and runs for about six hours a day. This is in Texas, though, so I figure it costs about $70 a month or so, just by itself.

Your largest power consumers are ovens, water heaters, and probably ACs.
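For comparison, here's that AC estimate worked out in Python; the 3.5 kW draw, six hours a day, and eleven-cent rate are the rough figures from this post and earlier in the thread:

# Ballpark monthly AC cost vs. the PC, using this thread's assumed numbers.
ac_draw_kw = 3.5      # rough whole-house system draw
hours_per_day = 6
rate_per_kwh = 0.11   # average rate quoted earlier

monthly_kwh = ac_draw_kw * hours_per_day * 30  # 630 kWh
monthly_cost = monthly_kwh * rate_per_kwh      # ~$69, close to the ~$70 guess
print(f"AC: {monthly_kwh:.0f} kWh/month -> ${monthly_cost:.2f}")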
 

thraxes

Golden Member
Nov 4, 2000
1,974
0
0
Nope... 350 is peak wattage, the maximum the PSU can handle.

A loaded PC (P4 3GHz, R9800, 2 HDDs, DVD burner, CD burner, 3 PCI cards) running light loads (only doing P2P or light server work) will pull about 170-200 watts.
Add another 50 Watts if the system is doing distributed computing work that keeps the CPU at 100%.
 

David101

Member
Jul 13, 2003
69
0
66
Aren't power supplies only like 70% efficient, so they take more AC power in than their rating? (like 500W in to deliver the full 350W rating)
 

RobCur

Banned
Oct 4, 2002
3,076
0
0
Originally posted by: apac
Originally posted by: beer
Originally posted by: lirion
350W?

No, switching power supplies only draw as much power as needed; they aren't linear. You probably draw 250W on average.

Power is about eleven cents per kilowatt-hour, on average. Run a 250W PC for four hours and you've consumed 1 kilowatt-hour. So it costs roughly 2.7 cents an hour to run.

OK, so doing the math... at 250W that's 6 kWh a day, so roughly 70 cents a day to keep my computer on without power saving. Thanks, that means our ginormous electricity bill is from something else... any ideas?

You're forgetting the router, the monitor, and any other peripherals not already mentioned.
 

jagec

Lifer
Apr 30, 2004
24,442
6
81
Originally posted by: beer
Your largest power consumers are ovens, water heaters, and probably ACs.

Dishwashers and dryers as well, though obviously these aren't on as much.

Heating in the winter is HUGE of course.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: David101
Aren't power supplies only like 70% efficient, so they take more AC power in than their rating? (like 500W in to deliver the full 350W rating)

Yes, they're only ~70% efficient, so even if the components only draw 250W, the actual draw at the wall would be around 357W (250W / 0.7). But that's still probably not enough to up the bill that much.
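A small Python sketch of that efficiency math; the 250W component draw and ~70% efficiency are just this exchange's round numbers:

# Wall draw = DC output / PSU efficiency (both figures are rough assumptions).
component_draw_watts = 250
psu_efficiency = 0.70

wall_draw_watts = component_draw_watts / psu_efficiency  # ~357 W at the outlet
print(f"~{wall_draw_watts:.0f} W drawn from the wall")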
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,402
8,574
126
Originally posted by: jagec
Originally posted by: beer
Your largest power consumers are ovens, water heaters, and probably ACs.

Dishwashers and dryers as well, though obviously these aren't on as much.

Heating in the winter is HUGE of course.

not anywhere near as expensive as cooling in the summer, at least here.

hell, you can almost not heat in the winter at all here

good old texas.


the computers in my room will heat the room significantly if the door is closed. i suspect the biggest culprits are the 19" Trinitrons.
 

StageLeft

No Lifer
Sep 29, 2000
70,150
5
0
Originally posted by: ElFenix
the computers in my room will heat the room significantly if the door is closed. i suspect the biggest culprits are the 19" Trinitrons.
Before I had central air my computer with a 19" monitor used to raise the temp of the room it was in by about 10 degrees :(
 

vegetation

Diamond Member
Feb 21, 2001
4,270
2
0
Buy a "Kill a watt" meter (look on ebay, they are cheapest there) if you want to get the real deal on how many watts something is using. Keep the meter attached all day and it will give you an average reading as well. Takes the bite out of guessing games on how much electricity something is really using. My shuttle cube with a celeron 2.0 uses an idle load of 50w; on full activity it jumps to 90 something watts. My dell P3 laptop uses something really low, like 20 some watts on idle, even with the 15" lcd on. The meter will also allow you to plug in the big gun appliances like refrigerators and air conditioners (115v only). You'll gasp seeing how much power those things take up..
 

jagec

Lifer
Apr 30, 2004
24,442
6
81
Originally posted by: ElFenix

not anywhere near as expensive as cooling in the summer, at least here.

hell, you can almost not heat in the winter at all here

good old texas.

lol....one of the nice things about Seattle is you just need a bit of heating in the winter, and you don't really need AC at all in the summer.
 

Amused

Elite Member
Apr 14, 2001
57,348
19,517
146
Guys, an idle PC uses less than 100W... if that... so long as the monitor is set to turn off automatically.