How much of a burden is a PC on an electric bill?

MetalMat

Diamond Member
Jun 14, 2004
9,687
36
91
Out of curiosity, if you were to keep a computer on for a month, what kind of wattage would it use? Let's just assume a normal computer used at an office (2.6GHz IBM, for example). Also, for right now, assume there is no monitor attached to it.
 

tfinch2

Lifer
Feb 3, 2004
22,114
1
0
WTF? It should be calculated with a monitor. I don't know many people who use a HOME PC without a monitor. Maybe a file server, but their main PC with no monitor...okay...
 

V00D00

Golden Member
May 25, 2003
1,834
0
0
Seems like the monitor would be consuming just about as much as the box for most people.

I don't think you can generalize it because it depends a lot on your setup.
 
Nov 17, 2004
911
0
0
For my area, if I'm running a 400-watt something 24/7 it adds around 25 bucks a month to my bill. Adjust accordingly. If you really want to know, look at your electric bill and it'll tell you the price you pay per kilowatt-hour, then just bust out some of your sick math skillz.
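The math in that post can be sketched in a few lines. The $0.0868/kWh rate below is back-solved from the "400 W 24/7 costs about $25/month" figure above; it is an illustrative assumption, not a quoted rate.

```python
# Watts -> kilowatt-hours -> dollars, as described in the post above.
def monthly_cost(watts, rate_per_kwh, hours_per_day=24, days=30):
    kwh = watts / 1000 * hours_per_day * days  # energy used over the month
    return kwh * rate_per_kwh                  # dollars for that energy

print(round(monthly_cost(400, 0.0868), 2))  # ~25.0
```

Swap in your own wattage and the per-kWh rate printed on your electric bill.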
 

0roo0roo

No Lifer
Sep 21, 2002
64,795
84
91
depends... and the power difference between an LCD and a CRT is rather large too. then there's PSU efficiency...bleh. you can buy a device called a Kill A Watt for measurement.
 

vegetation

Diamond Member
Feb 21, 2001
4,270
2
0
Let's assume with your P4-2.6 that you have one hard drive, a reasonably efficient power supply, a normal video card, and the system idles most of the day/night but doesn't shut down since you've got some intermittent app that needs to run (ftp/web server or whatever). That's about 50-80 watts of average power consumption. Figure out your local price per kilowatt-hour and you can determine what you'll actually be paying.

The monitor would probably not consume as much overall power as you'd think, since it's unlikely to be running 24 hours -- maybe just a few hours per day, and the rest of the time it's in standby mode (usually 2-3 watts). It's the devices running 24 hours a day that really ring up your electric bill.
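A quick duty-cycle sketch of that point: a monitor that is only active a few hours a day averages far less than its on-power. The 90 W active / 2.5 W standby figures below are assumptions for illustration, not numbers from this thread.

```python
# Average power of a device that spends most of the day in standby.
def avg_watts(active_w, standby_w, active_hours_per_day):
    standby_hours = 24 - active_hours_per_day
    return (active_w * active_hours_per_day + standby_w * standby_hours) / 24

# A hypothetical 90 W CRT used 4 hours/day, 2.5 W in standby otherwise:
print(round(avg_watts(90, 2.5, 4), 1))  # ~17.1 W averaged over the day
```

So the always-on box, not the monitor, dominates the bill in this scenario.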
 

drifter106

Golden Member
Mar 14, 2004
1,261
57
91
a watt is a watt...i figure if i have 4 lights on in the house and they are running at 100 watts each that would be similar to a psu rated at 400 watts (while in use)

of course I have no quantitative data to support this hypothesis sitting in front of me right now
 

biostud

Lifer
Feb 27, 2003
19,678
6,764
136
Depends on whether it's crunching data all the time or just sitting there idle. Also depends on the video card, number of hard drives, etc.
But for an office PC I would think 120-160W, then just multiply by hours of use and your price per kWh.
 

bdmst16

Member
Oct 22, 2004
31
0
0
Found this, might help:

"Just my $0.02 on the leaving it on/off issue.
Using a clamp-on AC ammeter, I have just measured the power consumption on my "typical" PC and have calculated the actual electricity cost of leaving it on. System is homemade, ECS P4S5A/DX+ mobo, Intel Pentium-IV CPU @ 2.2GHz, 1 Gig ram, 120 GB HDD, has 2 case fans and CPU fan. Monitor is DELL P990, a 19" CRT manufactured in 1999. (An LCD monitor should consume far less)

Computer at "idle" (no applications running, CPU usage near 0%): 0.7 AMPS

Computer at 100% CPU: 1.23 AMPS

Monitor: 0.75 AMPS "ON", goes to near zero in "STANDBY"

My local cost for electricity is just under $0.10 per kilowatt hour, probably lower than many areas. So, in my case, an "idle" computer with monitor on uses 1.45 AMPS.
1.45 AMPS times 120 Volts = 174 Watts, or .174 Kilowatts.

.174KW times $0.10 = $0.0174 per hour

$0.0174 times 24 hours x 30 days = $12.53 per month.

I have five members in my household, each with their own computer, so the costs can really add up. Just leaving all systems and monitors on 24x7 would cost approx. $62 per month. If each computer can be shut down for eight hours per day, that should save one-third, or about $21 per month."
 

Jeff7

Lifer
Jan 4, 2001
41,596
19
81
Originally posted by: drifter106
a watt is a watt...i figure if i have 4 lights on in the house and they are running at 100 watts each that would be similar to a psu rated at 400 watts (while in use)

of course I have no quantitative data to support this hypothesis sitting in front of me right now


Assuming that the PSU is drawing 400W, then that would be right. But just because it's rated 400W doesn't mean it'll draw that, as it won't necessarily be outputting its rated wattage. Conversely, if it is in fact putting out 400W, it'll actually be drawing more than 400W from the wall, due to conversion inefficiencies.

For the record, my 430W Antec PSU is currently using 215 watts.
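Jeff7's efficiency point in numbers: wall draw is DC output divided by efficiency. The 75% figure below is an assumed efficiency for illustration; real PSU efficiency varies by model and load.

```python
# Wall draw of a PSU delivering a given DC output at a given efficiency.
def wall_draw(dc_output_w, efficiency):
    return dc_output_w / efficiency

# A hypothetical PSU delivering its full rated 400 W at 75% efficiency:
print(round(wall_draw(400, 0.75)))  # 533 W pulled from the wall
```

So a "400W" label is a capacity, not a consumption figure: actual wall draw depends on the load the system presents and the supply's efficiency at that load.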
 

drifter106

Golden Member
Mar 14, 2004
1,261
57
91
thank you jeff....i will probably be more efficient at work today now that I know that....although there will be times when I will not be working at maximum efficiency, as is the case right now, because i am sitting on my ass typing this reply when i should be getting ready for work!!!

have a great day!!!!

jd
 

chocoruacal

Golden Member
Nov 12, 2002
1,197
0
0
Originally posted by: tfinch2
WTF? It should be calculated with a monitor. I don't know many people who use a HOME PC without a monitor. Maybe a file server, but their main PC with no monitor...okay...

Earth to tfinch: I would guess that plenty of people leave their PCs on 24/7 but set the monitors to turn off after a set time.

I leave all of my comps on 24/7. Back in my apt. days, I can't say that I ever noticed any significant increase in the E bill between having no comps running, one comp running, or several comps running. You could probably make up the difference by firing up the microwave 1 or 2 times less per month :D
 

Hikari

Senior member
Jan 8, 2002
530
0
0
They don't seem to do much to my bill. I have energy saving turned on for everything at home, so I guess they are sleeping 16 hours a day or more anyway.