
How much electricity does your computer use?

Jzeidenb

Member
I'm living on my own for the first time, awaiting my first energy bill. I have had the habit of always leaving my computer on. Is this a bad idea now that Mommy and Daddy aren't paying the electric bill anymore? I don't know how much energy computers use when they're idle. Does anyone know?
 
At idle, it's negligible; you're using far more energy by leaving the lights on or taking a hot shower instead of a cold one. Even at load, you're likely using more energy by leaving the lights on (with the possible exception of an all-fluorescent house, since fluorescents use a lot less energy than incandescents).
 
Example:

400 watts at load (rather extreme), left on 24/7

400 watts × 24 hours × 30 days in a month = 288,000 Wh ÷ 1000 = 288 kWh per month. Say you're charged 9¢/kWh; that's $25.92 a month.

http://www.eia.doe.gov/cneaf/e...ty/epm/table5_6_a.html

Obviously an extreme example; most rigs don't draw 400 watts, and most of them don't run at full load 24/7 (unless you run benchmarks year round?)...but you get the general idea: none of that 100 dollars a month crap.
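The arithmetic above can be sketched as a tiny helper. This is just an illustration of the same formula (the function name and parameters are my own, not from the post), using the 400 W / 9¢ per kWh figures quoted:

```python
# Rough monthly electricity cost for a device at a flat per-kWh rate.
def monthly_cost(watts, hours_per_day, rate_per_kwh, days=30):
    kwh = watts * hours_per_day * days / 1000  # watt-hours -> kilowatt-hours
    return kwh * rate_per_kwh

# The extreme case from the example: 400 W, 24/7, 9 cents/kWh -> about $25.92.
cost_24_7 = monthly_cost(400, 24, 0.09)
# Same rig run only 12 hours a day -> about half that.
cost_half = monthly_cost(400, 12, 0.09)
```

Plugging in your own measured wattage and local rate gives a quick sanity check on any bill estimate.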
 
Based on the APC Power Chute software, my computer uses ~200W at idle.

If I left it on all the time, it would be like leaving two 100 W light bulbs on for 24 hours. And no one would do that, right?

Do yourself and your wallet a favor and shut it off when you don't use it.

Or use the hibernation feature, of course.
 
As measured from the outlet with a PowerAngel, 2.0 amps @ bootup, 2.25 amps (270 watts) @ idle in windows, and 2.82 amps (342 watts) loaded & defragging RAID 5 array.
 
Originally posted by: gwai lo
Example:

400 watts at load (rather extreme), left on 24/7

400 watts × 24 hours × 30 days in a month = 288,000 Wh ÷ 1000 = 288 kWh per month. Say you're charged 9¢/kWh; that's $25.92 a month.

http://www.eia.doe.gov/cneaf/e...ty/epm/table5_6_a.html

Obviously an extreme example; most rigs don't draw 400 watts, and most of them don't run at full load 24/7 (unless you run benchmarks year round?)...but you get the general idea: none of that 100 dollars a month crap.


Even with this extreme example, if you run it for 12 hours, or say 8 hours (which is still a lot for home users), you're talking $13 or $9. Not too many home users are running a server 24/7.
 
If someone needs the computer to be On all the time because it's needed for business, then it should stay On, since it's part of generating the income.

Most young people keep their computer On all the time because they think it is Cool.

Well, cool costs money too. :brokenheart:

So the best thing to do is to try it for one month, and the Electrical Utility will compute the coolness for you. 😛
 
We have our computers on 24/7, but we also participate in Distributed Computing. We don't leave the machines on for any reason other than crunching, I don't think.
 
Originally posted by: Fullmetal Chocobo
As measured from the outlet with a PowerAngel, 2.0 amps @ bootup, 2.25 amps (270 watts) @ idle in windows, and 2.82 amps (342 watts) loaded & defragging RAID 5 array.

To clarify: You can't just multiply current by voltage in an AC system in order to get real power (in Watts), which is what domestic users are charged for. If you have active PFC (and you would have to, in order to produce results like those), and are not measuring the monitor, then you can just multiply and it's a very close approximation.
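To illustrate the real-vs-apparent-power point: watt-hour meters bill real power, while volts times amps gives apparent power (VA), and the two only coincide when the power factor is 1. A minimal sketch, assuming 120 V mains (the voltage and power-factor values are illustrative, not from the post):

```python
# Real power (watts) in an AC circuit = volts * amps * power factor.
# Apparent power (VA) is just volts * amps, ignoring the power factor.
def real_power(volts, amps, power_factor):
    return volts * amps * power_factor

apparent = 120 * 2.25                  # 270 VA, matching the idle reading above
with_pfc = real_power(120, 2.25, 0.99)  # active PFC: power factor near 1
without_pfc = real_power(120, 2.25, 0.65)  # non-PFC supply: far fewer real watts
```

This is why readings like "2.25 amps (270 watts)" only make sense for a supply with active PFC, as the post notes.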

Originally posted by: JackMDS
Most young people keep their computer On all the time because they think it is Cool.

No, they do it because it is really very convenient. Having a system ready-to-go anytime you want to look something up or play some music or whatever is handy. Have you never actually had an always-on system?

I really doubt people are inviting their friends over to prove their coolness by running a computer all the time.

@OP: power use depends on a lot of factors. Even just mentioning the processor and video card you are using would make it a lot easier to make an estimate.
 
Originally posted by: JackMDS
If someone needs the computer to be On all the time because it's needed for business, then it should stay On, since it's part of generating the income.

Most young people keep their computer On all the time because they think it is Cool.

Well, cool costs money too. :brokenheart:

So the best thing to do is to try it for one month, and the Electrical Utility will compute the coolness for you. 😛

Sarcasm, eh?
 
Average is around 120 watts idle and 190 watts under load. I always turn off the PC if it's not in use for more than 30 minutes. I have a simple house rule: if the PC is idling for more than half an hour, then the user loses the PC for one week.
 
I turn mine on in the morning at first use and off in the evening when I'm done with it. That works for me and saves a bit of money.

.bh.
 
Set hard drives to spin down and the system to drop into suspend mode. 100+ watts, 24 hours a day, 365 days a year, is not nothing, especially when it's done for no reason. Coming out of system suspend or hibernation takes seconds.

Check Hot Deals for deals on a Kill A Watt power meter. I've found laptops use very little using mine: ~25 watts average, 1 watt or less in suspend mode.

Even old PCs like an Athlon 2400+ use 110+ watts idle 😛

One sad thing is the video card: the newer and more powerful it is, the more of an energy hog it will apparently be.
http://www.tomshardware.com/20...ent_computing_options/
Check out the video card section. A nice mid/high-end video card will more than wipe out any energy gains you make by having an energy-efficient CPU.
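To put numbers on the suspend advice above, here is a sketch of the annual difference between idling and suspending, using the ~110 W idle and ~1 W suspend figures from this post. The 16 hours/day of non-use and the 9¢/kWh rate (from the earlier example) are assumptions:

```python
# Annual electricity cost for a device, at a flat per-kWh rate.
def annual_cost(watts, hours_per_day, rate_per_kwh=0.09):
    return watts * hours_per_day * 365 / 1000 * rate_per_kwh

idle_cost = annual_cost(110, 16)    # old Athlon left idling 16 h/day: ~$58/yr
suspend_cost = annual_cost(1, 16)   # same hours in suspend: well under $1/yr
savings = idle_cost - suspend_cost
```

Even for a modest machine, suspending instead of idling recovers essentially the whole idle cost.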
 