
How much does electricity cost to run a computer?

KnickNut3

Platinum Member
My family is constantly yelling at me for leaving the computer on for two minutes when I'm not using it, saying I'm wasting electricity. I then see that many people, especially with XP, leave their machines on for days and weeks at a time.

I'm just trying to get a sense of how much it costs to leave a computer running. Let's say about 350W power supply and XP, and a 17" to 19" monitor (seems about right for most of us).

How much would it cost per hour...

When in use?
While idling with monitor on?
While idling with monitor in "power save" (orange light)?
While idling with monitor off?
While downloading lightly with monitor off?
While hibernating/standby?

Thanks for the replies, because I don't have a sense of what it is. These numbers could be anywhere from $0.01/hr to $0.50/hr for all I know. Thanks again.
 
The best way to use a computer is NOT to turn it off and on a lot. The electronics heat and cool over and over, expanding and contracting, which increases the chance of something breaking.

They don't use much. 350W is the maximum the supply could deliver, and it never draws that much; once the machine is up and running it doesn't use much at all. The monitor uses a fair amount, though, so the computer should be set to shut off the video when not in use.

 
Well... I have "Turn off monitor after 20 minutes" checked in power settings, and if I'm going away for more than a half hour I'll turn the power off. If I'm going away for more than two hours, though, I'll shut down. I'm better off leaving it on?
 
Any kind of numbers regarding prices? Sorry, I slept through the physics lesson on watt-hours and can't seem to convert computer use to price 😉. Also, the aforementioned questions regarding the lengths of off-time I've been using... Thanks
 
Think of it like leaving two lights on in your house.

Never turn your computer off, and run a distributed computing program to keep it busy and up to snuff.
 
FoBoT has it about right. Even if you leave your computer on, if you don't run a DC program, things will heat up every time you do something useful and cool back down afterward. I've heard that the thermal cycling from the idle thread/CPU cooling in WinNT-2K-XP has killed some computers that way.

And, if you're going to run DC, it might as well be for the hometown team. 🙂

Edit: Climateprediction.com assumes that a computer uses 50W. Running it 24/7 will then use 50 W × 24 h × 30 days = 36,000 Wh = 36 kWh/month. Find out how much your family pays per kWh, and you'll know about how much it costs.
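For anyone who, like the OP, slept through watt-hours, here's that arithmetic as a quick Python sketch. The 50 W draw and the $0.13/kWh rate are assumed example figures from this thread, not measurements; plug in your own numbers.

```python
# Rough monthly cost of running a PC 24/7, per the 50 W estimate above.
# Both the wattage and the rate are assumed example figures.
watts = 50                # assumed average draw while the machine is running
rate_per_kwh = 0.13       # example utility rate in $/kWh; check your bill

kwh_per_month = watts * 24 * 30 / 1000   # watt-hours -> kilowatt-hours
cost_per_month = kwh_per_month * rate_per_kwh

print(f"{kwh_per_month:.0f} kWh/month, about ${cost_per_month:.2f}/month")
# → 36 kWh/month, about $4.68/month
```

Divide by 720 (24 × 30) if you want the per-hour figure the OP asked for; at these numbers it's well under a penny an hour.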
 
I no longer feel guilty as I use my laptop for everything but gaming.


If I had to guess, I'd say my desktop (19" CRT) uses about $3-5 of electricity a month (actively on for 12 hours a day / So. Cal). And I'm guessing this from my absence from Cali over my 3 months in Hawaii (comparing usage with the computer unplugged).
 
Distributed computing allows a group, such as SETI@Home, to use the power of the millions of computers on the internet to do the work of a supercomputer without the cost. To do this, they need volunteers like you to download their client software, which works on little pieces of the problem (generally called Work Units, or WUs) and sends the results back. They get a supercomputer, and you get the above benefit, as well as statistics (which will rapidly become addictive 🙂).

The teamanandtech.com link seems to be down right now, but if you go over to the Distributed Computing forum, you'll find friendly folks who will be happy to answer your questions (and more articulately than I can).
 
My parents would get onto me for this when I went home for summers/Christmas holidays. I left my computer on normally (no power saving mode) and just made sure to turn off my monitor and all the lights/fans in my room when I wasn't in there. Even when I was, I usually just opened the blinds for light during the day and used a desk lamp at night. They seemed to stop complaining, so try that. 🙂
 
Thanks for the DC info. Is this "work" actually going toward something real (like this "analyzing telescope data"), or is it just a contest for fun? You said it keeps your computer working, which is better for it? Does your computer use more energy doing this DC stuff than it would just idling? (And don't give me answers just so I'll join your team 😉)
Thanks for the continued help. I'm surprised I've never heard of stuff like this.
 
FWIW, I have 2 computers on 24/7, as well as my TV on for most of the day, Tivo, fridge, etc., and unless I'm running my AC, my electric bill runs around $42/month. Our kWh rate here is about average.
 
It doesn't cost that much. Electricity is around $0.13/kWh in most places. Here it's $0.15/kWh, and the electricity bill went up like $10.00 a month when I started running my Athlon XP server, which is under constant load with a 19" monitor on at least 8 hours a day and off when it's not in direct use.
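As a rough sanity check on that ~$10/month bump: at the quoted $0.15/kWh, a box drawing around 90 W around the clock lands right in that range. The 90 W figure below is an assumed illustration, not a measurement of that server.

```python
# Back-of-the-envelope check: constant draw -> monthly cost at $0.15/kWh.
rate = 0.15                           # $/kWh, the rate quoted above
watts = 90                            # assumed average draw (illustrative)
monthly_kwh = watts * 24 * 30 / 1000  # 64.8 kWh over a 30-day month
monthly_cost = monthly_kwh * rate
print(f"${monthly_cost:.2f}/month")   # → $9.72/month
```

Working backward like this from the change in a bill is about as good an estimate as you'll get without putting a meter on the machine itself.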
 