If my computer is using 310 watts, does that mean...

grohl

Platinum Member
Jun 27, 2004
2,849
0
76
If my computer is using 310 watts, does that mean in 10 hours it will have used 3.1 kWh?

Trying to tell my parents how much it is costing to keep computer on 24/7.

If the above is true, then 0.31 x 24 x 30 x 0.134 ≈ $30 per month.

In TX, $0.134 is the per-kWh cost for us.

Is my math correct?
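
For anyone who wants to double-check the arithmetic, here's a minimal Python sketch of the same calculation (the 310 W and $0.134/kWh figures are just the numbers from this post):

```python
# Monthly cost of a constant electrical load.
watts = 310            # the 310 W figure from the question
rate_per_kwh = 0.134   # TX residential rate, $/kWh
hours_per_day = 24
days_per_month = 30

kwh_per_month = watts * hours_per_day * days_per_month / 1000
cost = kwh_per_month * rate_per_kwh
print(f"{kwh_per_month:.1f} kWh/month -> ${cost:.2f}/month")
# prints: 223.2 kWh/month -> $29.91/month, so "about $30" checks out
```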
 
Dec 10, 2005
27,477
11,806
136
Oops... misread that first line.

It looks like your math is correct.

Do you have any reason to leave the computer on 24/7?
 

nsafreak

Diamond Member
Oct 16, 2001
7,093
3
81
Are you basing that off the size of your power supply, or have you hooked up a meter to it while it's running? If you're basing it off the power supply rating, then that's quite likely wrong and a high estimate. That's just the maximum draw for the power supply, and I doubt your PC is drawing the maximum at all times. For example, my bill for last month was about $41, and that covers my main PC (410 watt power supply, plus a CRT & LCD), my server (300 watt power supply), my entertainment center with all the stuff I have hooked up to it, and my kitchen appliances, among other things. You're probably looking at less than $30 a month to keep your PC on; it's probably closer to half that, if not less.
 

archcommus

Diamond Member
Sep 14, 2003
8,115
0
76
The computer's not using anywhere near that much while idle or just doing simple tasks like browsing the web. My friends' apartments have four computers on almost 24/7 and their whole electric bill is maybe $60/month.
 

grohl

Platinum Member
Jun 27, 2004
2,849
0
76
Originally posted by: nsafreak
Are you basing that off the size of your power supply, or have you hooked up a meter to it while it's running? If you're basing it off the power supply rating, then that's quite likely wrong and a high estimate. That's just the maximum draw for the power supply, and I doubt your PC is drawing the maximum at all times. For example, my bill for last month was about $41, and that covers my main PC (410 watt power supply, plus a CRT & LCD), my server (300 watt power supply), my entertainment center with all the stuff I have hooked up to it, and my kitchen appliances, among other things. You're probably looking at less than $30 a month to keep your PC on; it's probably closer to half that, if not less.

Actually yes I just realized I am probably overestimating.

I have a new UPS (thanks, Hot Deals) that displays the amount of power being drawn from it... but I also have my old 21-inch CRT hooked up to it. So 310 watts is the computer and the monitor together, and the CPU is at 100% since it's folding.
 

archcommus

Diamond Member
Sep 14, 2003
8,115
0
76
Originally posted by: grohl
Originally posted by: nsafreak
Are you basing that off the size of your power supply, or have you hooked up a meter to it while it's running? If you're basing it off the power supply rating, then that's quite likely wrong and a high estimate. That's just the maximum draw for the power supply, and I doubt your PC is drawing the maximum at all times. For example, my bill for last month was about $41, and that covers my main PC (410 watt power supply, plus a CRT & LCD), my server (300 watt power supply), my entertainment center with all the stuff I have hooked up to it, and my kitchen appliances, among other things. You're probably looking at less than $30 a month to keep your PC on; it's probably closer to half that, if not less.

Actually yes I just realized I am probably overestimating.

I have a new UPS (thanks, Hot Deals) that displays the amount of power being drawn from it... but I also have my old 21-inch CRT hooked up to it. So 310 watts is the computer and the monitor together, and the CPU is at 100% since it's folding.
Exactly why I don't fold.
 

nsafreak

Diamond Member
Oct 16, 2001
7,093
3
81
Originally posted by: grohl
Originally posted by: nsafreak
Are you basing that off the size of your power supply, or have you hooked up a meter to it while it's running? If you're basing it off the power supply rating, then that's quite likely wrong and a high estimate. That's just the maximum draw for the power supply, and I doubt your PC is drawing the maximum at all times. For example, my bill for last month was about $41, and that covers my main PC (410 watt power supply, plus a CRT & LCD), my server (300 watt power supply), my entertainment center with all the stuff I have hooked up to it, and my kitchen appliances, among other things. You're probably looking at less than $30 a month to keep your PC on; it's probably closer to half that, if not less.

Actually yes I just realized I am probably overestimating.

I have a new UPS (thanks, Hot Deals) that displays the amount of power being drawn from it... but I also have my old 21-inch CRT hooked up to it. So 310 watts is the computer and the monitor together, and the CPU is at 100% since it's folding.

Even with the monitor, it likely won't be $30 a month to keep it running at all times. I'm sure that, like most folks, you shut off the monitor when you aren't using it, or you have the PC set to put the monitor into standby after a certain period of inactivity. And as for the folding, that doesn't max out the power supply; you're drawing more power for the CPU, but not for the other parts of the PC. I used to run a distributed computing project (RC5-72) around the clock, and my power bill hasn't changed from when I was cracking to now.
 
Dec 10, 2005
27,477
11,806
136
Originally posted by: archcommus
Originally posted by: grohl
Originally posted by: nsafreak
Are you basing that off the size of your power supply, or have you hooked up a meter to it while it's running? If you're basing it off the power supply rating, then that's quite likely wrong and a high estimate. That's just the maximum draw for the power supply, and I doubt your PC is drawing the maximum at all times. For example, my bill for last month was about $41, and that covers my main PC (410 watt power supply, plus a CRT & LCD), my server (300 watt power supply), my entertainment center with all the stuff I have hooked up to it, and my kitchen appliances, among other things. You're probably looking at less than $30 a month to keep your PC on; it's probably closer to half that, if not less.

Actually yes I just realized I am probably overestimating.

I have a new UPS (thanks, Hot Deals) that displays the amount of power being drawn from it... but I also have my old 21-inch CRT hooked up to it. So 310 watts is the computer and the monitor together, and the CPU is at 100% since it's folding.
Exactly why I don't fold.

I have the computers at home folding, but my parents only turn the machines on when they want to use them.
 

Minjin

Platinum Member
Jan 18, 2003
2,208
1
81
But your monitor shouldn't be on all day. Pick up a Kill-A-Watt device and actually measure how much power you're using. I can tell you that my main computer (monitor excluded) used barely over 70 watts. That's an A64 3400, two HDDs, a 6800, and a couple of fans. Since I switched to an X1800 XT, I'm up to 90 watts. Even at peak power usage, i.e. playing a game that maxes out the CPU and video card, it still uses slightly less than 150 W. People WAY overestimate how much power computers use and what size power supplies they need.
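
For scale, a steady 90 W around the clock at the Texas rate quoted earlier comes to under $9 a month (a quick sanity check, reusing the thread's $0.134/kWh figure):

```python
# Steady 90 W, 24/7, at $0.134/kWh: 64.8 kWh per month.
print(f"${90 * 24 * 30 / 1000 * 0.134:.2f}/month")   # -> $8.68/month
```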
 

grohl

Platinum Member
Jun 27, 2004
2,849
0
76
Yes, the monitor turns off... so I agree that I'm overestimating.

As far as folding goes, what do you think the difference would be? With a 3500+ CPU, the overall draw would only differ by 20-30 watts between 0% and 100% load (total guess), which makes the likely monthly difference negligible.
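
As a rough check on that guess, a 25 W delta (the midpoint of 20-30 W) at $0.134/kWh is indeed only a couple of dollars a month:

```python
# Extra cost of ~25 W of folding load, running 24/7 for a month.
delta_watts = 25                        # midpoint of the 20-30 W guess above
extra = delta_watts * 24 * 30 / 1000 * 0.134
print(f"${extra:.2f}/month")            # -> $2.41/month
```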
 

spidey07

No Lifer
Aug 4, 2000
65,469
5
76
Your calculations are correct. It costs you 30 bucks a month just for a computer.

Watts x hours / 1000 = kWh. Multiply that by your price per kWh.
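
In code form (a minimal sketch; the helper name and defaults are mine, and the numbers plugged in are just the OP's):

```python
def monthly_cost(watts, cents_per_kwh, hours_per_day=24, days=30):
    """Watts x hours / 1000 = kWh; multiply by the electricity rate."""
    kwh = watts * hours_per_day * days / 1000
    return kwh * cents_per_kwh / 100

# The OP's numbers: 310 W around the clock at 13.4 cents/kWh.
print(f"${monthly_cost(310, 13.4):.2f}")  # -> $29.91
```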
 

ForumMaster

Diamond Member
Feb 24, 2005
7,792
1
0
My computer doesn't use much (I don't have a Kill-A-Watt, but the power supply calculator says I draw no more than 186 W), but I also use a program called CPUIdle. It issues HLT instructions at ring 0 inside the OS, which cools the CPU a lot and also uses a lot less power, since it constantly tells the CPU to halt when idle.
 

nsafreak

Diamond Member
Oct 16, 2001
7,093
3
81
Originally posted by: spidey07
Your calculations are correct. It costs you 30 bucks a month just for a computer.

Watts x hours / 1000 = kWh. Multiply that by your price per kWh.


Wrong. I paid $40 for electricity last month and I have the following in my apartment:

PC with 410 watt power supply & 2 monitors
PC with 300 watt power supply
2 routers & a cable modem
27" CRT TV (now a 42" DLP)
A/V receiver
DVD recorder
HD DVR cable box
Refrigerator
Oven
Stove
Microwave
Etc.

True, the appliances & TV aren't on 24/7, but they are used on a regular basis. As I and other folks have stated in the thread, he is not drawing anywhere near the max of his power supply at all times.
 

BoomerD

No Lifer
Feb 26, 2006
65,620
14,005
146
Another vote for the Kill-A-Watt... it'll tell you what the power consumption is right now, what it has been since you hooked it up, etc., in watts, kWh, amps. Pretty handy devices.
 

Injury

Lifer
Jul 19, 2004
13,066
2
81
Originally posted by: spidey07
Your calculations are correct. It costs you 30 bucks a month just for a computer.

Watts x hours / 1000 = kWh. Multiply that by your price per kWh.



Yeah, but that's only assuming all of the parts of his system are at full load 24/7.


OP: If your parents don't want your computer on 24/7... then turn it off. Don't be a brat. Pay the bill yourself, or do what they want because you live in THEIR house.
 

dartworth

Lifer
Jul 29, 2001
15,200
10
81
http://www.codinghorror.com/blog/archives/000426.html

The first thing you need to know is how much power your computer draws. The best way is to measure the actual power consumption. You'll need a $30 device like the Kill-a-Watt or Seasonic PowerAngel to do this accurately. Once you get one, you'll inevitably go through a phase where you run around your home, measuring the power draw of everything you can plug into a wall socket. For example, I learned this weekend that our 42" plasma television draws between 90 watts (totally black screen) and 270 watts (totally white screen). Based on a little ad-hoc channel surfing with an eye on the Kill-a-Watt's LCD display, the average appears to be around 150 watts for a typical television show or movie.

But I digress. Once you've measured the power draw in watts (or guesstimated the power draw), you'll need to convert that to kilowatt-hours. Here's the kilowatt-hour calculation for my server, which draws ~160 watts:

160 watts * (8,760 hours per year) / 1000 = 1401.6 kilowatt-hours

The other thing you'll need to know is how much you're paying for power in your area. Power here in California is rather expensive and calculated using a byzantine rate structure. According to this recent Mercury News article, the household average for our area is 14.28 cents per kilowatt-hour.

1401.6 kilowatt-hours * 14.28 cents / 100 = $200.15

So leaving my server on is costing me $200 / year, or $16.68 per month. My home theater PC is a bit more frugal at 65 watts. Using the same formulas, that costs me $81 / year or $6.75 per month.

So, how can you reduce the power draw of the PCs you leave on 24/7?

* Configure the hard drives to sleep on inactivity. You can do this via Control Panel, Power, and it's particularly helpful if you have multiple drives in a machine. My server has four hard drives, and they're typically asleep at any given time. That saves a solid 4-5 watts per drive.
* Upgrade to a more efficient power supply. A certain percentage of the input power to your PC is lost as waste during the conversion from wall power to something the PC can use. At typical power loads (~90w), the average power supply efficiency is a disappointing 65%. But the good news is that there's been a lot of recent vendor activity around more efficient power supplies. The Fortron Zen fanless power supply, for example, offers an astonishing 83% efficiency at 90w load! If you upgraded your power supply, you could theoretically drop from 122w @ 65% efficiency to 105w @ 83% efficiency. That's only a savings of $20 per year in this 90w case, but the larger the power usage, the bigger the percentage savings.
* Don't use a high-end video card. I'm not sure this is widely understood now, but after the CPU, the video card is by far the biggest power consumer in a typical PC. It's not uncommon for the typical "mid-range" video card to suck down 20+ watts at idle -- and far more under actual use or gameplay! The worrying number, though, is the idle one. Pay close attention to the video card you use in an "always-on" machine.
* Configure the monitor to sleep on inactivity. This one's kind of a no-brainer, but worth mentioning. A CRT eats about 80 watts, and an LCD of equivalent size less than half that.
* Disconnect peripherals you don't use. Have a server with a CD-ROM you rarely use? Disconnect the power to it. A sound card you don't use? Pull it out. Redundant fans? Disconnect them. That's only a savings of a few watts, but it all adds up.

If you're building a new PC, it's also smart to avoid Intel's Pentium 4 series, as they use substantially more power than their AMD equivalents. Intel's Pentium-M, on the other hand, delivers the best bang for the watt on the market. Although it was originally designed for laptops, it can be retrofitted into desktops.
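
The blog's yearly figures are easy to reproduce (a minimal sketch using its own 160 W / 65 W draws and 14.28 cents/kWh rate):

```python
# Reproduce the blog's yearly-cost arithmetic for an always-on machine.
def yearly_cost(watts, cents_per_kwh):
    kwh_per_year = watts * 8760 / 1000   # 8,760 hours in a year
    return kwh_per_year * cents_per_kwh / 100

print(f"${yearly_cost(160, 14.28):.2f}")   # server -> $200.15/year
print(f"${yearly_cost(65, 14.28):.2f}")    # home theater PC -> $81.31/year
```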
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,389
8,547
126
Goddamn, can't people read? His UPS has a meter in it. That number is high, but only because his monitor is on too. So what we need is (a) the draw without the monitor on, and (b) how much the monitor is on each day.

Now, that won't include the overhead of the UPS itself, which is a few more watts, but it'll be close.
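
To illustrate (a) and (b), here's a sketch with placeholder numbers; the 230 W / 80 W split and the six hours of monitor time are guesses, not measurements from the thread:

```python
# Always-on PC plus a monitor that's only on part of the day.
pc_watts = 230        # guess: draw with the CRT off
crt_watts = 80        # guess: typical 21" CRT draw
monitor_hours = 6     # guess: hours/day the monitor is actually on
rate = 0.134          # $/kWh, from earlier in the thread

kwh_per_month = (pc_watts * 24 + crt_watts * monitor_hours) * 30 / 1000
print(f"${kwh_per_month * rate:.2f}/month")   # -> $24.12/month with these guesses
```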