Is there a way to tell how much wattage your computer is using at any point in time?

AncientPC

Golden Member
Jan 15, 2001
1,369
0
0
Any way to tell? I mean, I'd like to see how much wattage my system needs.

Another question: how much electricity does an average system use, disregarding the monitor? The equivalent of a 20 W light bulb or something like that?
 

AWhackWhiteBoy

Golden Member
Mar 3, 2004
1,807
0
0
You can buy a multimeter at a local RadioShack and measure the current going into the power supply, then multiply that by the line voltage, 120 volts.
 

Varun

Golden Member
Aug 18, 2002
1,161
0
0
Just be sure to get one that has a 10-amp fuse. Your house circuit has a 15-amp breaker, but there is no way your computer will be pulling 15 amps through the power supply.

BTW, unless you have an inductive clamp for that multimeter, you have to put the ammeter in series with the supply, meaning cutting up an old cord to measure this.

It's not really worth the time. A good 300-watt power supply will run an average computer with no problems, so I would guess that with everything running you would be using around 200-250 watts, including hard drives and all fans. A brand-new PC maxed out and running at full load would likely be around 400 watts.
 

oupei

Senior member
Jun 16, 2003
285
0
0
You'll need a multimeter that measures AC current. I'm not sure about the average multimeter, but mine doesn't do this.
 

Dman877

Platinum Member
Jan 15, 2004
2,707
0
0
What Varun said. You'll have to slice open an AC power cord, cut the white wire, and insert an ammeter in series. Most multimeters measure amps; I don't know what oupei is talking about. Anyway, take the amps off the wall, multiply by 115, then multiply by 0.7 or so; most PSUs are about 70% efficient.
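For what it's worth, here's a minimal Python sketch of that arithmetic, assuming the 115 V line and roughly 70% PSU efficiency mentioned above (both just assumptions to adjust for your own setup). As later posts point out, power factor can make the raw amp reading itself misleading on supplies without PFC.

# Rough estimate of the DC-side power, following the amps-off-the-wall
# approach above. Line voltage and PSU efficiency are assumptions.
def estimate_dc_power(wall_amps, line_volts=115.0, psu_efficiency=0.70):
    """Return (wall power drawn, estimated DC power delivered to components)."""
    wall_power = wall_amps * line_volts      # watts pulled from the outlet
    dc_power = wall_power * psu_efficiency   # what the components actually see
    return wall_power, dc_power

# Example: an ammeter in series with the hot wire reads 2.1 A
wall, dc = estimate_dc_power(2.1)
print(f"~{wall:.0f} W from the wall, ~{dc:.0f} W to the components")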
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
The only way to get an accurate power measurement is to use a 'wattmeter'.

Using a multimeter (or ammeter) to measure mains power carries two major problems:
a) it involves attaching wires to mains-voltage cables, with the attendant danger of electric shock, and
b) it gives wildly inaccurate results (for a number of reasons).

Some measurements I've made (base units only - monitor not included):

Celeron 766, integrated intel graphics, single 5400 rpm HDD, 256MB - 40 W (idle at windows desktop), 50 W (full load - Quake III)

Athlon 1000 (socket), Geforce 2, single 7200 rpm HDD, 512 MB - 90 W (idle), 120 W (full load)

Athlon 2500+ (overclocked to 2.2 GHz), Radeon 9700, 2x 7200rpm HDD, 1.5 GB, 2x optical, multiple fans - 140 W (idle), 160 W (full load)
 

Zepper

Elite Member
May 1, 2001
18,998
0
0
Yeah, a wattmeter would be the most accurate. Be sure to measure each AC cable that is involved in running your system (computer, monitor, speakers, modem, router, printer), both at idle and with a heavy load including drive activity, etc. Then you will know the range.
.bh.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Zepper
Yeah, a wattmeter would be the most accurate. Be sure to measure each AC cable that is involved in running your system (computer, monitor, speakers, modem, router, printer), both at idle and with a heavy load including drive activity, etc. Then you will know the range.
.bh.

Also keep in mind that power supplies are not 100% efficient; your system is probably only using 60-80% (the cheaper the power supply, the worse this number will be) of the power being pulled from the wall; the rest is dissipated as waste heat from the PSU.
 

oupei

Senior member
Jun 16, 2003
285
0
0
Originally posted by: Dman877
Most multimeters measure amps, I don't know what oupei is talking about.

Mine measures amps too, but only DC current. If yours measures AC current, then you're good to go.
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
Originally posted by: oupei
Originally posted by: Dman877
Most multimeters measure amps, I don't know what oupei is talking about.

Mine measures amps too, but only DC current. If yours measures AC current, then you're good to go.

Unfortunately, you'll probably get a totally useless result.

Even if your multimeter can measure AC current, many are bad at measuring RMS current (which is what counts). Even those that are good at measuring RMS current may be terrible if the AC current waveform is distorted (PC PSUs generally draw a very distorted current waveform).

Then you have the problem of power factor - a distorted current waveform means a poor power factor. This means that the current (amps) is much higher than you would expect for the power (watts). It's common to see a PC PSU draw 3 A at 110 V while actually consuming only 200 W.

If you have a PSU with PFC and a high quality multimeter with RMS current measurement - then you might get an accurate result. Otherwise, I wouldn't bother.
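To put some rough numbers on that, here is a small Python sketch using the 3 A / 110 V / 200 W figures from the post above; the power factor is simply real power divided by apparent power.

# Why a plain ammeter reading overstates power on a PSU without PFC.
# Figures are the example values from the post above.
volts = 110.0          # RMS line voltage
amps = 3.0             # RMS current the meter reads
real_power_w = 200.0   # what a true wattmeter would report

apparent_power_va = volts * amps                  # 330 VA
power_factor = real_power_w / apparent_power_va   # ~0.61

print(f"Apparent power: {apparent_power_va:.0f} VA")
print(f"Power factor:   {power_factor:.2f}")
print(f"Naive V*I estimate overstates real power by "
      f"{(apparent_power_va / real_power_w - 1) * 100:.0f}%")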
 

jackschmittusa

Diamond Member
Apr 16, 2003
5,972
1
0
Gee, my trusty old Simpson 260 analog multimeter measures AC current. Fancy digital ones don't? Found a wattmeter on eBay with a starting bid of US$20.
 

MadAd

Senior member
Oct 1, 2000
429
1
81
I used a plug-in unit (similar to RalfHutter's) to measure my current system below, which includes 4 HDDs on a cheap 550 W PSU.

At full tilt, running 3DMark at the same time as copying a 1 GB file from partition to partition, it came up at 220 W. At idle, displaying just the XP desktop, it was a little over 100 W.
 

sisooktom

Senior member
Apr 9, 2004
262
0
76
Originally posted by: Mark R
Originally posted by: oupei
Originally posted by: Dman877
Most multimeters measure amps, I don't know what oupei is talking about.

mine measure amps too, but only DC current. if yours measures AC current, then you're good to go.

Unfortunately, you'll probably get a totally useless result.

Even if your multimeter can measure AC current, many are bad at measuring RMS current (which is what counts). Even those that are good at measuring RMS current may be terrible if the AC current waveform is distorted (PC PSUs generally draw a very distorted current waveform).

Then you have the problem of power factor - a distorted current waveform means a poor power factor. This means that the current (amps) is much higher than you would expect for the power (watts). It's common to see a PC PSU draw 3 A at 110 V while actually consuming only 200 W.

If you have a PSU with PFC and a high quality multimeter with RMS current measurement - then you might get an accurate result. Otherwise, I wouldn't bother.

Exactly. This is the real killjoy for using a normal multimeter to measure this. The vast majority of PC power supplies do not have power factor correction, so your results would be skewed. Of course, if you knew the power factor of the supply, you could calculate it correctly, I guess...
 
raybay

May 10, 2004
136
0
0
What possible use will this information have once you figure it out, if indeed you can ever develop rational, accurate numbers? The cost of power per month doesn't vary much from one computer to another. Only monitors have wide differences. As long as power is regulated by the power supply, that is where the answer lies. But the answer to what? I don't know.
 

Zepper

Elite Member
May 1, 2001
18,998
0
0
Yes, I hadn't seen that one before either. So I used Froogle and found the Kill A Watt meter for under $30... Here.

.bh.
 

AncientPC

Golden Member
Jan 15, 2001
1,369
0
0
Originally posted by: raybay
What possible use will this information have once you figure it out, if indeed you can ever develop rational, accurate numbers? The cost of power per month doesn't vary much from one computer to another. Only monitors have wide differences. As long as power is regulated by the power supply, that is where the answer lies. But the answer to what? I don't know.

So how much power does a computer use a month (on 24/7)?

I may end up buying one of those Kill-A-Watt meters for my mom; she's paranoid about us wasting electricity by leaving the computer on.
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
Originally posted by: AncientPC

So how much power does a computer use a month (on 24/7)?

I may end up buying one of those Kill-A-Watt meters for my mom; she's paranoid about us wasting electricity by leaving the computer on.

Computers vary greatly in their power consumption.

Base unit consumption may range from 30 W (mobile-CPU-based small-form-factor PC) to 200 W (P4 3.6 HT, high-performance gaming graphics, RAID), or even higher for dual-CPU workstations/servers. Approximate costs (assuming an energy cost of $0.08/kWh) work out to between $1.80 and about $12/month. Power consumption may be reduced further by energy-saving technologies like AMD's 'Cool'n'Quiet' or by allowing the PC to go into standby when not in use.

Power consumption by the monitor can vary similarly - potentially as low as 30 W for a smallish LCD, or as high as 150 W for a big CRT. If power saving is enabled, it is very difficult to predict power consumption - but if the monitor is on for about 8 hours per day, then costs can be estimated at between $0.50 and $3/month.

Other important factors - ATX PSUs and motherboards use a smallish amount of power even when ostensibly off. This 'standby' power can be quite significant - between 15 and 25W for a typical PC. This means that a lightly-used low-end family PC (used about 4 hours/day) actually costs more to keep plugged in while switched off, than it costs while in use.

Other peripherals, e.g. cordless mouse, USB hubs, printers, external drives, modems, scanners, routers/network, speakers all contribute to standby consumption. I analysed the power consumption back at home - in our study, the PC peripherals (modem, LAN hub, USB hub, scanner, printer, speakers, digicam charger, cordless phone, mouse charger, external HDD) accounted for over 10% of the electricity bill for the entire family. (That figure does not include power used by the actual PC!).

The solution to the above problem is to use a power strip; turn on the periphs only when the PC is in use.

Another consideration is dealing with heat from the electronics - all electrical power used by electronics is converted into heat. If the room is air conditioned, then the more electricity the equipment uses, the harder the AC has to work - you may have to pay 1/3 or 1/2 as much again to keep the AC going. If the room is heated, then a reduction can be made to the energy used to heat the room; however, electricity is very expensive compared to heating oil or natural gas, so you can only offset 1/3 to 1/2 the cost.
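For what it's worth, here's a back-of-the-envelope Python sketch of the cost arithmetic above, using the $0.08/kWh rate assumed in this post; the rate, hours per day, and wattage are all assumptions to swap for your own numbers.

# Monthly cost of a constant load: kWh = watts * hours / 1000, cost = kWh * rate.
def monthly_cost(avg_watts, hours_per_day=24.0, rate_per_kwh=0.08, days=30):
    kwh = avg_watts * hours_per_day * days / 1000.0
    return kwh * rate_per_kwh

for watts in (30, 90, 200):
    print(f"{watts:>3} W running 24/7: ${monthly_cost(watts):.2f}/month")
# 30 W -> ~$1.73, 90 W -> ~$5.18, 200 W -> ~$11.52, roughly the
# $1.80-$12/month range quoted above.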
 

kcbaltz

Member
Apr 10, 2000
98
0
0
I'll chime in with support for the Kill-A-Watt. Not only will it tell you wattage used, it will measure it over time so you can average over a week if you want. Besides computers, it can measure appliances and tell you if your old refrigerator is worth replacing (my piece of junk is using about 3-4x the watts of new models).
 

White Widow

Senior member
Jan 27, 2000
773
0
71
Lab tests done by Dell show that an [average] PC running Microsoft Office uses 42.7 watts, and an [average] monitor uses 75.0 W - flat-panel monitors use less energy (22 watts when left on, 3.3 watts in "sleep" mode). If a typical system runs continuously at that rate for 365 days, at 7 cents per kilowatt-hour, the power consumption costs would be $26.18 for the PC and $45.99 for a regular monitor, for a total of $72.17 for the system. Putting a system into "Hibernate/Sleep" powers down your monitor to about 5 watts (3.3 W for a flat panel) and your PC to 2.3 watts. Screen savers do almost nothing to reduce energy consumption.

So, turn off the monitor, throw your mom a $20, and tell her to relax.
 

vegetation

Diamond Member
Feb 21, 2001
4,270
2
0
Got my Kill A Watt meter today and it's a fun toy for finally getting scientific results on my power usage. Here's some sample data so far:

Dell Pentium Classic OptiPlex desktop, 166 MHz, with two 7200 RPM IDE drives, one 10/100 network card, integrated graphics: 37 watts average no load, 49 watts max.

*Dell Inspiron 8000 laptop with mobile P3-900, no battery, 4200 RPM drive, some cruddy 32 MB ATI mobile 3D card, one external mouse: 23 watts average no load, 45 watts max.

Shuttle cube, Celeron 2 GHz Northwood, single 7200 RPM drive, integrated graphics, ASUS TV tuner card, single 512 MB RAM stick: 52 watts average no load, 90 watts max!

Generic Compusa Cable/DSL Wireless router: 7 watts constant

3Com Audrey with LCD on: 8 watts constant

*Note that the laptop had its 15" LCD on during the test! The other systems did not have any display device operating. Edit: with the LCD powered off, only 16 watts (hard drive still active).

The Celeron system's wattage fluctuated wildly under partial loads, so if it's running an active server, figure the average at around the mid-60s.

Others who have a meter, please post your results.