Is there any way to measure the amount of electricity various appliances are using?

rnmcd

Platinum Member
May 2, 2000
2,507
0
0
Is there any way to measure the amount of electricity various appliances are using? I have an old refrigerator (and some other things) and I would like to know how much power they use. Is there a gauge or other instrument that will measure their power usage?

thanks.
 

silverpig

Lifer
Jul 29, 2001
27,703
12
81
Yes. Touch some of the frayed wires near the motor. If you die, the appliance uses a lethal amount of electricity. If you don't, then it doesn't.

:)
 

bsobel

Moderator Emeritus / Elite Member
Dec 9, 2001
13,346
0
0
Sorry to post an actually useful reply; I know that's breaking the spirit of the thread set by the last two posters. I think what you want is one of these: WattsUp

Bill
 

rnmcd

Platinum Member
May 2, 2000
2,507
0
0
Thank you very much, Bill. That is what I need to determine the power consumption of some of my appliances that I think are real energy hogs...
 

FlashG

Platinum Member
Dec 23, 1999
2,709
2
0
Step 1. Take a reading from your power meter.
Step 2. Turn on a high-powered electric fan.
Step 3. Try to put your finger between the blades while it is running.
Step 4. Call E911
Step 5. Put bloody shirt in the washer and turn it on before they arrive.
Step 6. 24 hours later, take another reading from the meter and compare the two.
Step 7. Voila, you know how much power the washer uses.
Step 8. Repeat as needed.
 

GigaCluster

Golden Member
Aug 12, 2001
1,762
0
0
Another idea is to buy a multimeter (useful in many more situations than WattsUp), then measure the voltage and amperage (separately). Measuring voltage isn't strictly necessary since it's somewhere around 110-120, but for accurate results, do measure it.
Then simply multiply the two numbers together to get wattage.

Volts * Amps = Watts


edit: Just took a good look at that Watts Up page, and I strongly advise against buying it. First, it costs $100, while a good multimeter costs somewhere around $40. Second, that rip-off is only designed for 120V outlets, so you cannot determine the wattage of any 220V devices. In short, get a multimeter.
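The multimeter method described above boils down to one multiplication. A minimal sketch, with made-up readings (note that later posts explain this product is really *apparent* power, which can overstate what you are billed for):

```python
def apparent_power(volts_rms, amps_rms):
    """Apparent power in volt-amperes (VA): RMS voltage times RMS current."""
    return volts_rms * amps_rms

# Example: a hypothetical 117 V outlet and a 2.5 A current reading.
print(apparent_power(117.0, 2.5))  # -> 292.5 (VA)
```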
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
Another idea is to buy a multimeter (useful in many more situations than WattsUp), then measure the voltage and amperage (separately).

This method is very inaccurate, especially for electronic devices (e.g. PCs, TVs) and for devices with motors in them (e.g. washing machines). If used on a PC, it can overestimate power consumption by as much as 50%. Additionally, for devices which are on intermittently, e.g. computer monitors and fridges, you get no good measure of duty cycle.

The only accurate way is to use a true energy meter - e.g. something like the 'WattsUp', or the energy meter kindly provided by your electricity supply company. (This is how I do it, but it is inconvenient, as most of the other appliances have to be switched off during the measurement.)
 

bsobel

Moderator Emeritus / Elite Member
Dec 9, 2001
13,346
0
0
Just took a good look at that Watts Up page, and I strongly suggest against buying it. First, it costs $100 while a good multimeter costs somewhere around $40. Second, that rip-off is only designed for 120V outlets, so you cannot determine the wattage of any 220V devices. In short, get a multimeter.

First, that price is often much lower (I just searched for any page that had it to link). Second, the technique you suggest is not appropriate (and in fact is quite dangerous) for many people. Third, the unit is quite automatic: enter your cost per kWh and let it do the calcs overnight or over a week (etc.). Not easy to do that kind of long-term measurement with a meter.

I have one, and I do recommend them.

Bill
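The cost calculation a unit like this automates is simple enough to sketch; the wattage and rate below are made-up example numbers, not figures from the thread:

```python
def energy_cost(avg_watts, hours, dollars_per_kwh):
    """Energy cost: average draw over a period times the utility rate."""
    kwh = avg_watts * hours / 1000.0
    return kwh * dollars_per_kwh

# Example: an appliance averaging 150 W measured over 84 hours at $0.12/kWh.
print(round(energy_cost(150, 84, 0.12), 2))  # -> 1.51 (dollars)
```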


 

Draknor

Senior member
Dec 31, 2001
419
0
0
Whoah! This is just what I've been looking for!

I've actually been thinking about building a power meter as a project, but hey - this little unit looks nice & convenient!

I've always wondered how much power I'm sucking out of the outlet to feed my electronic toys; with something like this I'd be able to know!

And yeah, it's not exactly trivial to measure current while running an appliance - after all, the meter has to be in series, which means you either a) rewire the outlet to put the meter in series, or b) do something funky with the power cord so you get the meter in series. Either way is dangerous (less dangerous if you know what you are doing, but still dangerous).

My thought was to have a little box, UPS-sized or so, with a power strip, a master on/off switch, surge protection, and gauges or readouts showing voltage, current, and total power simultaneously. I think that kind of stuff would be interesting to know. But then again, I'm an EE... :)
 

GigaCluster

Golden Member
Aug 12, 2001
1,762
0
0
Originally posted by: Mark R
Another idea is to buy a multimeter (useful in many more situations than WattsUp), then measure the voltage and amperage (separately).

This method is very inaccurate, especially for electronic devices (e.g. PCs, TVs) and for devices with motors in them (e.g. washing machines). If used on a PC, it can overestimate power consumption by as much as 50%. Additionally, for devices which are on intermittently, e.g. computer monitors and fridges, you get no good measure of duty cycle.

The only accurate way is to use a true energy meter - e.g. something like the 'WattsUp' or the energy meter kindly provided by your electricity supply company (this is how I do it, but it is inconvenient as most of the other appliances have to be switched off during the measurement)

Now I am curious: how is this method inaccurate? I can agree that it may be less safe than a specialized device, but I thought that it was a direct formula -- watts=volts*amps. How would measuring both variables be inaccurate by as much as 50%?
 

xirtam

Diamond Member
Aug 25, 2001
4,693
0
0
The only reason it's inaccurate is that certain devices consume different amounts of power at different times. Through the course of a duty cycle, a device might draw different current levels. But you're right... P=IV.
 

silverpig

Lifer
Jul 29, 2001
27,703
12
81
Basically, he's saying that if you measure the power used by your washer during the rinse cycle, you may think it's very efficient as you will get a much lower power usage than if you took your reading during the spin cycle.

Think of the power usage of an appliance as a curve. It goes up at some times and down at others. The full cycle of use is one duty cycle (one full load of wash from beginning to end, if you like). If you use the multimeter method, you are just taking a single value of the curve and assuming it's constant throughout the entire cycle (a Riemann sum with only one point). Taking the cumulative power usage over an entire cycle is like taking the integral of the curve, giving an accurate reading of its average power usage.
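The single-sample vs. whole-cycle point is easy to see with numbers. A sketch with a made-up washer power curve (one reading per equal time interval):

```python
# Hypothetical power draw over one washer duty cycle, in watts:
# fill, wash, rinse, rinse, rinse, spin, spin, drain.
washer_cycle_watts = [500, 500, 200, 200, 200, 1200, 1200, 500]

single_sample = washer_cycle_watts[2]                         # multimeter reading during rinse
true_average = sum(washer_cycle_watts) / len(washer_cycle_watts)  # "integral" over the cycle

print(single_sample)  # -> 200
print(true_average)   # -> 562.5, nearly 3x the one-point estimate
```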
 

OS

Lifer
Oct 11, 1999
15,581
1
76
Originally posted by: GigaCluster
Originally posted by: Mark R
Another idea is to buy a multimeter (useful in many more situations than WattsUp), then measure the voltage and amperage (separately).

This method is very inaccurate, especially for electronic devices (e.g. PCs, TVs) and for devices with motors in them (e.g. washing machines). If used on a PC, it can overestimate power consumption by as much as 50%. Additionally, for devices which are on intermittently, e.g. computer monitors and fridges, you get no good measure of duty cycle.

The only accurate way is to use a true energy meter - e.g. something like the 'WattsUp' or the energy meter kindly provided by your electricity supply company (this is how I do it, but it is inconvenient as most of the other appliances have to be switched off during the measurement)

Now I am curious: how is this method inaccurate? I can agree that it may be less safe than a specialized device, but I thought that it was a direct formula -- watts=volts*amps. How would measuring both variables be inaccurate by as much as 50%?


That equation is true only for purely resistive loads. If there is an inductive or a capacitive load in the device (and most real devices do have a reactive component), the current and voltage do not have the same phase.

A complete explanation is much more complicated; I'm an EE major, and there are entire classes devoted to this material.
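The phase-shift effect described above can be sketched numerically. For a sinusoidal AC load, real power is V·I·cos(φ) rather than V·I; the 120 V, 2 A, and 45° figures here are illustrative assumptions, not values from the thread:

```python
import math

v_rms, i_rms = 120.0, 2.0        # hypothetical RMS voltage and current readings
phi = math.radians(45)           # assumed 45-degree phase shift (inductive load)

apparent = v_rms * i_rms         # what separate voltage/current readings imply
real = apparent * math.cos(phi)  # the power actually dissipated (and billed)

print(apparent)                  # -> 240.0 (VA)
print(round(real, 1))            # -> 169.7 (W), ~29% lower than V*I
```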




 

OS

Lifer
Oct 11, 1999
15,581
1
76

As for the original post, a specialized meter is the way to go. We EE students use wattmeters in our power engineering labs. I've heard that if you contact your electric company, they will loan an appropriate meter to you.
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
The inaccuracy I was talking about is the problem of 'power factor', the differing phases of the voltage and current waveforms, and the generation of high frequency current waveform harmonics. It's a complicated subject and is beyond the scope of discussion here - any good EE text should be able to give references.

However, here is a practical example:

I have a 20 W compact fluorescent light bulb. It runs on the mains supply of 240 V. Using the mains electricity meter, I measured the power consumption as 20 +/- 1 W. I then used a high-precision DMM to measure the voltage and current - the voltage was 241 V RMS, and the operating current 136 mA RMS. In both cases, the lamp had been running for 5 minutes and had reached operating temperature.

The real average power (as measured by the electricity meter) is very close to the nominal power of the lamp, suggesting that this is an accurate measurement.
The apparent power (voltage x current) is substantially higher (32.7 VA) - you are not usually charged for the apparent power, so it is of little relevance to most people.

However, it is important to size wires, fuses and circuit breakers based on the apparent power: e.g. a 5 A lighting circuit is often termed a 1200 W circuit - yet you would not be able to install 60 of these CFLs on one, because the current would exceed the 5 A limit.
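Mark R's CFL figures can be checked directly from the numbers he posted (241 V, 136 mA, 20 W real):

```python
real_power = 20.0              # W, from the electricity meter
apparent = 241 * 0.136         # V * A, what the multimeter method would report
power_factor = real_power / apparent

print(round(apparent, 1))      # -> 32.8 (VA); the post rounds this to 32.7
print(round(power_factor, 2))  # -> 0.61, typical for an uncorrected CFL

# Why 60 such CFLs overload a 5 A lighting circuit despite only 1200 W nominal:
print(round(60 * 0.136, 2))    # -> 8.16 (A), well over the 5 A limit
```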
 

rnmcd

Platinum Member
May 2, 2000
2,507
0
0
Does anyone have experience and can attest to the accuracy/inaccuracy of the Watt Meter linked two messages ago?

Thanks !
 

schmedy

Senior member
Dec 31, 1999
998
0
76
Depending on where you are, I can let you borrow anything you need. I have access to, or own, DVMs, wattmeters, peak and average power meters, spectrum analyzers, S-parameter test sets... etc. I am in VT, btw.
 

rnmcd

Platinum Member
May 2, 2000
2,507
0
0
schmedy - What would you recommend that I use? I have some old appliances, and I want to determine whether they are worth keeping (or whether I should sell them if they draw too much electricity).

How could I measure the number of watts a central air conditioner uses over a three- or four-day period?

Thank you.

 

schmedy

Senior member
Dec 31, 1999
998
0
76
Your safest bet, to avoid killing yourself, would be a DVM with a clip-on amp probe, so it reads the current through induction instead of you actually punching through the insulation or working in the breaker box.

As for duty cycle: test the AC while it is just blowing, and again while it is putting out cold air. That gives you two different readings, so you can get a rough average. To get a true average you would need to watch for the peak as it turns on (that's when it draws the most current), but since you just want a rough idea, this should work. You can use this method for any of the things you want to test, really.

As someone said earlier, you may be able to call the power company, tell them you have some older appliances, and ask to have them tested for efficiency; they may do it for you. Also, I know here in VT they will give you an incentive to switch out some old appliances, even water heaters and furnaces, but you need to call the local power company for that.

I can explain more about how to do this if you want, but I don't agree with the intrusive method of testing for someone not used to dealing with electricity - I'd hate to see you get hurt.
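The rough-average method schmedy describes can be sketched as follows. All figures (line voltage, the two clamp-meter current readings, and the guessed fraction of time the compressor runs) are hypothetical placeholders:

```python
line_volts = 240.0        # assumed line voltage for a central AC circuit
fan_only_amps = 3.0       # clamp-meter reading: blower only
cooling_amps = 15.0       # clamp-meter reading: blower + compressor
cooling_fraction = 0.6    # guess: compressor runs 60% of the time

# Duty-cycle-weighted average power over the measurement period:
avg_watts = line_volts * (cooling_fraction * cooling_amps
                          + (1 - cooling_fraction) * fan_only_amps)
print(round(avg_watts))   # -> 2448 (W), a rough average draw
```

Note this still multiplies volts by amps, so per the earlier posts it overstates real power for a motor load; for a rough keep-or-sell decision, though, it gives a usable ballpark.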