Power Use with LED lights

JonB

Platinum Member
Oct 10, 1999
2,126
13
81
www.granburychristmaslights.com
LED lighting is getting a lot of press lately. Since LEDs are DC devices, when powered from 120 VAC mains they must be rectified and current-limited to work.

Some LEDs only use a single diode instead of a rectifier bridge, and therefore only use half of the AC waveform. What I'm curious about is: how does the standard utility company power meter measure the power usage? If you have a substantial number of LEDs that only use the "positive" side of the waveform, would your meter only register half the kilowatt usage? If you had LEDs using full-wave DC, would they cost twice as much to operate as half-wave lights?

Understand, I know they would technically use half the power and produce half the light, but I'm interested in the utility billing and how the power meter responds.
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
Domestic meters measure only true energy usage: they multiply the instantaneous voltage by the instantaneous current to calculate the instantaneous power, then take the integral of that with respect to time to get the total energy usage in kilowatt-hours (kWh).

This means that if you converted a full-wave light to half wave, it would produce half the light, use half the energy, and be billed half as much.
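That integration is easy to sketch numerically. Below is a minimal illustration (my own, not from a real meter), assuming a purely resistive 1000 W load on 100 V RMS AC to match the figures used later in this thread:

```python
import math

# A domestic kWh meter effectively computes energy = integral of v(t)*i(t) dt.
# Illustrative assumptions: 100 V RMS, 60 Hz, resistive 10 ohm load (1000 W).
F = 60            # line frequency, Hz
VRMS = 100.0      # RMS line voltage
R = 10.0          # load resistance -> 100 V / 10 ohm = 1000 W average
DT = 1e-6         # integration step, s
T = 1.0 / F       # one full cycle

energy_j = 0.0
for n in range(int(T / DT)):
    t = n * DT
    v = VRMS * math.sqrt(2) * math.sin(2 * math.pi * F * t)
    i = v / R                  # resistive load: current tracks voltage exactly
    energy_j += v * i * DT     # instantaneous power * dt

avg_power_w = energy_j / T
print(round(avg_power_w))      # -> 1000, i.e. 1 kWh per hour of operation
```

Halve the energy drawn (e.g. by going half-wave) and this integral, and hence the bill, halves too.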

-----

Industrial electricity meters are often more complex - they take into account not just the actual amount of energy used, but also how hard the grid has to work to deliver it. They measure two things: the actual power used, via the calculation above, and the 'apparent power'.

The apparent power is calculated by measuring the RMS voltage and the RMS current, and multiplying them. This 'apparent power' reflects how hard the grid has to work to deliver your electricity. The difference between apparent power and true power is called 'reactive power' (a bit of a misnomer, as reactive power transfers no energy at all - it is a reflection of how difficult your load is to supply). The meter then integrates the reactive power as kilovolt-amp-hours reactive (kVArh). When industrial users get their bill, they are billed for the energy used (kWh) and also for any reactive load (kVArh).
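To make the two measurements concrete, here is a sketch using an assumed inductive load whose current lags the voltage by 60 degrees (the classic phase-shift case; the numbers are illustrative only):

```python
import math

# Assumed illustrative load: 100 V RMS, 10 A RMS, current lagging by 60 deg.
F = 60
VRMS, IRMS = 100.0, 10.0
PHI = math.radians(60)        # phase lag of current behind voltage
DT = 1e-6
T = 1.0 / F

e_true = 0.0
sum_v2 = sum_i2 = 0.0
n_steps = int(T / DT)
for n in range(n_steps):
    t = n * DT
    v = VRMS * math.sqrt(2) * math.sin(2 * math.pi * F * t)
    i = IRMS * math.sqrt(2) * math.sin(2 * math.pi * F * t - PHI)
    e_true += v * i * DT      # true power: integral of v*i
    sum_v2 += v * v           # for RMS voltage
    sum_i2 += i * i           # for RMS current

true_power = e_true / T                                                # W
apparent = math.sqrt(sum_v2 / n_steps) * math.sqrt(sum_i2 / n_steps)   # VA
reactive = math.sqrt(apparent**2 - true_power**2)  # var, conventional definition

print(round(true_power), round(apparent), round(reactive))  # -> 500 1000 866
```

Note the conventional definition of reactive power is the quadrature difference sqrt(S^2 - P^2); the worked example below uses the simple arithmetic difference between apparent and true power, which is what gives its 0.2 kVA figure.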

Example:

Let's assume we have a full-wave circuit that draws 10 A at 100 V AC, consuming 1000 W. The meter records 1 kWh for each hour of operation - and because the voltage, current and power waveforms all match, it records 0 kVArh. If you run the load for 1000 hours a year, you receive a bill for 1000 kWh and 0 kVArh.

However, if we change that same circuit to half-wave, we cut the power consumption to 500 W. The voltage stays at 100 V AC, and the meter records 0.5 kWh for each hour of operation. However, because the half-wave current is not the ideal waveform for an AC system, the RMS current is actually about 7 A, not 5 A (the RMS of a half-wave rectified sine is the peak current divided by 2, rather than by the square root of 2) - in other words, if you hooked up an AC ammeter, it would read 7 A. The electricity meter sees the same thing and records a discrepancy of about 0.2 kVArh for each hour of operation (roughly 700 VA apparent minus 500 W true). After 1000 hours of operation you receive a bill for 500 kWh and 200 kVArh.
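Those half-wave figures are easy to verify numerically. A sketch, assuming the same 100 V RMS source and a resistive load behind an ideal diode:

```python
import math

# Check the half-wave example: a load that draws 10 A RMS full-wave,
# fed through an ideal diode so current flows only on positive half-cycles.
F = 60
VRMS = 100.0
R = 10.0                          # gives 10 A RMS when full-wave
DT = 1e-6
T = 1.0 / F

energy = 0.0
sum_i2 = 0.0
n_steps = int(T / DT)
for n in range(n_steps):
    t = n * DT
    v = VRMS * math.sqrt(2) * math.sin(2 * math.pi * F * t)
    i = v / R if v > 0 else 0.0   # ideal diode blocks the negative half
    energy += v * i * DT
    sum_i2 += i * i

power = energy / T                        # true power, W
irms = math.sqrt(sum_i2 / n_steps)        # RMS line current, A
apparent = VRMS * irms                    # apparent power, VA
print(round(power), round(irms, 2), round(apparent - power))  # -> 500 7.07 207
```

So the meter sees 500 W true but about 707 VA apparent: a roughly 0.2 kVA discrepancy, matching the billing example above.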

The kVArh are cheaper than the kWh - you have to burn coal/gas/uranium to generate the kWh, while the kVArh just reflect how much load the grid has been put under.

This is the basis of 'power factor': if the current waveform doesn't match the voltage waveform in an AC system, the RMS current x RMS voltage is higher than the actual number of watts. Electronic devices (e.g. PC PSUs) have a highly distorted current waveform - this means they load down the grid more than they need to for the energy they use. For industrial users, e.g. large offices, using PCs with PFC PSUs can reduce electricity bills significantly - not because the PSUs are more efficient, but because they avoid clocking up huge numbers of kVArh on an industrial meter.
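The "distorted waveform" case can be sketched too. A capacitor-input supply draws current only in short pulses near the voltage peaks; the pulse shape below is an assumption for illustration, not a measured PSU waveform:

```python
import math

# Illustrative distortion power factor: current flows as a flat pulse
# only while the voltage is near its peak (|sin| > 0.98), in phase with
# the voltage, so there is no phase shift at all - only distortion.
F = 60
VRMS = 100.0
DT = 1e-6
T = 1.0 / F
IPULSE = 10.0                     # assumed pulse amplitude, A

e = 0.0
sum_i2 = 0.0
n_steps = int(T / DT)
for n in range(n_steps):
    t = n * DT
    s = math.sin(2 * math.pi * F * t)
    v = VRMS * math.sqrt(2) * s
    i = IPULSE * math.copysign(1.0, s) if abs(s) > 0.98 else 0.0
    e += v * i * DT
    sum_i2 += i * i

p = e / T                                 # true power
irms = math.sqrt(sum_i2 / n_steps)        # RMS of the peaky current
pf = p / (VRMS * irms)                    # power factor = P / (Vrms * Irms)
print(round(pf, 2))                       # roughly 0.5
```

Even with zero phase shift, the peaky current inflates the RMS value and drags the power factor well below 1 - which is exactly what a PFC front-end is there to fix.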
 

smack Down

Diamond Member
Sep 10, 2005
4,507
0
0
Doesn't a half-wave rectifier provide the same amount of power (same voltage and current) as a full-wave one? The only difference is the amount of noise and the size of the capacitor used.
 

Mark R

Originally posted by: smack Down
Doesn't a half-wave rectifier provide the same amount of power (same voltage and current) as a full-wave one? The only difference is the amount of noise and the size of the capacitor used.

If you use a capacitor, yes. If you don't need a capacitor, then no.

If you do use a capacitor, then the half-wave rectifier will have an even worse power factor than the full-wave rectifier (which is pretty bad anyway).
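That comparison can be simulated crudely. The sketch below models a rectifier feeding a reservoir capacitor, run once as half-wave and once as full-wave; the component values (C, load, diode series resistance) are assumptions chosen only to make the ordering visible:

```python
import math

def rectifier_pf(full_wave, f=60.0, vrms=100.0, c=2000e-6,
                 r_load=50.0, r_series=1.0, dt=1e-6, cycles=20):
    """Power factor of a simple (half- or full-wave) capacitor-input rectifier."""
    vp = vrms * math.sqrt(2)
    vc = 0.0                       # reservoir capacitor voltage
    e = s_i2 = s_v2 = 0.0
    n_total = int(cycles / f / dt)
    n_settle = n_total // 2        # skip the start-up transient
    n_meas = 0
    for n in range(n_total):
        t = n * dt
        vs = vp * math.sin(2 * math.pi * f * t)
        drive = abs(vs) if full_wave else max(vs, 0.0)
        i_in = max(drive - vc, 0.0) / r_series  # diode(s) conduct when source > cap
        vc += (i_in - vc / r_load) * dt / c     # cap charged by diode, drained by load
        if n >= n_settle:
            e += drive * i_in * dt              # energy drawn from the line
            s_i2 += i_in * i_in
            s_v2 += vs * vs
            n_meas += 1
    p = e / (n_meas * dt)
    return p / (math.sqrt(s_v2 / n_meas) * math.sqrt(s_i2 / n_meas))

pf_full = rectifier_pf(True)
pf_half = rectifier_pf(False)
print(round(pf_full, 2), round(pf_half, 2))   # half-wave PF comes out lower
```

The half-wave version recharges the capacitor in one larger pulse per cycle (and its line current has a DC component), so its RMS current, and hence its power factor penalty, is worse than the full-wave bridge for the same delivered power.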
 

BrownTown

Diamond Member
Dec 1, 2005
5,314
1
0
Why don't they just use two LED strings hooked in parallel, biased in opposite directions? That way you would have light on all the time and be using the whole waveform. It's still going to be a capacitive load, but at a household level you don't have to worry about that, and at a business level you would probably WANT that. Keep in mind that the VAST majority of loads are inductive, so adding capacitive loads to the system will likely INCREASE the power factor of a normal building/factory.
 

Mark R

Originally posted by: BrownTown
Why don't they just use two LED strings hooked in parallel, biased in opposite directions? That way you would have light on all the time and be using the whole waveform. It's still going to be a capacitive load, but at a household level you don't have to worry about that, and at a business level you would probably WANT that. Keep in mind that the VAST majority of loads are inductive, so adding capacitive loads to the system will likely INCREASE the power factor of a normal building/factory.

I'm sure that they would use anti-parallel LED strings. Added to that, they'd probably use a capacitor to limit the current, rather than a resistor (much more efficient that way).
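The capacitive dropper sizing is simple back-of-envelope math. The numbers below are my own illustrative assumptions (120 V 60 Hz mains, a 20 mA target string current, LED drop small enough to neglect against the line voltage):

```python
import math

# Capacitive dropper: the capacitor's reactance Xc = 1/(2*pi*f*C) limits
# the current, so C = I / (2*pi*f*V) for a target RMS current I.
F = 60.0          # Hz (assumed mains frequency)
VRMS = 120.0      # line voltage (assumed)
I_TARGET = 0.020  # desired RMS LED current, A (assumed)

c_farads = I_TARGET / (2 * math.pi * F * VRMS)
print(round(c_farads * 1e6, 2), "uF")   # -> 0.44 uF
```

Unlike a series resistor, the capacitor's voltage and current are 90 degrees apart, so an ideal dropper dissipates no power itself - which is why it's the more efficient current limiter.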

The problem with non-linear devices like LEDs (just like a bridge rectifier feeding a reservoir capacitor) is that they have a terrible power factor due to their non-linear nature. Because they generate harmonics, rather than causing a phase shift, their power factor cannot be cancelled out by inductive or capacitive loads, and the harmonic currents sum, rather than cancel, in the neutral wire, requiring the neutrals in three-phase systems to be oversized. This 'harmonic power factor' is becoming a major problem: modern fluorescent lights are non-linear, not inductive; PCs and other electronics are non-linear, not capacitive; 'inverter' HVAC systems and variable-speed motor drives are non-linear too.
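The "harmonics add in the neutral" point is easy to demonstrate numerically. A sketch with three phase currents 120 degrees apart, each carrying an assumed 30% third harmonic:

```python
import math

# Three balanced phase currents, each with a 30% third harmonic (assumed).
# The fundamentals cancel in the neutral; the third harmonics of the three
# phases are all in phase with each other, so they add.
F = 60.0
N = 10000         # samples over one cycle
H3 = 0.3          # third-harmonic amplitude relative to the fundamental

def phase_current(t, k):
    th = 2 * math.pi * F * t - k * 2 * math.pi / 3   # phase k shifted 120 deg
    return math.sin(th) + H3 * math.sin(3 * th)

sum_n2 = 0.0
T = 1.0 / F
for n in range(N):
    t = n * T / N
    i_neutral = sum(phase_current(t, k) for k in range(3))
    sum_n2 += i_neutral ** 2

neutral_rms = math.sqrt(sum_n2 / N)
print(round(neutral_rms, 3))   # -> 0.636, i.e. 3 * 0.3 / sqrt(2)
```

With linear balanced loads the neutral current would be zero; with non-linear loads the triplen harmonics stack up three-fold, which is why those neutrals need oversizing.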
 

Ipno

Golden Member
Apr 30, 2001
1,047
0
0
LED lighting on AC annoys me because of the flicker. It's almost like a strobe light.