LED Lights

bangleon

Junior Member
Jul 7, 2010
1
0
0
Hi, I need help with some LED lights I'm trying to install. I'm building an HO gauge gas station and thought I would add some lights to it. I'm using white 5mm LEDs from Radio Shack connected to a CR2032 battery and a 125VAC rocker on/off switch. It works fine when I have everything connected, and I can run the lights for about 20 minutes. But the next day when I try the lights, the batteries are completely dead. What am I missing?


EDIT:
Moved to Highly Technical.
AT Mod
Gillbot
 

PottedMeat

Lifer
Apr 17, 2002
12,363
475
126
The load from a white LED with no resistor in series connected directly to a CR2032 battery probably starts at ~15-20mA at full charge and decreases as the battery is discharged.

http://data.energizer.com/PDFs/cr2032.pdf

At that rate you're going to kill the battery fairly quickly. Either stick a resistor (maybe in the 220-1000 ohm range, until you find an acceptable brightness) in series to limit the current, or use 2 AA batteries instead (these have far higher capacity than little button cells).
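As a rough sketch of that resistor-sizing rule of thumb (the supply and LED values below are assumed for illustration, not taken from the thread):

```python
# Series-resistor sizing: R = (V_supply - V_forward) / I_target.
# Illustrative values only; check your LED's datasheet.

def series_resistor(v_supply, v_forward, i_target_a):
    """Return the series resistance (ohms) for a desired LED current."""
    headroom = v_supply - v_forward
    if headroom <= 0:
        raise ValueError("supply must exceed the LED forward voltage")
    return headroom / i_target_a

# Example: a 4.5 V pack (3 x AA, assumed) driving a white LED (Vf ~ 3.2 V) at 10 mA.
print(f"{series_resistor(4.5, 3.2, 0.010):.0f} ohms")  # -> 130 ohms
```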
 

bobdole369

Diamond Member
Dec 15, 2004
4,504
2
0
The short answer is that you've killed the battery. A CR2032 only has about that much energy in it.

Use a bigger cell or a resistor to limit current.
 

imagoon

Diamond Member
Feb 19, 2003
5,199
0
0
It would likely be worth it to add a 5VDC or 12VDC supply line to your HO scale system. 1 amp of 12V power can run a ton of LEDs and is easily available. Just make sure you use resistors.
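A back-of-the-envelope sketch of that budget, assuming strings of three white LEDs (~3.2 V each, an assumed figure) with one resistor per string:

```python
# How many LED strings a 12 V / 1 A accessory bus could feed (illustrative numbers).
V_SUPPLY = 12.0       # bus voltage
I_BUDGET_A = 1.0      # supply current budget
VF = 3.2              # assumed white-LED forward voltage
I_STRING_A = 0.015    # target current per string (15 mA)

leds_per_string = int(V_SUPPLY // VF)          # 3 LEDs in series per string
headroom = V_SUPPLY - leds_per_string * VF     # voltage left across the resistor
r_series = headroom / I_STRING_A               # ohms per string
n_strings = int(I_BUDGET_A // I_STRING_A)      # strings the budget allows

print(f"{leds_per_string} LEDs/string with a {r_series:.0f} ohm resistor; "
      f"up to {n_strings} strings ({n_strings * leds_per_string} LEDs)")
```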
 

MrDudeMan

Lifer
Jan 15, 2001
15,069
94
91
You always need a resistor or a current source for an LED.

In this thread you will find lots of useful information...

http://forums.anandtech.com/showthread.php?t=2038316&highlight=leds&page=2

You don't always need a resistor. It completely depends on the voltage source. You just locate the desired current on the diode curve, which can usually be found for most LEDs, and provide the corresponding voltage. You only need a resistor when you need to burn off excess power because you can't pick your voltage. I know it doesn't apply to this particular scenario since he is using off the shelf components, but blanket statements can confuse people.

OP: I can't tell from your original description whether you have more components connected to your battery, but I'm going to assume the answer is yes. Based on the white LEDs I found at Radio Shack, they are probably expecting 3.3V and will draw 25mA at that voltage. Since your battery is supplying 3.3V, the LED will be drawing 25mA from it continuously, but a CR2032 usually has only 225mAh of capacity. That means you can draw 225mA for 1 hour, 22.5mA for 10 hours, 2.25mA for 100 hours, etc. before the energy in the battery is completely exhausted. In your case, the LED by itself would drain the battery after just 7-10 hours (realistically even less, since the LED's I-V curve is so non-linear in this region that it could be drawing more current). If you have other devices attached to the battery, it could die very quickly, which is what you seem to be experiencing.
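The capacity arithmetic above, as a quick sketch (the 225 mAh rating assumes low drain; at 25 mA a coin cell delivers noticeably less):

```python
# Runtime estimate: capacity (mAh) / draw (mA) = hours, as described above.
CAPACITY_MAH = 225   # nominal CR2032 rating at low drain
DRAW_MA = 25         # assumed LED draw

print(f"best case: ~{CAPACITY_MAH / DRAW_MA:.0f} hours")  # ~9 hours, less in practice
```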
 

Raghu

Senior member
Aug 28, 2004
397
1
81
^^ Paper engineer? Good luck locating the voltage on the steep I-V curve of an LED and then building/finding a stable source at that precise voltage. And when there is a 3% variation in the voltage curve from one LED to the next, you will suddenly find yourself pushing twice the intended current, or wondering why some don't light up at all.
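To put numbers on how steep that curve is, here is an idealized Shockley-diode sketch; the parameters are invented for illustration and are not from any datasheet:

```python
import math

# Idealized diode law: I = Is * exp(V / (n * Vt)). Parameters invented so the
# LED draws ~20 mA at 3.0 V; real LEDs also have some series resistance.
IS = 3e-19       # saturation current (A)
N = 3.0          # ideality factor
VT = 0.02585     # thermal voltage at ~300 K (V)

def led_current(v):
    return IS * math.exp(v / (N * VT))

for dv in (0.00, 0.05, 0.10):
    print(f"V = {3.0 + dv:.2f} V -> I = {led_current(3.0 + dv) * 1000:.0f} mA")
# Current roughly doubles for every extra ~55 mV, so a ~3% shift in Vf
# (about 90 mV here) more than triples the current.
```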
 

MrDudeMan

Lifer
Jan 15, 2001
15,069
94
91
^^ Paper engineer? Good luck locating the voltage on the steep I-V curve of an LED and then building/finding a stable source at that precise voltage. And when there is a 3% variation in the voltage curve from one LED to the next, you will suddenly find yourself pushing twice the intended current, or wondering why some don't light up at all.

That's quite a rude remark considering what I said is correct.

As far as being a paper engineer, I work not only with variation across diodes, but with thousands of FETs on output drivers, where everything is statistical across several process corners and must still meet strict specifications.

Also, I've designed several embedded systems using white LEDs running at 3.25V without a resistor. Some have been running 24/7 for several years with no failures. My comment was in response to an incorrect statement. Just because it isn't normally done without a resistor doesn't mean it can't be.

As an example:
My last product had no flexibility in terms of power supply voltage. I had 5V DC and that was it. There were two critical concerns: heat and power. The device had 10 LEDs with a Vf/If of 3.5V/20mA. I chose to operate them at 3.25V, which put the current draw at 16mA. The minimum required brightness was 2000mcd, which these LEDs could produce at 14.2mA. I DC/DC converted 5V to 3.25V and supervised it with a microcontroller and a 1.2V reference. The previous design simply used the 5V supply and a resistor to drop the LED operating voltage to 3.5V, which worked but consumed a lot more power. Here is some paper engineering for you:

On the 5V supply and using a resistor, the current draw was 20mA. The total power consumption was 10 LEDs * 5V * 20mA = 1W.

My design used a 95% efficient regulator and lowered the operating point to 3.25V which reduced the current draw to 16mA. The total power consumption was (10 LEDs * 3.25V * 16mA) / 95% = 547mW. A 5% positive variation between LEDs would not exceed the current or power ratings and a 5% negative variation would still meet the minimum required brightness.

Just as an academic point, I'll normalize the results. If the original design had limited the current to 16mA, the total power consumption would have been 10 LEDs * 5V * 16mA = 800mW. The resistors waste a non-trivial amount of power in either case. The application for this device is very sensitive to power requirements, so saving almost 500mW at the expense of slightly more area and cost was well worth it to the customer. In this case, as I tried to make clear in my original statement, a resistor was not necessary and was actually a disadvantage.
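The three power figures from that post, reproduced as a sketch:

```python
# Power comparison quoted above: 10 LEDs on a resistor-dropped 5 V rail vs. a
# 95%-efficient regulator at 3.25 V, plus the normalized 16 mA resistor case.
N_LEDS = 10

p_resistor_20ma = N_LEDS * 5.00 * 0.020          # original resistor design
p_regulated     = N_LEDS * 3.25 * 0.016 / 0.95   # regulator design
p_resistor_16ma = N_LEDS * 5.00 * 0.016          # normalized comparison

print(f"resistor @ 20 mA: {p_resistor_20ma * 1000:.0f} mW")  # 1000 mW
print(f"regulator @ 16 mA: {p_regulated * 1000:.0f} mW")     # ~547 mW
print(f"resistor @ 16 mA: {p_resistor_16ma * 1000:.0f} mW")  # 800 mW
```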
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
Meh. It's true that a resistor (in the region of 10-20 ohms is about right for a white LED and a 3 V battery) is recommended to ensure the LED operates in its safe current range.

It's a good thing then that a CR2032 has an internal resistance of 15 Ohms.
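As a sketch of why that works out (the 15 ohms is from the line above; the ~2.9 V forward drop at low current is an assumption):

```python
# The coin cell's internal resistance acts as the series resistor.
V_CELL = 3.0    # nominal open-circuit voltage
R_INT = 15.0    # internal resistance (ohms), per the post
VF = 2.9        # assumed forward drop at a few mA

i_ma = (V_CELL - VF) / R_INT * 1000
print(f"~{i_ma:.1f} mA")  # a gentle few mA, yet still only ~1.5 days to empty a 225 mAh cell
```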

Back to the question. If I understand rightly, the LED works for 20 minutes, until switched off. But when switched back on again, the battery is dead. Is that right? If so, you need to check carefully whether you have connected the switch correctly. It needs to be connected in series with the battery and LED.
 

imagoon

Diamond Member
Feb 19, 2003
5,199
0
0
It also depends on the LED (some are internally current-limited). Some are basically electrical shorts, so if you have a 3.25V source that could supply 50 amps, the LED may try to sink the 50 amps (and burn out). LEDs run OK from lithium cells because of the battery's internal resistance. Lithium cells cannot deliver their charge very fast at all; they tend to excel at low-current loads. A stack of 4 lithium cells giving about 13.2 VDC attached to LEDs will behave differently than a 13.2 volt lead-acid battery.
 

MrDudeMan

Lifer
Jan 15, 2001
15,069
94
91
It also depends on the LED (some are internally current-limited). Some are basically electrical shorts, so if you have a 3.25V source that could supply 50 amps, the LED may try to sink the 50 amps (and burn out). LEDs run OK from lithium cells because of the battery's internal resistance. Lithium cells cannot deliver their charge very fast at all; they tend to excel at low-current loads. A stack of 4 lithium cells giving about 13.2 VDC attached to LEDs will behave differently than a 13.2 volt lead-acid battery.

This is exactly why I even made a comment in the first place. Saying you always need a resistor is not true. It depends on a lot of things. I specifically picked the LEDs for my application because they had desirable characteristics in terms of current vs. brightness. I just don't like blanket statements. For a hobbyist, sure, a resistor is recommended, but that distinction needs to be made; otherwise people will start saying "a resistor is always necessary" even if they don't really understand why.
 

Cogman

Lifer
Sep 19, 2000
10,286
145
106
Question: How hard would it be to add a fast "blinker" to the LED?

I ask this because one teacher pointed out that a common energy saving trick for lights that humans are intended to see is to "blink" the light. (He was playing with a laser pointer during a student presentation; I noticed and asked what he was doing... We were talking about op amps. :D) If you put a 120Hz pulse on the light, wouldn't that significantly reduce the power requirement while still looking just as bright?
 

imagoon

Diamond Member
Feb 19, 2003
5,199
0
0
Not very. A 555 timer, a couple of resistors, and a cap can do it pretty easily, assuming the IC doesn't draw more power than you save.
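As a sketch, the standard 555 astable equations with illustrative component values (not a tested design):

```python
# Standard 555 astable: f = 1.44 / ((R1 + 2*R2) * C),
# duty = (R1 + R2) / (R1 + 2*R2). Values chosen to land near 120 Hz.
R1 = 1_000    # ohms
R2 = 5_600    # ohms
C = 1e-6      # farads

f = 1.44 / ((R1 + 2 * R2) * C)
duty = (R1 + R2) / (R1 + 2 * R2)
print(f"f ~ {f:.0f} Hz, duty ~ {duty * 100:.0f}% high time")  # ~118 Hz, ~54%
```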
 

Cogman

Lifer
Sep 19, 2000
10,286
145
106
Not very. A 555 timer, a couple of resistors, and a cap can do it pretty easily, assuming the IC doesn't draw more power than you save.

:) True. He (the professor) was playing with a fairly bright green laser pointer. I imagine those probably suck up quite a bit of juice.
 

MrDudeMan

Lifer
Jan 15, 2001
15,069
94
91
Question: How hard would it be to add a fast "blinker" to the LED?

I ask this because one teacher pointed out that a common energy saving trick for lights that humans are intended to see is to "blink" the light. (He was playing with a laser pointer during a student presentation; I noticed and asked what he was doing... We were talking about op amps. :D) If you put a 120Hz pulse on the light, wouldn't that significantly reduce the power requirement while still looking just as bright?

It won't be just as bright. Your eye will have trouble seeing subtle differences in brightness, but it will in fact be dimmer if you reduce the frequency.
 

Paperdoc

Platinum Member
Aug 17, 2006
2,499
374
126
Question: How hard would it be to add a fast "blinker" to the LED?

I ask this because one teacher pointed out that a common energy saving trick for lights that humans are intended to see is to "blink" the light. (He was playing with a laser pointer during a student presentation; I noticed and asked what he was doing... We were talking about op amps. :D) If you put a 120Hz pulse on the light, wouldn't that significantly reduce the power requirement while still looking just as bright?

The way a "blinker" works is that it rapidly switches the load on and off. When On, the LED load consumes power and generates light output normally. When Off it consumes no power and generates no light. Suppose, for example, you use a common "square wave" so that the LED's are "On" 50% of the time. Compared to "always On", this consumes only half the power, and generates only half the light output. The trick is that our eyes are so adaptable that we hardly notice the lower light output. So it seems at first glance that we have cut power consumption in half at no cost whatsoever. Not so!

By the way, the reason for doing this at, say, 120 Hz is that the human eye can't see changes in light intensity any faster than about 25 to 30 Hz. That is the basis for movies (24 fps) and TV (30 full frames per second, done as 60 half-frames per second). We are so heavily dependent on what we see that a flicker we don't see does not "exist"!

OP's original problem, as others have said, is the choice of a very small battery whose total stored energy is consumed by the LEDs in a short time.
 

Cogman

Lifer
Sep 19, 2000
10,286
145
106
The way a "blinker" works is that it rapidly switches the load on and off. When On, the LED load consumes power and generates light output normally. When Off it consumes no power and generates no light. Suppose, for example, you use a common "square wave" so that the LED's are "On" 50% of the time. Compared to "always On", this consumes only half the power, and generates only half the light output. The trick is that our eyes are so adaptable that we hardly notice the lower light output. So it seems at first glance that we have cut power consumption in half at no cost whatsoever. Not so!

By the way, the reason for doing this at, say, 120 Hz is that the human eye can't see changes in light intensity any faster than about 25 to 30 Hz. That is the basis for movies (24 fps) and TV (30 full frames per second, done as 60 half-frames per second). We are so heavily dependent on what we see that a flicker we don't see does not "exist"!

OP's original problem, as others have said, is the choice of a very small battery whose total stored energy is consumed by the LEDs in a short time.
The reason for using 120Hz is that lower frequencies such as 60Hz are somewhat perceptible to the human eye. Some people get headaches from the 60Hz pulse of CFLs and fluorescent lights.

I know the principle of how a pulsing IC works. What I was wondering is whether there would be significant energy savings when an LED already uses so little power. Unless the OP is using the LEDs for some sort of hardware communication (not likely), pulsing them seems like it would make sense.

BTW, for the interested, I found that some low-power IC timers consume about 60uW. Seems worth it to me if it can cut a 20mW power draw in half.
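Checking that trade-off with the numbers from the post:

```python
# Steady drive vs. 50% duty plus the timer's own overhead (figures from the post).
P_LED_MW = 20.0      # LED power when driven steadily
P_TIMER_MW = 0.060   # low-power timer IC, ~60 uW

p_pulsed = P_LED_MW * 0.5 + P_TIMER_MW
print(f"steady: {P_LED_MW:.1f} mW, pulsed: {p_pulsed:.2f} mW")
# The timer overhead is ~0.6% of the ~10 mW saved.
```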
 

Cogman

Lifer
Sep 19, 2000
10,286
145
106
It won't be just as bright. Your eye will have trouble seeing subtle differences in brightness, but it will in fact be dimmer if you reduce the frequency.

Not true. Generally, a very high frequency results in a dimmer light; that is because the bulb can't respond fast enough. If the pulse is slower than the light's response time, yet fast enough to fool the human eye, the light will be the same intensity as if there were no pulse.
 
May 11, 2008
22,566
1,472
126
You don't always need a resistor. It completely depends on the voltage source. You just locate the desired current on the diode curve, which can usually be found for most LEDs, and provide the corresponding voltage. You only need a resistor when you need to burn off excess power because you can't pick your voltage. I know it doesn't apply to this particular scenario since he is using off the shelf components, but blanket statements can confuse people.

An LED must be driven from a current source, period.
If you do not comply with this, you simply draw more current than you really need to. Efficiency goes down, and past the optimum current the LED actually emits less light, because the extra current heats it up and the temperature rise reduces its output. The current the LED draws also depends on temperature, because the required forward voltage decreases as the temperature rises.


[Figure 1: typical LED forward voltage vs. forward current curve, from the linked article]


http://www.ledsmagazine.com/features/4/8/1

Driving LED light sources
LEDs are semiconductors with light-emitting junctions designed to use low-voltage, constant current DC power to produce light. LEDs have polarity and, therefore, current only flows in one direction. Driving LEDs is relatively simple and, unlike fluorescent or discharge lamps, they do not require an ignition voltage to start. Too little current and voltage will result in little or no light, and too much current and voltage can damage the light-emitting junction of the LED diode.

A typical LED forward voltage vs. forward current profile is given in Figure 1. It can be seen that, for a given temperature, a small change in forward voltage produces a disproportionately large change in forward current. In addition, the forward voltage required to achieve a desired light output can vary with LED die size, LED die material, LED die lot variations, and temperature.


As LEDs heat up, the forward voltage drops (Figure 2) and the current passing through the LED increases. The increased current generates additional heating of the junction. If nothing limits the current, the junction will fail due to the heat. This phenomenon is referred to as thermal runaway.

By driving LED light sources with a regulated constant-current power supply the light output variation and lifetime issues resulting from voltage variation and voltage changes can be eliminated. Therefore, constant current drivers are generally recommended for powering LED light sources.
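A toy numerical model of the thermal-runaway loop the article describes; every coefficient below is invented purely to show the feedback direction, not to match any real part:

```python
# Toy thermal-runaway loop: at a fixed drive voltage, Vf drops as the junction
# heats (~-2 mV/K assumed), so current and heating climb on each pass.
V_DRIVE = 3.30      # fixed voltage source (V)
R_SERIES = 1.0      # only a small parasitic resistance (ohms)
VF_25C = 3.20       # forward voltage at 25 C (V)
TEMPCO = -0.002     # Vf shift per kelvin (V/K)
R_TH = 400.0        # junction temperature rise per watt (K/W)

temp_c = 25.0
for step in range(4):
    vf = VF_25C + TEMPCO * (temp_c - 25.0)
    i = max((V_DRIVE - vf) / R_SERIES, 0.0)
    temp_c = 25.0 + R_TH * vf * i    # junction heating from dissipated power
    print(f"step {step}: Vf = {vf:.2f} V, I = {i*1000:.0f} mA, Tj = {temp_c:.0f} C")
# With nothing limiting the current, each pass permits more current and more heat.
```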
 

Cogman

Lifer
Sep 19, 2000
10,286
145
106
An LED must be driven from a current source, period.
If you do not comply with this, you simply draw more current than you really need to. Efficiency goes down, and past the optimum current the LED actually emits less light, because the extra current heats it up and the temperature rise reduces its output. The current the LED draws also depends on temperature, because the required forward voltage decreases as the temperature rises.


[Figure 1: typical LED forward voltage vs. forward current curve, from the linked article]


http://www.ledsmagazine.com/features/4/8/1
I think he was referring to the internal resistance of a voltage source. Even though the diode will try to draw more current, there is only so much a given voltage source can put out. If that voltage source is a battery, then you could conceivably go without putting in a resistor.
 
May 11, 2008
22,566
1,472
126
I think he was referring to the internal resistance of a voltage source. Even though the diode will try to draw more current, there is only so much a given voltage source can put out. If that voltage source is a battery, then you could conceivably go without putting in a resistor.

From that perspective, I give you and him the benefit of the doubt. ^_^
But in that scenario, efficiency would be almost nonexistent, as the OP has clearly experienced. The series resistance is internal (and probably not constant either), so it is fixed and cannot be chosen for the best possible efficiency the way an external resistor can (and efficiency is already very low for a resistor with an LED). That is why pulse-width-modulated current sources for power LEDs are so popular.
 

MrDudeMan

Lifer
Jan 15, 2001
15,069
94
91
From that perspective, I give you and him the benefit of the doubt.
But in that scenario, efficiency would be almost nonexistent, as the OP has clearly experienced. The series resistance is fixed and cannot be chosen for the best possible efficiency (efficiency is already very low for a resistor with an LED). That is why pulse-width-modulated current sources for power LEDs are so popular.

Seriously, you have no idea what you are talking about. I'm not even going to bother pointing out all of the misinformation in your other post, but it would behoove you to read a book sometime and actually put it into practice so you can learn. It's getting old: the Google search and copy/paste from the first link you run across in every thread you post in.
 
May 11, 2008
22,566
1,472
126
This is exactly why I even made a comment in the first place. Saying you always need a resistor is not true. It depends on a lot of things. I specifically picked the LEDs for my application because they had desirable characteristics in terms of current vs. brightness. I just don't like blanket statements. For a hobbyist, sure, a resistor is recommended, but that distinction needs to be made; otherwise people will start saying "a resistor is always necessary" even if they don't really understand why.

Your blanket statement is based on your lack of understanding of how LEDs operate.
 
May 11, 2008
22,566
1,472
126
Seriously, you have no idea what you are talking about. I'm not even going to bother pointing out all of the misinformation in your other post, but it would behoove you to read a book sometime and actually put it into practice so you can learn. It's getting old: the Google search and copy/paste from the first link you run across in every thread you post in.

Keep on dreaming. I will just rely on physics that is actually proven.