
Making an LED Flashlight, need help

Tanked

Senior member
Ok, I need some help with this.

If I had a "SuperBright" LED, rated at 2.1 V and 20 mA, what kind of resistor would I need to hook it up to a pair of Alkaline or Nickel-Metal Hydride Batteries? (Alkaline AA batteries are 1.5 V each while NiMH batteries are 1.2 V.)

Thanks!
 
You could use a voltage regulator to get the right power to the LED.

Cool idea, if you get it going.... pics? 🙂
 
I believe it goes like this:

(1.5 V batteries)
V = IR
0.9 = 0.02 R
R = 45 ohms

(1.2 V batteries)
V = IR
0.3 = 0.02 R
R = 15 ohms

-----

The usual setup for a standard (~1.7 V) LED is 5 V input with a 1000 ohm resistor, running at roughly 1/10 of the current, so with your setup I'd assume you'd need some fairly heavy-duty resistors (large enough to dissipate the heat). 🙂
 
NaughtyusMaximus, no, it is not just V = IR. However, you get the right general results; you just don't show where they come from.

The LED works as a resistor too. Generally one would put the resistor in series and get a voltage divider.

When you say that the LED is rated at 2.1V, is that the Vf (Vforward, Vthreshold, or something like that) of the LED? If that is the Vf of the LED, it will not turn on (LEDs are a type of diode) unless it is given that much voltage. Generally 1.2 or 1.5V is not enough to turn on an LED. Now if you put two AA batteries in series for 3V then one would use a current limiting resistor.

[Edit: Some of the flashing LEDs have a Vf of 1.2V.]

Let's say you were using a 3V voltage source and you wanted to pick the resistor for an LED with a Vf of 2.1V:

3V - 2.1V = 0.9V

So we want a 0.9V drop across the resistor. If we ran it at maximum current:

0.9V = 0.020A * R
R = 45 Ohms

With two 1.2V batteries in series one would have 2.4V - 2.1V = 0.3V

0.3V = 0.020A * R
R = 15 Ohms

I would not run the LED at its maximum rated current as that will shorten its lifetime. Try using 15mA:

0.9V = 0.015 * R
R = 60 Ohms

or

0.3V = 0.015 * R
R = 20 Ohms
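The calculations above can be bundled into a quick script. This is just a sketch (the `series_resistor` helper name is mine); the supply voltages, forward voltage, and currents are the ones discussed in this thread:

```python
# Sketch: size the series resistor for an LED via Ohm's law,
# R = (V_supply - V_forward) / I_led. Values from this thread:
# two alkaline AAs = 3.0 V, two NiMH cells = 2.4 V, Vf = 2.1 V.

def series_resistor(v_supply, v_forward, i_led):
    """Return the series resistance in ohms for the given drive current."""
    return (v_supply - v_forward) / i_led

for v_supply, label in [(3.0, "2x alkaline"), (2.4, "2x NiMH")]:
    for i_led in (0.020, 0.015):  # 20 mA rated, 15 mA derated
        r = series_resistor(v_supply, 2.1, i_led)
        print(f"{label} at {i_led * 1000:.0f} mA: {r:.0f} ohms")
```

which reproduces the 45, 60, 15, and 20 ohm figures worked out by hand above.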

Now for the wattage of the resistors, W = V * A

W = 0.9 * 0.015 = 0.0135
W = 0.3 * 0.015 = 0.0045

1/4 W resistors will work in either case. Those are the thin ones you find at most electronics stores. The thicker ones are 1/2 W. Obviously 1/8 W (most surface-mount resistors, I believe) and values above 1/2 W exist, but at your local electronics store you will most likely come across 1/4 W and 1/2 W. You can always ask what the wattage of a resistor is before purchasing it.
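As a sanity check on the wattage, here is a short sketch (helper name `resistor_power` is mine) applying P = V * I to the resistor drops and currents from this thread and comparing against a 1/4 W rating:

```python
# Sketch: check that a 1/4 W resistor can handle the dissipation.
# P = V_drop * I, using the 15 mA drive current discussed above.

def resistor_power(v_drop, i_led):
    """Power dissipated in the series resistor, in watts."""
    return v_drop * i_led

for v_drop in (0.9, 0.3):  # drops for 2x alkaline and 2x NiMH supplies
    p = resistor_power(v_drop, 0.015)
    verdict = "OK" if p <= 0.25 else "too hot"
    print(f"{v_drop} V drop: {p:.4f} W -> 1/4 W resistor {verdict}")
```

Both cases come out well under 0.25 W, so a 1/4 W part has plenty of margin.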
 
I just looked at one of my component catalogs (Jameco), and most LEDs are recommended to operate at 20 mA; that is not their maximum. So ignore my comments about running it at 15 mA; run it at the recommended 20 mA (it will be brighter). Your "rated at" threw me off a little, as "rated at" generally means the maximum, while "typical" is what is generally used.
 
Thanks guys! I would post pics, but I don't have a digital camera... 🙁

The 2.1 V that it's "rated at" is what the package says I should run through it, I'm not sure what the maximum is.
 
Originally posted by: Tanked
The 2.1 V that it's "rated at" is what the package says I should run through it, I'm not sure what the maximum is.

Generally a maximum current will be used rather than a maximum voltage. The curve for voltage vs. current of an LED is so sharp around Vf that the threshold may be at 2.09 and the maximum might be at 2.11. The current really is the issue as that will be what heats up the LED as the small difference in voltage does not make a significant difference in wattage.
 