bobsmith1492
Diamond Member
- Feb 21, 2004
- 3,875
- 3
- 81
You're driving an LED? That changes everything. You must not run an LED directly from a wall wart. You need a resistor to set the current through the LED; otherwise you will either fail to turn it on or burn it out.
LEDs run at a particular forward voltage, say 1.5V for yours. You can't just apply 1.5V directly: even if it seems to work, you won't know how much current the LED will draw, and the brightness will vary wildly from LED to LED because of part-to-part differences and variation in the supply's output voltage. Chances are you'd still burn out the LED.
What you want to do is the following:
- Get a 3V or 3.3V wall wart
- Decide how much current you want to supply the LED - say 100mA for yours
- Know the LED voltage (say 1.5V)
- Calculate the resistor needed: (Vsupply-Vled)/(Current) where current is in amps
So, for this example, with a 3.3V wall wart, to power the LED, you want (3.3-1.5)/(0.1) = 18 ohms.
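The steps above boil down to one line of Ohm's law. Here's a small sketch of the calculation, using the same example numbers from this post (3.3V supply, 1.5V LED, 100mA target); plug in your own values:

```python
def led_resistor(v_supply, v_led, i_led):
    """Series resistor for an LED: R = (Vsupply - Vled) / I, current in amps."""
    return (v_supply - v_led) / i_led

# Example from the post: 3.3V wall wart, 1.5V LED, 100mA target current.
r = led_resistor(3.3, 1.5, 0.100)
print(round(r, 1))  # 18.0 ohms
```

In practice you'd round to the nearest standard resistor value at or above the computed one, which keeps the current at or below your target.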
Check the LED voltage vs. current graph below. See how steep the curve is on the right of the graph? If the voltage is a little too low, no current flows and the LED won't turn on; if it's a little too high, too much current flows and the LED burns out. That's why you need a resistor to control the current (or a constant-current driver, which is more complicated).
http://upload.wikimedia.org/wikipedia/commons/a/a5/Diode-IV-Curve.svg
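To put a number on how steep that curve is, here's a sketch using the ideal Shockley diode equation, I = Is·(exp(V/(n·Vt)) − 1). The saturation current and ideality factor below are illustrative assumptions, not datasheet values for any particular LED:

```python
import math

def diode_current(v, i_s=1e-12, n=2.0, v_t=0.026):
    """Ideal Shockley diode equation. i_s (saturation current) and
    n (ideality factor) are illustrative values, not from a datasheet."""
    return i_s * (math.exp(v / (n * v_t)) - 1.0)

# Each 0.1V step up the curve multiplies the current by roughly
# exp(0.1 / (n * v_t)) ~ 7x with these parameters - a tiny voltage
# error becomes a huge current error, which is why you control
# current with a resistor instead of setting a voltage.
for v in (1.4, 1.5, 1.6):
    print(f"{v:.1f} V -> {diode_current(v):.3g} A")
```

The exact numbers depend on the parameters chosen, but the exponential shape, and hence the need for current limiting, does not.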