Converting voltage question - please help a geek

Fuelrod

Senior member
Jul 12, 2000
369
0
76
I'm sure this is a stupid question for you EE majors out there, but please help a simple-minded geek. I have a red LED that is rated for 1.7 volts (2.4 max) and 20 mA. I would like to tap into the 3.3 volt line on my power supply to power the LED. Is there a resistor I can buy from Radio Shack to somehow put inline with the LED to drop the voltage, or am I in over my head and should I go back to playing with Legos and stay the hell away from electricity? Help.

blahblah99

Platinum Member
Oct 10, 2000
2,689
0
0
The voltage drop across the LED is about 2.1 V, so you need a resistor to drop the remaining 1.2 V.

Since the LED should only carry about 20 mA, 1.2 V / 20 mA = 60 ohms. Pick a standard resistor value greater than 60 ohms but closest to it.
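A minimal Python sketch of that arithmetic, assuming the 2.1 V drop and 20 mA figures from this post (the standard-value list is the common E12 series):

```python
# Series resistor for the LED, using the numbers from this post.
V_SUPPLY = 3.3  # volts, the PSU rail
V_LED = 2.1     # volts, assumed forward drop of a red LED
I_LED = 0.020   # amps, target LED current (20 mA)

r_exact = (V_SUPPLY - V_LED) / I_LED  # Ohm's law: R = V / I
print(f"Exact resistance: {r_exact:.0f} ohms")  # 60 ohms

# Pick the nearest standard E12 value at or above the exact figure,
# so the actual current stays at or below 20 mA.
E12 = [56, 68, 82, 100, 120, 150, 180, 220]
r_standard = min(r for r in E12 if r >= r_exact)
print(f"Nearest standard value above: {r_standard} ohms")  # 68 ohms
```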

cressida

Platinum Member
Sep 10, 2000
2,840
5
81
Originally posted by: blahblah99
The voltage drop across the LED is about 2.1 V, so you need a resistor to drop the remaining 1.2 V.

Since the LED should only carry about 20 mA, 1.2 V / 20 mA = 60 ohms. Pick a standard resistor value greater than 60 ohms but closest to it.

uhm... wouldn't the 3.3 volt line burn out the LED, since its maximum voltage rating is 2.4? Or do I need to review my electronics? :(

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
Originally posted by: cressida
Originally posted by: blahblah99
The voltage drop across the LED is about 2.1 V, so you need a resistor to drop the remaining 1.2 V.

Since the LED should only carry about 20 mA, 1.2 V / 20 mA = 60 ohms. Pick a standard resistor value greater than 60 ohms but closest to it.

uhm... wouldn't the 3.3 volt line burn out the LED, since its maximum voltage rating is 2.4? Or do I need to review my electronics? :(

V = IR
A diode effectively has almost no resistance of its own, but it creates a fixed voltage drop, so without a resistor a huge amount of current will flow if you just connect it to 3.3 V.

With the resistor, you now have something that limits the current, and it won't burn.
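A rough sketch of that idea, modeling the LED as a fixed ~2.1 V drop plus a small internal resistance (the 1 ohm figure here is only an illustrative guess, not a datasheet value):

```python
# Why the series resistor matters: model the LED as a ~2.1 V drop
# plus a tiny internal resistance, then vary the series resistor.
V_SUPPLY = 3.3
V_DROP = 2.1
R_INTERNAL = 1.0  # ohms -- illustrative guess, not a datasheet value

for r_series in (0, 68, 120):
    i = (V_SUPPLY - V_DROP) / (R_INTERNAL + r_series)  # Ohm's law
    print(f"R = {r_series:>3} ohms -> {i * 1000:.0f} mA")

# R =   0 ohms -> 1200 mA  (far past the LED's 20 mA rating)
# R =  68 ohms -> 17 mA
# R = 120 ohms -> 10 mA
```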

TLfromAI

Senior member
Jun 22, 2002
379
0
0
Originally posted by: CTho9305
Originally posted by: cressida
Originally posted by: blahblah99
The voltage drop across the LED is about 2.1 V, so you need a resistor to drop the remaining 1.2 V.

Since the LED should only carry about 20 mA, 1.2 V / 20 mA = 60 ohms. Pick a standard resistor value greater than 60 ohms but closest to it.

uhm... wouldn't the 3.3 volt line burn out the LED, since its maximum voltage rating is 2.4? Or do I need to review my electronics? :(

V = IR
A diode effectively has almost no resistance of its own, but it creates a fixed voltage drop, so without a resistor a huge amount of current will flow if you just connect it to 3.3 V.

With the resistor, you now have something that limits the current, and it won't burn.

He's right...go EEs!

Superdoopercooper

Golden Member
Jan 15, 2001
1,252
0
0
Just a note... unless your LED's datasheet gives a recommended operating current of 20 mA, I would go much lower than that. Go for 10 mA, which means you'll double the resistor value calculated above.

The advantage is that with 50% less current you won't see THAT much difference in brightness (experiment to verify), but you'll allow your LED to live longer.

Remember... power = IV. So if you double the current, you double the power that thing has to dissipate. And the higher the constant current, the more likely you are to burn out the p-n junction (shorter life span).
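A quick sketch of that power comparison, assuming the ~2.1 V forward drop quoted earlier in the thread:

```python
# Power dissipated in the LED itself: P = I * V, per the post above.
V_LED = 2.1  # volts, assumed forward drop

for i_ma in (20, 10):
    p = (i_ma / 1000) * V_LED
    print(f"{i_ma} mA -> {p * 1000:.0f} mW in the LED")

# 20 mA -> 42 mW
# 10 mA -> 21 mW  (half the current, half the heat)
```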

Mday

Lifer
Oct 14, 1999
18,647
1
81
Go to Google.

Search for "LED resistor calculator" or something similar. =)

O_O

You don't want to tap the 3.3 V line; tap the 5 V line.

Get as close to 20 mA as possible. LEDs are fussy like that. And for the LED voltage, I would use 2 V in the calculation.

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
Originally posted by: Mday
Go to Google.

Search for "LED resistor calculator" or something similar. =)

O_O

You don't want to tap the 3.3 V line; tap the 5 V line.

Get as close to 20 mA as possible. LEDs are fussy like that. And for the LED voltage, I would use 2 V in the calculation.

What's wrong with 3.3 V? A single LED is an insignificant load...

neo4s

Member
Dec 21, 2002
83
0
0
LEDs can be confusing. They will drop about 1.7 volts almost regardless of what voltage you apply to them. LEDs offer essentially no resistance to current, so a resistor in series is always needed to limit the current.

Calculate the needed resistance:
resistance = (power supply voltage - 1.7) / 0.015

(the 0.015 is 15 milliamps; anywhere from 10 to 20 mA is fine)
For 3.3 volts you need a 100 ohm resistor (give or take a few ohms).
For 5 volts you need a 220 ohm resistor.

Calculate the needed power dissipation of the resistor:
power = (power supply voltage - 1.7) * 0.015

A little 1/4 watt resistor will work fine for 3.3 or 5 volt power supplies.

The 5 volt output on the power supply is much easier to get to than the 3.3 volt one. Either voltage will work just fine, so I'd recommend the 5 volt. LEDs have a positive and a negative lead; if it doesn't work one way, just turn it around. If the LED gets hot or is glowing a weird color, check your supply voltage and make sure it's not connected to the 12 volt line. Hope this clears everything up.
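Those two formulas drop straight into a small Python calculator. This is just a sketch using the 1.7 V drop and 15 mA figures from the post above; real LEDs vary, so check your part's ratings:

```python
# LED series-resistor calculator, following the two formulas above.
V_LED = 1.7    # volts, assumed LED forward drop
I_LED = 0.015  # amps (15 mA; anywhere in 10-20 mA is fine per the post)

def led_resistor(v_supply: float) -> tuple[float, float]:
    """Return (resistance in ohms, resistor power in watts)."""
    resistance = (v_supply - V_LED) / I_LED
    power = (v_supply - V_LED) * I_LED  # dissipated in the resistor
    return resistance, power

for rail in (3.3, 5.0):
    r, p = led_resistor(rail)
    ok = "fine" if p <= 0.25 else "too hot"
    print(f"{rail} V rail: ~{r:.0f} ohms, {p * 1000:.0f} mW ({ok} for a 1/4 W resistor)")

# 3.3 V rail: ~107 ohms, 24 mW (fine for a 1/4 W resistor)
# 5.0 V rail: ~220 ohms, 49 mW (fine for a 1/4 W resistor)
```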