Reverse Voltage on LEDs

Dznuts007

Senior member
Apr 26, 2000
I'm basically illiterate when it comes to electronics and electrical stuff, and I was wondering about reverse voltage on LEDs. I purchased some 9000 mcd LEDs and am planning on making a small night light/desk light with them. The forward voltage is 3.5V and the reverse voltage is 5V. Can someone explain this forward and reverse voltage to me in layman's terms (or shall I say lame man's terms)?

I have an adapter with a DC output of 4.5V, but I'm guessing that would be too much for the 3.5V LEDs. How do I go about utilizing the reverse voltage so that I can use this 4.5V adapter as a power source to light up the LEDs?

dznuts007@yahoo.com
 

Heisenberg

Lifer
Dec 21, 2001
LEDs only produce light with a forward voltage. The reverse voltage spec listed is most likely the maximum reverse voltage they can withstand before being destroyed. You should find a way to apply the correct forward voltage to them.
 

TuxDave

Lifer
Oct 8, 2002
Are there any other specs, like a max current rating? Maybe you can buy a resistor to put in series with the LED... hehe
 

Dznuts007

Senior member
Apr 26, 2000
Darn. I guess I'll have to find a power source that has a max rating of 3.5V then. Anyone know where I can find one of these?
 
FrustratedUser

Aug 16, 2001
Find out how much current they need, then put a resistor in series with the power supply.
For example:

Supply voltage = 5V
LED voltage = 3.5V
Voltage across the resistor = 1.5V

If the LED can take 20mA, then the resistor value is R = 1.5V/20mA = 1.5/0.020 = 75 ohms.
The power dissipated in the resistor is P = R*I*I = 75*0.02*0.02 = 30mW ---> a 1/4W resistor is plenty. 75 ohms is a standard value, so pick that.
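FrustratedUser's arithmetic can be wrapped in a small helper, if that makes the steps easier to follow (a minimal sketch in Python; the function name is mine, the numbers are from the post):

```python
def series_resistor(supply_v, led_v, led_ma):
    """Series resistor value (ohms) and its dissipation (watts) for one LED."""
    drop_v = supply_v - led_v        # voltage the resistor must drop
    r = drop_v * 1000.0 / led_ma     # Ohm's law R = V/I, with current in mA
    i = led_ma / 1000.0              # current in amps
    p = r * i * i                    # power dissipated: P = I^2 * R
    return r, p

r, p = series_resistor(5.0, 3.5, 20)
print(r)         # 75.0 ohms
print(p * 1000)  # ~30 mW, so a standard 1/4 W resistor has lots of headroom
```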
 

TuxDave

Lifer
Oct 8, 2002
Originally posted by: FrustratedUser
Find out how much current they need, then put a resistor in series with the power supply.
For example:

Supply voltage = 5V
LED voltage = 3.5V
Voltage across the resistor = 1.5V

If the LED can take 20mA, then the resistor value is R = 1.5V/20mA = 1.5/0.020 = 75 ohms.
The power dissipated in the resistor is P = R*I*I = 75*0.02*0.02 = 30mW ---> a 1/4W resistor is plenty. 75 ohms is a standard value, so pick that.

yup... sound advice... EE power!
 

Dznuts007

Senior member
Apr 26, 2000
Thanks for the info. Just one more question.

I got my LEDs today. They're actually rated 3.5V and 50mA. The power supply I found is 4.5V, so according to the formula V = IR,

I would use the difference between the power supply voltage and the LED voltage, right? That would be 1V.

So 1 = 0.050A x R, and R would be 20, meaning I need a 20 ohm resistor? I'm slowly learning the basics of electronics, so please bear with me. I graduated as a bio sci major, so we barely touched the subject in physics. Thanks!

Oh yeah... I'd like to set up 5 or 10 of these in a row. What are the issues I need to know about? Someone told me that putting them in parallel rather than in series would be better so that I won't lose some of the power. Do I still need a resistor for every single LED?
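For what it's worth, the 20 ohm figure checks out, and the parallel question can be sketched with the same arithmetic (Python, using the values in this post; the 10-LED count is just the example mentioned):

```python
# Numbers from the post above: 4.5 V supply, 3.5 V / 50 mA LEDs (assumed ratings).
supply_v, led_v, led_ma = 4.5, 3.5, 50

# One LED: R = (Vsupply - Vled) / I
r_single = (supply_v - led_v) * 1000.0 / led_ma
print(r_single)   # 20.0 ohms, the same answer worked out above

# Parallel wiring: give each LED its OWN 20 ohm resistor. LEDs sharing one
# resistor can hog current unevenly, so one resistor per LED is the safe
# layout. The supply then sees n times the single-LED current.
n = 10
total_ma = n * led_ma
print(total_ma)   # 500 mA total -> check the 4.5 V adapter's current rating
```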
 

Dznuts007

Senior member
Apr 26, 2000
Heh heh. Not using it on my computer, although I was thinking of adding some to my case. I'm basically using it to create a small desk lamp or possibly even using it to light up my nano tank that I have set up on my desk.
 

Evadman

Administrator Emeritus / Elite Member
Feb 18, 2001
Originally posted by: Jmmsbnd007
9,000 mcd? WOW. Those would rock for computers :p Linkage?

That is not a computer LED. It is a small lightbulb for your kitchen :)

For the most part, people tend to use the 12V supply (the molex plug for hard drives and such) to power LEDs. Few use the 5V even though it is on the same plug. Go figure.

12V - 3.5V = 8.5V, and 8.5V/50mA = 8.5/0.050 = 170 ohms if you use the 12V supply and take the LED to its limit. I do not know the closest standard resistor value to that one offhand, but for LEDs take the next higher ohm rating.

For the most part, LEDs have a 2.1V forward voltage and a 20mA rating. For those I use a 470 ohm resistor, which is a standard value. It overdrives the LED slightly, though.
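Evadman's "take the next higher ohm rating" rule can be sketched against the E12 standard resistor series (bringing in the E12 list is my addition, not something from the post; it is a common standard series of preferred values):

```python
# E12 standard resistor values for one decade (common preferred-value series).
E12 = [10, 12, 15, 18, 22, 27, 33, 39, 47, 56, 68, 82]

def next_standard(r):
    """Round up to the next E12 value, keeping LED current at or below target."""
    decade = 1
    while E12[-1] * decade < r:      # find the decade that can contain r
        decade *= 10
    for v in E12:
        if v * decade >= r:
            return v * decade
    return E12[0] * decade * 10      # r fell between 82*decade and 100*decade

# 12 V supply, 3.5 V LED at 50 mA, as computed in the post above:
r = (12 - 3.5) * 1000.0 / 50
print(r)                  # 170.0 ohms
print(next_standard(r))   # 180 -> the "next higher ohm rating"
```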
 

Jmmsbnd007

Diamond Member
May 29, 2002
Originally posted by: Evadman
Originally posted by: Jmmsbnd007
9,000 mcd? WOW. Those would rock for computers :p Linkage?

That is not a computer LED. It is a small lightbulb for your kitchen :)

For the most part, people tend to use the 12V supply (the molex plug for hard drives and such) to power LEDs. Few use the 5V even though it is on the same plug. Go figure.

12V - 3.5V = 8.5V, and 8.5V/50mA = 8.5/0.050 = 170 ohms if you use the 12V supply and take the LED to its limit. I do not know the closest standard resistor value to that one offhand, but for LEDs take the next higher ohm rating.
Imagine the brightness... :D