Electricity Questions!! (how it works)

StageLeft

No Lifer
I am aiming to rig up some LEDs on an RC plane and as such have to get back to understanding the basics of electricity. Imagine, I'm not even being paid or made to do this!

I do not understand current/voltage properly. I know that:

Power(P) = Voltage(V) * Current(I)
and
I = V/Resistance(R) <- Ohm's law

So, a 60 Watt lightbulb running at 120 Volts is also going at .5 Amps (60=120*.5).
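Here's that arithmetic as a quick Python sketch, just to check the two formulas against the 60 W bulb example:

```python
# Sanity-check the 60 W household bulb example using the two formulas above.
V = 120.0      # volts (fixed household supply)
P = 60.0       # watts (bulb rating)

I = P / V      # from P = V * I  ->  0.5 A
R = V / I      # Ohm's law, rearranged  ->  240 ohms

print(I, R)    # 0.5 240.0
```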

First question: What dictates a lightbulb's wattage? A house lightbulb has a fixed 120 Volts, and since its power (watts) are based on the volts and current, a lightbulb must be setting its own current levels so that when the fixed 120 V is put through it, its power of 60 W is figured. Now, Ohm's law says that if we know its Voltage (120) and its current (.5), V/I = R and its resistance would be 240 Ohms. If this is the case, the only difference between a 60W and 120W lightbulb would be that when the 120 W bulb was built, it was designed with a lower resistance (120 ohms) through choosing its wire materials/thickness of wires, etc., right? Does this new 120 W lightbulb have a 120 Ohm resistance only at 120 W or is that a hard set-in-stone resistance? I presume the latter, since radioshack sells resistors that have ohm ratings and do not seem to say what voltage those ohms are at.

Second question: Is a lightbulb or an LED or anything else of similar simpleness pre-set in its current and always will "ask" out of an electric circuit for that amount of current? In other words, the 60W bulb will always be asking for .5Amps and the 120W will always ask for 1 A so that if I took a 60W bulb and shot 240 volts through it, it would now have 120 watts going through it? OR, is it always pre-set in its resistance, so a 60W bulb knows it's 60W because its maker knew it would have 120 V and gave it an appropriate resistance to create the right current that would make 60 W? In this way, a 60 W lightbulb at 120 V would end up being 240 W at 240 volts because a doubling in volts also doubles its current, so now instead of .5A and 120 V for 60 W, it's 1 A (240 V/240 Ohms) and 240 V, so 240 W.

Third question: In the case of an LED like this (http://******/Product/led/LE4002.htm <- replace the stars with the word 'ledshoppe') would I treat its internal resistance as effectively 0 and that's how I would change its current (since it talks on there about the thing's continuous and max current)? So, if I hooked a 4 V battery directly up to the LED, it may immediately, with its low resistance, ask for a ton of power and burn itself up right away, which would mean that by choosing the correct resistance and playing with those current figures, I could get the power I want?

Fourth Question: Can resistors be run in series? I presume so; 120 ohm + 120 ohm = 240 ohm

Fifth Question: Can LEDs take unlimited voltage? I know they won't run below their forward voltage, but if I have a typical LED and run a million volts through it, will it work as long as I have a miniscule current (so, vast amounts of resistance) so that the power is still in the range it likes?

Thanks! I knew this was the place to come to. I can only hope I'm not actually as stupid as I appear and that perhaps in the past I understood this innately, but I won't rely on that hope 🙂
 
Resistance is set. You give it a voltage, it grabs the current the resistance dictates. Don't know LED stuff.
 
First question: Yes, the higher wattage bulb has lower resistance, and therefore draws more current, and yes, resistance should be set in stone regardless of voltage... I'm sure there are exceptions to that last bit though.

Second question:
OR, is it always pre-set in its resistance
Yes. It 'draws' current based on the voltage and resistance.

Third question: No, its resistance is not zero. If it were zero, it wouldn't be taking any energy from the electricity to make light, right? It is quite low, though. You can blow it up, but 4 V won't do it. In fact, 4 V is about right.

Fourth Question: Yes

Fifth Question: Hell no.

 
#1 - Yes, the wattage is set by the manufacturer's choice of filament.
#2 - Yes - it has a fixed resistance (see #1). If you feed it double its design voltage it will allow double the design current to flow through, consuming four times the wattage - but only briefly!! The filament has to get rid of all the light and heat generated as it operates. It will never be able to radiate four times as much heat as its design allows, so it will get so hot it will melt and "burn out" quickly.
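To make the "four times the wattage" arithmetic concrete, here's a quick Python sketch (assuming the resistance stays fixed at the 240 ohm design value, which as noted only holds briefly before the filament burns out):

```python
R = 240.0                  # design resistance of a 60 W / 120 V bulb (assumed fixed)
for V in (120.0, 240.0):   # design voltage vs. double the design voltage
    I = V / R              # Ohm's law
    P = V * I
    print(f"{V:.0f} V -> {I:.2f} A, {P:.0f} W")
# Doubling the voltage doubles the current, so the power goes up four-fold:
# 120 V -> 0.50 A, 60 W
# 240 V -> 1.00 A, 240 W
```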
#3 - No, a LED has an internal resistance, too, but it applies only when the forward voltage exceeds the threshold. (Below a minimum, no current will flow.) I think they operate as a linear current vs voltage device if you take the "voltage" as the amount of applied voltage IN EXCESS of the minimum forward voltage. The specs should specify both a max voltage and a max current. Exceeding these will burn it out, similar to a light bulb.
#4 - Definitely yes. The rule is that the total resistance of several resistances in series is simply the sum of the individual resistances. If you place them in parallel, a VERY different rule applies, but it all comes out of the understanding of current distribution in a network.
#5 - See #3. How did you think you could run a million volts through a device and limit its current to miniscule at the same time?

MAYBE related to #5: In many simple circuits the way to use a LED as an indicator light is to place it IN SERIES with a resistor across the voltage source. The voltage may be significantly higher than the LED's proper forward voltage. But given the effective resistance of the LED, you do a calculation. Suppose the LED says it can take up to 3.0 V, and its max forward current is 30 mA, or 0.030 A. Now, you want to put it across a supply voltage of 12 V to verify the supply is on. The series resistor will have flowing through it EXACTLY the same current as the LED (0.030 A max, remember?). And in order for the LED to have 3.0 V max across it, the resistor must drop the rest (12 - 3 = 9 V) across it. The resistor value must be V/I = 9/0.030 = 300 ohms. Now, that's running the LED at full power, and it may be quite bright. You could choose to put in a higher-value resistor. That way the LED will still operate, but be less bright and less likely to fail in 5 years or so. You just can't put in a resistor so high that it reduces the LED's share of the voltage to less than its minimum - then the circuit becomes really weird.
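That series-resistor recipe is simple enough to write down as a couple of lines of Python (the helper name is just made up for illustration; the numbers are the ones from the worked example above):

```python
def series_resistor(v_supply, v_led, i_led):
    # The resistor must drop whatever supply voltage is left over after the
    # LED's forward drop, at the LED's chosen operating current.
    return (v_supply - v_led) / i_led

# 12 V supply, 3.0 V LED, 30 mA max forward current:
print(series_resistor(12.0, 3.0, 0.030))  # 300.0 ohms
```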
 
Originally posted by: Skoorb
First question: What dictates a lightbulb's wattage? A house lightbulb has a fixed 120 Volts, and since its power (watts) are based on the volts and current, a lightbulb must be setting its own current levels so that when the fixed 120 V is put through it, its power of 60 W is figured. Now, Ohm's law says that if we know its Voltage (120) and its current (.5), V/I = R and its resistance would be 240 Ohms. If this is the case, the only difference between a 60W and 120W lightbulb would be that when the 120 W bulb was built, it was designed with a lower resistance (120 ohms) through choosing its wire materials/thickness of wires, etc., right?

That's exactly right. The mains voltage is fixed - the current (and, therefore, power) are therefore dependent on the resistance of the bulb. The higher power bulb is designed to have a lower resistance filament (shorter length of thicker wire).

In fact, it's a bit more complicated with light bulbs, because resistance is affected by temperature - and the resistance of the filament is much higher at operating temperature than when cold. The manufacturers perform their calculations so that the resistance is correct when the bulb is operating at its 'normal' temperature.

Does this new 120 W lightbulb have a 120 Ohm resistance only at 120 W or is that a hard set-in-stone resistance? I presume the latter, since radioshack sells resistors that have ohm ratings and do not seem to say what voltage those ohms are at.

For incandescent light bulbs where there is a huge temperature variation (the filament runs at something like 4500 F), things are complicated. If you increase the voltage, the current and power increase, which causes the filament to get hotter, which makes its resistance increase - partially cancelling out the rise in current/power. If you took a 120 W bulb and connected a normal multi-meter to it in 'resistance' mode, the meter would probably read about 20 ohms - that's because the filament is colder than normal operating temperature.

Proper resistors, like you get at radioshack, are designed so that their resistance is fixed and doesn't change significantly over their recommended temperature range.

Second question: Is a lightbulb or an LED or anything else of similar simpleness pre-set in its current and always will "ask" out of an electric circuit for that amount of current? In other words, the 60W bulb will always be asking for .5Amps and the 120W will always ask for 1 A so that if I took a 60W bulb and shot 240 volts through it, it would now have 120 watts going through it? OR, is it always pre-set in its resistance, so a 60W bulb knows it's 60W because its maker knew it would have 120 V and gave it an appropriate resistance to create the right current that would make 60 W? In this way, a 60 W lightbulb at 120 V would end up being 240 W at 240 volts because a doubling in volts also doubles its current, so now instead of .5A and 120 V for 60 W, it's 1 A (240 V/240 Ohms) and 240 V, so 240 W.

A lightbulb (see above) is a bad example because it is a complex system. But, ignoring the complexity, your calculations are correct.

A better example would be a water heater, because its temperature is relatively constant. Let's say you've got a 3 kW heater, designed for 240 V. It has a resistance of about 19 ohms (240^2 / 3000 = 19.2).

If you incorrectly connected it to a 120 V supply, the resistance would still be about 19 ohms. I = V / R; I = 120 / 19.2 = 6.25 A; Power = V * I = 750 W. Half the voltage gives half the current, so you get a quarter of the design power.
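The same heater arithmetic as a quick Python sketch, using the exact design resistance V^2 / P:

```python
V_design, P_design = 240.0, 3000.0
R = V_design**2 / P_design   # design resistance: 19.2 ohms

V = 120.0                    # wrongly connected to half the design voltage
I = V / R                    # 6.25 A
P = V * I                    # 750 W - a quarter of the 3000 W design power
print(R, I, P)               # 19.2 6.25 750.0
```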

LEDs are a special case, and a different analysis is required - the behavior of LEDs confuses a lot of people because LEDs have very low resistance (so low that in most calculations it can be assumed to be zero); instead, they have a 'voltage drop'. Diodes, for reasons too complicated for me, require a certain voltage before any current will flow, and will absorb that voltage once current starts flowing.

Let's say you have a blue LED that has a voltage drop (at normal currents) of 3 V. You connect it to a 5 V supply and a resistor. The diode absorbs 3 V, leaving only 2 V in the circuit to pass through the resistor. However, the diode itself has no resistance of note, so the total current in the circuit is determined by the remaining voltage and the resistor. So, if the LED was rated for 0.02 A, you would need a 100 ohm resistor to limit the current to the appropriate level: (5 V - 3 V) / 0.02 A = 100 ohms.

If you didn't have the resistor, the low resistance of the LED would allow a massive current to flow, and the LED would burn out.

Third question: In the case of an LED like this (http://******/Product/led/LE4002.htm) would I treat its internal resistance as effectively 0 and that's how I would change its current (since it talks on there about the thing's continuous and max current)? So, if I hooked a 4 V battery directly up to the LED it may immediately with its low resistance, ask for a ton of power and burn itself up right away, which would mean that for me to get the correct resistance and play with those current figures, I could get the power I want?

That's correct. But see above, LEDs have a 'voltage drop' which needs to be considered as well.

Manufacturers provide a recommended current for LEDs - so you should perform your calculations to aim for the recommendation. Too much current, and they'll burn. Too little, and the color may change slightly and the brightness may be unpredictable. Although high quality LEDs often come in 'matched' packs (very similar brightness and hue), they are only matched at their recommended power, and the matching isn't as good if you use currents which are very different from the recommendation.

Fourth Question: Can resistors be run in series? I presume so; 120 ohm + 120 ohm = 240 ohm

Yup.


Fifth Question: Can LEDs take unlimited voltage? I know they won't run below their forward voltage, but if I have a typical LED and run a million volts through it, will it work as long as I have a miniscule current (so, vast amounts of resistance) so that the power is still in the range it likes?

Yup. LEDs have a relatively fixed 'voltage drop' and it's the current that matters. As long as you control the current, you'll be fine.

The problem you may run into is heating of the resistor. Let's say you've got 100 V, and an LED with a 3 V drop and a recommended current of 0.02 A. You'll need a resistor of (100 - 3) / 0.02 = 4850 ohms.

Calculate power in the resistor (I^2 * R): 0.02^2 * 4850 ≈ 1.9 W. That means your resistor will be producing nearly 2 W of heat - you will need to make sure that your resistor is big enough to get rid of the heat, otherwise it will overheat and burn.

Another helpful tip: Don't forget that LEDs can be connected in series - if you do this the voltage drops add.

Let's say you wanted 30 LEDs and had a 100 V supply. You could replicate the circuit above 30 times. You'd need 30 resistors, and the resistors would waste a total of nearly 60 W of power as heat.

Alternatively, you could connect the 30 LEDs in series. This would act like 1 diode with 90 V drop. This way you can use 1 resistor with a (100 - 90) / 0.02 = 500 Ohm resistance. Not only do you only need 1 resistor, but it can be smaller because it only needs to handle 0.2 W of heat.
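Here's that comparison as a quick Python sketch, using the values from the example (100 V supply, 3 V / 20 mA LEDs):

```python
v_supply, v_led, i_led, n = 100.0, 3.0, 0.02, 30

# Option 1: each of the 30 LEDs gets its own series resistor.
r_each = (v_supply - v_led) / i_led        # 4850 ohms per branch
p_total = n * i_led**2 * r_each            # ~58 W of heat across 30 resistors

# Option 2: all 30 LEDs in one series string, sharing one resistor.
r_string = (v_supply - n * v_led) / i_led  # (100 - 90) / 0.02 = 500 ohms
p_string = i_led**2 * r_string             # only 0.2 W of heat
print(r_each, p_total, r_string, p_string)
```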
 
Great, these responses all make sense and I understand it now.

So, 6V battery and a 4V LED, I can ignore the internal resistance of the LED and if I want a certain current, calculate the resistance I need based on that 2V discrepancy. However, based on a calculator I saw online, the power going through the LED will always be based on its 4V, so 4V * whatever current I pick = power going through LED. Therefore, a 4V LED with a 6V battery at 20 mA will be less bright (80 mW) than a 6V LED with an 8V battery and a 20 mA current (120 mW).

Also, since forward voltage is given as a range for LEDs rather than a set figure, e.g. 3.8-4.2 V, it seems that if I had a 4.3 V battery and this LED, getting the correct resistance would be almost luck, since 4.3 - 3.8 gives a far different resistor than 4.3 - 4.2. The larger the difference between battery and LED voltage, the more accurately I could set the current, though at the expense of watts lost through the resistor; with a 500 V battery I could nail down the exact current I want for that LED, but lots of juice would be lost in the resistor.
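Playing with the numbers in Python to see how much that Vf spread matters (hypothetical 4.3 V battery, resistor sized for 20 mA at the middle of the 3.8-4.2 V range):

```python
v_batt = 4.3
r = (v_batt - 4.0) / 0.02      # 15 ohms, sized assuming Vf = 4.0 V

# Actual current if this particular LED's forward voltage lands at either end:
for vf in (3.8, 4.0, 4.2):
    i = (v_batt - vf) / r
    print(f"Vf = {vf} V -> {i*1000:.1f} mA")
# With only ~0.3 V of headroom, the current swings from ~33 mA down to ~7 mA.
```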
 
Originally posted by: Skoorb
Therefore, a 4V LED with a 6V battery at 20 mA will be less bright (80 mW) than a 6V LED with an 8V battery and a 20 mA current (120 mW).

LED power isn't a frequently specified parameter, except for thermal management. Brightness, voltage drop, and color of an LED are highly dependent on the exact performance of the die - so there's no guarantee that an LED with a higher voltage drop (and therefore higher power at the same current) is brighter. It's just luck.

Also, since forward voltage is given as a range for LEDs rather than a set figure, e.g. 3.8-4.2 V, it seems that if I had a 4.3 V battery and this LED, getting the correct resistance would be almost luck, since 4.3 - 3.8 gives a far different resistor than 4.3 - 4.2. The larger the difference between battery and LED voltage, the more accurately I could set the current, though at the expense of watts lost through the resistor; with a 500 V battery I could nail down the exact current I want for that LED, but lots of juice would be lost in the resistor.

Yup. That's a problem. Easiest way to deal with this is to calculate the worst case scenario and aim just below the absolute maximum allowable power. If you have more time then you can find the appropriate resistor for each LED individually.

I built a whole load of LED lights a while ago, and I used 3 LEDs in series on a 12V supply. This meant the resistor had to be spot on (there was only about 1V across the resistor). Thankfully, I'd bought LEDs which came in matched boxes of 60 - the LEDs in each pack were precisely matched for voltage drop and brightness. This meant I only needed to work out the resistance once for each box of LEDs.

 
Just for point of fact, NO, there is no way to run a LED with a million volts; you simply cannot "only draw a small amount of current". For starters, the electrical connections would all arc at that voltage and the device would explode. Even at much more reasonable voltages you will reach the breakdown point of the semiconductor, whereby the electric field is sufficient to free carriers, causing a runaway effect which drastically reduces the resistance of the LED, inevitably leading to an explosion (tbh pretty much everything experiencing that high of an electric field will break down in one way or another, resulting in explosion).
 
You all better answer these perfectly... for this is a question from the famed Skoorb.

If you give an incorrect answer, you will be shunned by all message board communities and shall spend the rest of your lives lurking!
 
Originally posted by: SonnyDaze
Will this plane be used on a treadmill? 😛
My mental faculties are too dull nowadays to get the joke 🙁

--

Hey, is it easy to buy a battery eliminator circuit (think that's the term), so that I could take a varying 7.2 V pack (varying because voltage drops as batteries are emptied) and get a constant 5V out of it? I know that quality receivers in RC planes have these, but I'm trying to put this on a low-end one that has an integrated circuit board and I don't know what it's putting out, so I'd rather just run a bunch of LEDs out of its battery pack.

 
Originally posted by: Skoorb
Originally posted by: SonnyDaze
Will this plane be used on a treadmill? 😛
My mental faculties are too dull nowadays to get the joke 🙁

--

Hey, is it easy to buy a battery eliminator circuit (think that's the term), so that I could take a varying 7.2 V pack (varying because voltage drops as batteries are emptied) and get a constant 5V out of it? I know that quality receivers in RC planes have these, but I'm trying to put this on a low-end one that has an integrated circuit board and I don't know what it's putting out, so I'd rather just run a bunch of LEDs out of its battery pack.

LM7805 3 terminal regulator should do the trick.
 
Originally posted by: 0
Originally posted by: Skoorb
Originally posted by: SonnyDaze
Will this plane be used on a treadmill? 😛
My mental faculties are too dull nowadays to get the joke 🙁

--

Hey, is it easy to buy a battery eliminator circuit (think that's the term), so that I could take a varying 7.2 V pack (varying because voltage drops as batteries are emptied) and get a constant 5V out of it? I know that quality receivers in RC planes have these, but I'm trying to put this on a low-end one that has an integrated circuit board and I don't know what it's putting out, so I'd rather just run a bunch of LEDs out of its battery pack.

LM7805 3 terminal regulator should do the trick.
I thought you were joking, but that seems it might do it, thanks 🙂
 
Originally posted by: Mark R
Originally posted by: Skoorb

Fifth Question: Can LEDs take unlimited voltage? I know they won't run below their forward voltage, but if I have a typical LED and run a million volts through it, will it work as long as I have a miniscule current (so, vast amounts of resistance) so that the power is still in the range it likes?

Yup. LEDs have a relatively fixed 'voltage drop' and it's the current that matters. As long as you control the current, you'll be fine.


That's incorrect. The voltage would break down the LED's semiconductor material and fry it, regardless of the current being passed through.

Even if it was just static electricity with hardly any current, you could still break down the semiconductor.
 
Originally posted by: 91TTZ
Originally posted by: Mark R
Originally posted by: Skoorb

Fifth Question: Can LEDs take unlimited voltage? I know they won't run below their forward voltage, but if I have a typical LED and run a million volts through it, will it work as long as I have a miniscule current (so, vast amounts of resistance) so that the power is still in the range it likes?

Yup. LEDs have a relatively fixed 'voltage drop' and it's the current that matters. As long as you control the current, you'll be fine.


That's incorrect. The voltage would break down the LED's semiconductor material and fry it, regardless of the current being passed through.

Even if it was just static electricity with hardly any current, you could still break down the semiconductor.
I guess he means within the realistic confines I'm talking about, although my hypothetical 1 million volts has been said to end up in a dead LED!

 