Sorry if I missed it, but:
Is an LED even any more efficient once you consider what has to be built into an array of LEDs inside the 'bulb'? E.g. large and/or many resistors?
Good LED sources use a constant current driver, which will adjust its voltage so that the current into the LED(s) stays the same.
From LED to LED, the forward voltage varies, due to manufacturing process variations. Reds can go as low as 1.8V, and blue can go up to 4.00V. So if you were to apply 1.8V to a Rebel LED, you might push 350mA through it. Apply 4.00V to that same one, and you'll probably damage it - the voltage-vs-current curve is quite steep, so a tiny change in applied voltage can cause a large increase in current.
Even if you had an LED capable of taking 110V, wouldn't it basically be a short without other electrical components? LEDs don't have resistance, do they?
All things have resistance, except superconductors.
An incandescent filament is just a wire that's placed across the hot and neutral terminals of an outlet. It has a resistance, especially once it's hot, and that's what regulates its current.
Ohm's law: I=V/R
For that, it's perfectly linear. If the resistance stays the same, and the voltage changes, the current will change linearly.
For LEDs, it's not linear. For that low-voltage red LED I mentioned, going from 1.8V to 2.0V could easily double the current, up to 700mA. 4V might try to push 3000mA through it, which would either blow out the bond wires, or else start to damage the die.
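To make the contrast concrete, here's a small sketch comparing the resistor's linear curve against a diode-equation curve. The function names and constants are mine: the exponential slope is fitted so the outputs match the 350mA-at-1.8V and 700mA-at-2.0V figures above, lumping the ideality factor and internal series resistance into one effective number, so don't treat them as physical constants for any real LED.

```python
# Sketch contrasting a resistor's linear V-I curve with a diode's
# exponential one. Constants are fitted to the numbers in the post,
# purely for illustration - not datasheet values for a real LED.
import math

def resistor_current(v, r=5.14):
    # Ohm's law: I = V / R. Perfectly linear in V.
    return v / r

def led_current(v, i_s=6.84e-4, n_vt=0.2886):
    # Diode equation: I = Is * (exp(V / nVt) - 1). The effective
    # slope n_vt here is chosen so 1.8 V -> ~350 mA, 2.0 V -> ~700 mA.
    return i_s * math.expm1(v / n_vt)

for v in (1.8, 2.0):
    print(f"{v} V: resistor {resistor_current(v)*1000:.0f} mA, "
          f"LED {led_current(v)*1000:.0f} mA")
```

The resistor's current goes up about 11% for that 0.2V step; the LED's doubles. Push the model to 4V and it predicts an absurd current, which is roughly what happens to the real LED's bond wires.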
So, they don't quite work the same way as a resistor. They're specified with that forward voltage drop, Vf.
Meaning, if you apply power to it, the voltage on one side is going to be Vf less than on the other. The voltage "lost" there is used to push charge through the LED die. In the process, the electrons lose energy, which is converted to light and heat.
The other thing to watch for is that the forward voltage changes with temperature. Heat it up, and the forward voltage drops. The result is that the voltage required to push a certain current through the LED is reduced. So if you turn on the LED at 1.8V, you might get 350mA through it. But now it's heating up. Now the forward voltage is 1.7V, so the 1.8V still applied to it is now able to push 400mA through. So you've got more heat being generated. Down goes the Vf again. Now the effective Vf is 1.6V. Current jumps up again, and you've got thermal runaway in progress.
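That feedback loop can be sketched as a toy simulation. Every coefficient below is made up for illustration (a linear stand-in for the steep V-I curve, a fixed thermal resistance); it is not a model of any real LED, it just shows the runaway direction.

```python
# Toy model of thermal runaway at a fixed applied voltage. Hotter die
# -> lower Vf -> more current -> more heat -> hotter die. All numbers
# are invented for illustration.

V_APPLIED = 1.8      # volts, held constant by the supply
TEMPCO    = -0.002   # volts of Vf lost per degree C of die rise
THERMAL_R = 50.0     # degrees C of die rise per watt dissipated
SLOPE     = 5.0      # amps of extra current per volt of headroom above Vf

temp_rise = 0.0      # die temperature above ambient
currents = []
for step in range(5):
    vf = 1.8 + TEMPCO * temp_rise         # forward voltage sags with heat
    i = 0.35 + (V_APPLIED - vf) * SLOPE   # crude steep V-I curve stand-in
    temp_rise = THERMAL_R * vf * i        # die settles hotter each pass
    currents.append(i)
    print(f"step {step}: Vf={vf:.3f} V, I={i*1000:.0f} mA")
```

Each pass through the loop, Vf sags a little more and the current climbs; with a fixed-voltage supply there's nothing to stop it.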
Having a current-limiting resistor in series is one way of dealing with this, though having many LEDs in series means you can have a wide total voltage swing.
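Sizing that series resistor is simple arithmetic: the resistor drops whatever voltage the LED doesn't, and Ohm's law on that drop fixes the current. The supply voltage and target current below are my own example numbers.

```python
# Series current-limiting resistor: R = (Vsupply - Vf) / Itarget.
# The resistor absorbs the headroom above Vf, which sets the current.

def series_resistor(v_supply, v_f, i_target):
    return (v_supply - v_f) / i_target

# e.g. one red LED (Vf ~ 1.8 V) run at 20 mA from a 5 V supply:
r = series_resistor(5.0, 1.8, 0.020)
print(f"{r:.0f} ohms")   # 160 ohms
```

Note that if Vf sags to 1.7V as the LED warms up, the current only rises to about 20.6mA here; the resistor soaks up most of the change, which is why it tames the runaway.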
The ideal method is that constant-current power supply. It will sense that the load is suddenly drawing slightly more current than it was a moment ago, so it will automatically lower its output voltage to maintain the normal current output.
I know normal diodes don't (when forward-biased), but I've never quite understood what gets added to make light... obviously you can't make light energy out of nothing, so they have to consume some voltage, right?
It's the chemicals in them that convert electrical energy to light.
For high-brightness LEDs, there are two primary constructions that I usually see:
InGaN and AlInGaP:
indium gallium nitride (blue and green) and aluminum indium gallium phosphide (red and amber).
Different doping levels give the different colors.
And, it's "amber," which is orangish and not yellow, simply because that's all the AlInGaP tech can do. There's a gap right between these two chemistries, and that's where yellow is.
There is a GaP LED, gallium phosphide, that can do real yellow, but I've only ever seen them used as panel indicators; they just can't manage any appreciable brightness.
If you're buying LEDs by the reel, you can specify a bin. The best I've seen for amber is 584.5 - 587 nanometers, which is about as close to yellow as you're going to get right now, without mixing together red and green emitters.
A little more about that voltage drop: Say you've got a regular silicon diode. 0.7V drop. It doesn't matter how much voltage you're applying across it, the voltage on one side will be 0.7V lower than on the other. Then the amount of heat it generates is that 0.7V * the current in amps passing through. So it's not really like a resistor, where the voltage drop is dependent on the current. LEDs have the voltage drop, but it's not so constant, and it is indeed quite easy to kill an LED like that. (9V across a green LED will briefly give you an orange LED, assuming it doesn't blow itself apart like a tiny firecracker.)
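That fixed-drop behavior makes the dissipation arithmetic easy; a tiny worked example with my own current values, nothing from a datasheet:

```python
# Power dissipated in a diode with a roughly constant forward drop:
# P = Vf * I, regardless of how much supply voltage is doing the pushing.
VF = 0.7  # volts, typical silicon diode drop
for i in (0.1, 0.5, 1.0):  # amps
    print(f"{i} A -> {VF * i:.2f} W")
```

Compare a resistor, where doubling the current also doubles the voltage across it, so the power quadruples instead.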
I also hate how bright LEDs tend to be. I have covered up many an unnecessary LED on console systems and computer components (speakers; monitors have several; my case had a bunch inside the front fan I disconnected... just ripped the cords out). Some others, which I deem more important, I have tried covering in lithographer's tape. That was supposedly going to dull their brightness, but it didn't really... not much.
A black Sharpie will help a bit.
If that fails, electrical tape.
That's a matter of the manufacturers driving the LEDs hard, or else specifying bright ones. Bright blinky lights sell more stuff, I guess.
When I'm putting indicator LEDs on something, it's a balance of making sure they're visible to someone wiring up the product in daylight, but also so that they're not blinding if they're working at night.
When I was still in the Navy earlier this year I had a small barracks room and all 3 current gen consoles, a big computer, TV, alarm clock, etc... and the number of lights at night in that small space when I'd try to go to bed, was what inspired me to start covering shit up.
I'd use LED light bulbs only when they get to the point where they can be really dimmed. Are they there yet?
The problem is that triac dimming that's commonly used.
Dimming LEDs is darn easy: PWM, pulse width modulation. You turn the LED on and off really fast (I prefer at least 5kHz, so that I can't see the flickering), and as you vary the amount of time that the LED stays on, your eye registers that it's getting dimmer and brighter.
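The idea fits in a few lines. This is a software sketch only; `set_led()` is a hypothetical placeholder for whatever actually drives the pin, and real dimmers do this in hardware.

```python
# Minimal software-PWM sketch: the duty cycle (fraction of each period
# the LED is on) sets the perceived brightness.
import time

FREQ_HZ = 5000            # fast enough that the eye can't see the flicker
PERIOD = 1.0 / FREQ_HZ    # 0.2 ms per on/off cycle

def set_led(on: bool):
    pass  # hypothetical placeholder: swap in a real GPIO call

def pwm_cycle(duty, cycles=1):
    """duty: 0.0 (off) .. 1.0 (full brightness)."""
    for _ in range(cycles):
        set_led(True)
        time.sleep(PERIOD * duty)          # on for duty fraction of period
        set_led(False)
        time.sleep(PERIOD * (1.0 - duty))  # off for the rest

pwm_cycle(0.25, cycles=10)  # ~25% perceived brightness
```

Because the LED is always either fully on or fully off, it stays at its efficient operating point; you're varying time, not current.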
The problem is that we don't use dimmers that send out this kind of pulsing signal, so the power supplies need to be a little more complicated so that they can use the (intentionally) malformed AC waveform to tell the output that it needs to dim.
I meant complete setup as in a bulb that's ready to go into a regular E26 fixture. I checked out Philips' residential product line earlier and according to their product literature, it's all 2700K. Guess that shouldn't be a surprise, because one of the "negatives" of CFLs, or anything fluorescent, during the switchover was the deathly color temperature -- exactly what I'm looking for. I wouldn't use 4100K for a full room though, so bit of a niche.
Also, I hope they start frosting the lens over LED diodes more often. It's a lot of light for a small area, and Ikea (and probably the EU) throws a Class 2 "laser" warning on them. My 3D Maglite is even worse.
Ah, ok.
It's an interesting issue: The Sun's surface is in the 6000 Kelvin neighborhood, and our atmosphere scatters blue light all over the place, so you'd think we'd be used to that. Instead, we've grown accustomed to light that's about half that temperature, like it's an early sunset indoors. Maybe it's like the association we've formed between high framerate video and cheap daytime TV shows - so we're stuck, for now, with lots of movies that are at a slideshow-level 23.976fps.
Part of the problem is likely the lousy CRI of a lot of early CFL and LED bulbs, and the cheaper ones still on the market today. Even those Luxeon and XLamp emitters from Philips and Cree can go as low as 60 CRI.
Admittedly, I've come to like light around the 2700K-3000K range.
I'm actually in the process of constructing some manner of indoor light using some 90-95CRI 3000K Rebels, on my own circuitboard design. Damn bright little suckers. :awe:
Sure it's going to be a bit more expensive than your average floor lamp, and I'm not entirely sure how to handle the thermal situation....but it's also kind of a hobby. I've loved LEDs ever since at least middle school. It was high school when I bought my first blue LED; they'd only been in existence for a year or two at the time. So awesome.

Then a friend picked it up to see what the fuss was about, and hooked it straight to a 9V battery. *sigh* This is why we can't have blue things.