LED's efficiency exceeds 100%

Analog

Lifer
Jan 7, 2002
12,755
3
0


(PhysOrg.com) -- For the first time, researchers have demonstrated that an LED can emit more optical power than the electrical power it consumes. Although scientifically intriguing, the results won’t immediately lead to ultra-efficient commercial LEDs, since the demonstration works only for LEDs with very low input power that produce very small amounts of light.

The researchers, Parthiban Santhanam and coauthors from MIT, have published their study in a recent issue of Physical Review Letters.
As the researchers explain in their study, the key to achieving a power conversion efficiency above 100%, i.e., “unity efficiency,” is to greatly decrease the applied voltage. According to their calculations, as the voltage is halved, the input power is decreased by a factor of 4, while the emitted light power scales linearly with voltage so that it’s also only halved. In other words, an LED’s efficiency increases as its output power decreases. (The inverse of this relationship - that LED efficiency decreases as its output power increases - is one of the biggest hurdles in designing bright, efficient LED lights.)
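The scaling argument above can be sketched numerically. This is a toy model in arbitrary relative units (the starting powers are assumptions for illustration, not the paper's device parameters): input power is taken to go as V², emitted light power as V, exactly as the article describes.

```python
def scale(v_ratio, p_in, p_out):
    """Scale powers for a voltage change by v_ratio, per the article's
    description: input power goes as V^2, emitted light power goes as V."""
    return p_in * v_ratio**2, p_out * v_ratio

# Arbitrary starting point in relative units (illustrative assumption).
p_in, p_out = 4.0, 2.0
eff_before = p_out / p_in                  # 0.5, i.e. 50%

p_in2, p_out2 = scale(0.5, p_in, p_out)    # halve the applied voltage
eff_after = p_out2 / p_in2                 # 1.0 -- efficiency has doubled

print(eff_before, eff_after)
```

Halving the voltage cuts input power by 4x but light output by only 2x, so each halving doubles the efficiency, which is why the effect only shows up at tiny output powers.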
In their experiments, the researchers reduced the LED’s input power to just 30 picowatts and measured an output of 69 picowatts of light - an efficiency of 230%. The physical mechanisms worked the same as with any LED: when excited by the applied voltage, electrons and holes have a certain probability of generating photons. The researchers didn’t try to increase this probability, as some previous research has focused on, but instead took advantage of small amounts of excess heat to emit more power than consumed. This heat arises from vibrations in the device’s atomic lattice, which occur due to entropy.
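The measured figures check out as straightforward arithmetic; the 30 pW and 69 pW values are from the article, and the reading of the difference as heat drawn from the lattice follows the article's explanation.

```python
p_in_pw = 30.0    # electrical input, picowatts (from the experiment)
p_out_pw = 69.0   # optical output, picowatts (from the experiment)

efficiency = p_out_pw / p_in_pw     # 2.3 -> 230%
excess_pw = p_out_pw - p_in_pw      # 39 pW, drawn from lattice heat

print(f"{efficiency:.0%} efficient, {excess_pw:.0f} pW from heat")
```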
This light-emitting process cools the LED slightly, making it operate similar to a thermoelectric cooler. Although the cooling is insufficient to provide practical cooling at room temperature, it could potentially be used for designing lights that don’t generate heat. When used as a heat pump, the device might be useful for solid-state cooling applications or even power generation.
 

sdifox

No Lifer
Sep 30, 2005
100,484
17,955
126
Wait, what? So I just get a shitload of LEDs and drive them at low power?
 

blinblue

Senior member
Jul 7, 2006
889
0
76
How does it create more energy in light than it uses? Shouldn't that be breaking a few laws there?

 

Matthiasa

Diamond Member
May 4, 2009
5,755
23
81
It's taking energy from the surroundings, subsequently cooling them. (As the article mentions)
 

tboo

Diamond Member
Jun 25, 2000
7,626
1
81
Reverse engineering the recovered material from the Roswell crash is starting to pay off!
 

PowerEngineer

Diamond Member
Oct 22, 2001
3,606
786
136
Matthiasa said:
It's taking energy from the surroundings, subsequently cooling them. (As the article mentions)

If the temperature of the LED is 135 degrees C, I'm thinking that perhaps it is glowing hot! Heat it up a little more and maybe you can have light with no electrical input. :p
 

adlep

Diamond Member
Mar 25, 2001
5,287
6
81
It might have something to do with cold fusion...
...
...
...
...
 

Red Squirrel

No Lifer
May 24, 2003
70,649
13,827
126
www.anyf.ca

It works if you plug a UPS into itself. (no not really)

Actually I did that at work to transport a bunch of UPSes, as it held the cord wrapped around (plugged into the surge-strip part). My coworker freaked out a little when he saw that. :p
 

DrPizza

Administrator Elite Member Goat Whisperer
Mar 5, 2001
49,601
167
111
www.slatebrookfarm.com
Wait, what? You mean I can power my solar cells with LEDs, use some of that energy to power the LEDs and have leftover energy to run my television? That's awesome! No, wait, with all those LEDs on, I won't be able to see the television over the glare. Darn it.

<scampers off to actually read the paper.>
 

Analog

Lifer
Jan 7, 2002
12,755
3
0
DrPizza said:
Wait, what? You mean I can power my solar cells with LEDs, use some of that energy to power the LEDs and have leftover energy to run my television? That's awesome! No, wait, with all those LEDs on, I won't be able to see the television over the glare. Darn it.

<scampers off to actually read the paper.>


Yeah, if your TV runs in the picowatts.
 
May 11, 2008
22,598
1,473
126
What was the size of the LED? That's the big question here. Was it a normal LED, or a very small crystal? Because a picowatt is a really small number: 1/10^12 of a watt.

If it's very small, we would need about 10^10 LEDs for a 690 mW LED lamp, assuming 69 picowatts of output each.

But making use of thermal energy from the surroundings is very smart.
That also has drawbacks, I think. Look at the chart: the efficiency is higher at 25 degrees Celsius but drops off after the 1 milliwatt (10^-3 W) mark.

Assuming this effect holds as temperature decreases, it will become increasingly difficult to draw thermal energy from the surroundings.

But it is all theoretical science and may lead to a breakthrough when enough understanding of the subject is developed. All the magic is in the atomic lattice.
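The 10^10 count in the post above is easy to verify; the 690 mW target and the 69 pW per-LED output are the only inputs, and both come from the thread.

```python
target_w = 0.690       # desired lamp output: 690 mW
per_led_w = 69e-12     # measured optical output per LED: 69 pW

n_leds = target_w / per_led_w
print(f"{n_leds:.1e} LEDs")   # 1.0e+10
```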
 

Train

Lifer
Jun 22, 2000
13,587
82
91
www.bing.com
Couldn't you put, like, a million of these on a sheet and have a sort of ambient light made out of it, with the added effect of it cooling the room it's in?