
Do LED lights flicker? (AC or DC current)

Originally posted by: SparkyJJO
Originally posted by: ICRS
What is this lighting for if you don't mind me asking?

Also I wonder why MH (Metal Halide) lights aren't used to light homes and such; their efficiency rating is as high as or higher than any fluorescent's. I think they give a nicer light too.

MH is more expensive, isn't it?

Yes, this could be the reason.
 
Originally posted by: EarthwormJim


DC current doesn't pulsate, hence direct current (vs alternating current).


It can still pulsate, and in many modern circuits it does. It doesn't reverse polarity like AC but it pulses the DC.
 
The LED taillights on many new cars have an obvious and annoying flicker to them. It's like looking at a CRT with the refresh set to 60Hz.

Cadillacs are an example.
 
Originally posted by: Special K
Originally posted by: Rubycon
Incandescent lights flicker at 60Hz, but due to the attack and decay times of the filament, the superimposed AC ripple is buffered enough that human eyes cannot perceive it. A silicon cell connected to a small amplifier and pointed at a bulb will reveal a hum, though.

LED's, OTOH, have MUCH faster attack and decay times - the measures of how fast they reach brightness and how fast they go dark, respectively. Thus, if connected to a power source that fluctuates, one is much more likely to detect aberrations in the incoming power. Many drivers will pulse the LED's to conserve power or, in the case of white LED's, to maintain a more constant tint over their operating brightness range. Most drivers will keep the pulse rate in the near-kHz range so that neither the flicker nor the possible stroboscopic effect on fast-moving objects becomes bothersome.

An LED driven directly from a battery will have no flicker whatsoever. If a battery light appears to flicker, it's because it's driven by a PWM converter - often the case because the LED needs a higher voltage than the battery can produce and/or the builder wanted regulated (constant-output) light over the life expectancy of the power cell.

My laptop's power LED will cycle between fully lit and fully off when my laptop is in standby mode. The increase/decrease in brightness of the LED seems linear to me. Given the steep IV curve of diodes, how is it possible to make an LED that behaves this way? Are they using a high-frequency PWM here, and the duty cycle determines how bright the LED appears to be? Did they fabricate an LED with a linear IV curve?

Almost certainly PWM. That's THE easiest way to drive an LED if you have a microcontroller available.
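The duty-cycle arithmetic behind that trick is simple enough to sketch. Below is a minimal Python model (the function name is just illustrative): at PWM rates well above flicker fusion, the eye integrates the pulses, so perceived brightness tracks the average current.

```python
def average_led_current(peak_ma: float, duty_cycle: float) -> float:
    """Average current through a PWM-driven LED.

    At PWM frequencies well above flicker fusion (typically near
    the kHz range, as noted above), the eye integrates the pulses,
    so perceived brightness tracks the time-averaged current:
    the peak drive current scaled by the duty cycle.
    """
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be between 0 and 1")
    return peak_ma * duty_cycle

# A 20 mA LED pulsed at 25% duty is driven at ~5 mA on average,
# so it appears dimmer than one driven continuously.
print(average_led_current(20.0, 0.25))  # -> 5.0
```

Varying the duty cycle over time is all a microcontroller needs to do to dim or pulse the LED.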
 
Originally posted by: 91TTZ
The LED taillights on many new cars have an obvious and annoying flicker to them. It's like looking at a CRT with the refresh set to 60Hz.

Cadillacs are an example.

Mainly Caddys. I haven't seen the issue on Infinitis or Lexuses. It's more than annoying; it downright makes me ill.
 
The other thing that can make a DC connected LED flicker (usually as part of a sign or array of LEDs) is that the array is multiplexed such that only a small number are actually active / on at a time. LED signs are usually done like this, even some multi-segment displays.

Muxing the LEDs reduces the necessary power (light up 7 instead of 700); the sweep frequency of the array is fast enough that it looks like all are lit.

FWIW
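The power savings from muxing are easy to put numbers on. A rough Python sketch (the sign size and 100Hz sweep rate are just example figures):

```python
def mux_stats(total_leds: int, leds_per_scan: int, scan_hz: float):
    """Duty cycle and dwell time for a multiplexed LED array.

    Only `leds_per_scan` LEDs are lit at any instant; the driver
    sweeps through the whole array `scan_hz` times per second,
    so each group is on for 1/groups of every sweep.
    """
    groups = total_leds // leds_per_scan       # scan positions per sweep
    duty = 1.0 / groups                        # on-time fraction per LED
    dwell_ms = 1000.0 / (scan_hz * groups)     # how long each group is lit
    return groups, duty, dwell_ms

# A 700-LED sign driven 7 at a time, swept 100 times per second:
# 100 groups, 1% duty per LED, 0.1 ms of light per group per sweep.
print(mux_stats(700, 7, 100.0))
```

Note the trade-off this exposes: at 1% duty each LED must be pulsed at a much higher peak current to look as bright as one driven continuously, which is why muxed signs use short, hard pulses.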
 
Originally posted by: ScottMac
The other thing that can make a DC connected LED flicker (usually as part of a sign or array of LEDs) is that the array is multiplexed such that only a small number are actually active / on at a time. LED signs are usually done like this, even some multi-segment displays.

Muxing the LEDs reduces the necessary power (light up 7 instead of 700); the sweep frequency of the array is fast enough that it looks like all are lit.

FWIW

Good point. I remember doing a lab experiment where we swept the frequency at which the segments of a 7-segment LED display were multiplexed. We found that flicker fusion, the frequency at which the flickering of the lights becomes imperceptible, occurred at only 30 Hz or so.
 
Originally posted by: Special K
Originally posted by: ScottMac
The other thing that can make a DC connected LED flicker (usually as part of a sign or array of LEDs) is that the array is multiplexed such that only a small number are actually active / on at a time. LED signs are usually done like this, even some multi-segment displays.

Muxing the LEDs reduces the necessary power (light up 7 instead of 700); the sweep frequency of the array is fast enough that it looks like all are lit.

FWIW

Good point. I remember doing a lab experiment where we swept the frequency at which the segments of a 7-segment LED display were multiplexed. We found that flicker fusion, the frequency at which the flickering of the lights becomes imperceptible, occurred at only 30 Hz or so.

But, then try moving the LEDs around. As they move, each moment they turn on they leave an imprint on your retina. Effectively, you see a row of dots instead of continuous movement. We just finished a device at work that has 6 LEDs for button illumination and at a 120Hz PWM rate you can easily tell they are flashing when you move the device.

You can see the same effect with an AC (60Hz) fan and traditional fluorescent bulbs (that also run at 60Hz); look through the fan at the light and it will appear to not move. The motion and the light are synchronized since they run from the same signal so the light always strikes the blades when they are in the same positions and you can't see them when they are in between those positions.
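The frozen-fan illusion is just sampling arithmetic. A small Python sketch (blade count and speeds are made-up illustrative numbers) reduces the rotation angle at each flash modulo the fan's blade symmetry; if every flash gives the same value, the blades look stationary:

```python
def apparent_blade_positions(rot_hz, strobe_hz, blades, flashes=8):
    """Blade pattern angle (degrees) seen at each light flash.

    A fan with `blades` blades looks the same every 360/blades
    degrees, so each flash's rotation angle is reduced modulo
    that symmetry. Identical values across flashes mean the
    fan appears frozen under the flickering light.
    """
    spacing = 360.0 / blades
    return [(i * rot_hz / strobe_hz * 360.0) % spacing
            for i in range(flashes)]

# A 4-blade fan at 15 rev/s under a 60Hz flicker advances exactly
# 90 degrees (one blade spacing) between flashes, so every flash
# looks identical and the blades appear to stand still.
print(apparent_blade_positions(15.0, 60.0, 4))
```

Detune the rotation speed slightly and the returned angles drift, which is the familiar slow forward or backward creep of the blades.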
 
Originally posted by: Special K

My laptop's power LED will cycle between fully lit and fully off when my laptop is in standby mode. The increase/decrease in brightness of the LED seems linear to me. Given the steep IV curve of diodes, how is it possible to make an LED that behaves this way? Are they using a high-frequency PWM here, and the duty cycle determines how bright the LED appears to be? Did they fabricate an LED with a linear IV curve?

This is normally done with PWM at a high enough frequency that you can't detect it. Here is a project that produces the same effect using PWM on a microcontroller. If you have a micro with good PWM functionality, it is a very simple trick. I've implemented it on PICs, where it was a piece of cake.

Apple has a patent on this, for laptop power indicators.
 
Originally posted by: Aluvus
Originally posted by: Special K

My laptop's power LED will cycle between fully lit and fully off when my laptop is in standby mode. The increase/decrease in brightness of the LED seems linear to me. Given the steep IV curve of diodes, how is it possible to make an LED that behaves this way? Are they using a high-frequency PWM here, and the duty cycle determines how bright the LED appears to be? Did they fabricate an LED with a linear IV curve?

This is normally done with PWM at a high enough frequency that you can't detect it. Here is a project that produces the same effect using PWM on a microcontroller. If you have a micro with good PWM functionality, it is a very simple trick. I've implemented it on PICs, where it was a piece of cake.

Apple has a patent on this, for laptop power indicators.

Interesting, I didn't realize you could patent something that trivial. Do you think Apple does it exactly as you described - with a PWM signal? My Dell laptop has what appears to be the same type of power indicator.
 
Originally posted by: bobsmith1492
Originally posted by: Special K
Originally posted by: ScottMac
The other thing that can make a DC connected LED flicker (usually as part of a sign or array of LEDs) is that the array is multiplexed such that only a small number are actually active / on at a time. LED signs are usually done like this, even some multi-segment displays.

Muxing the LEDs reduces the necessary power (light up 7 instead of 700); the sweep frequency of the array is fast enough that it looks like all are lit.

FWIW

Good point. I remember doing a lab experiment where we swept the frequency at which the segments of a 7-segment LED display were multiplexed. We found that flicker fusion, the frequency at which the flickering of the lights becomes imperceptible, occurred at only 30 Hz or so.

But, then try moving the LEDs around. As they move, each moment they turn on they leave an imprint on your retina. Effectively, you see a row of dots instead of continuous movement. We just finished a device at work that has 6 LEDs for button illumination and at a 120Hz PWM rate you can easily tell they are flashing when you move the device.

You can see the same effect with an AC (60Hz) fan and traditional fluorescent bulbs (that also run at 60Hz); look through the fan at the light and it will appear to not move. The motion and the light are synchronized since they run from the same signal so the light always strikes the blades when they are in the same positions and you can't see them when they are in between those positions.

Did you mean to say "traditional incandescent bulbs"? I thought fluorescent bulbs used a much higher frequency via their internal ballasts.

Also, what would cause an incandescent bulb to exhibit visible flickering? Normally you don't perceive the 60 Hz flicker of an incandescent bulb, but occasionally I will see one, usually in an industrial setting, that is visibly flickering at a frequency far lower than 60 Hz.
 
Originally posted by: Special K

Interesting, I didn't realize you could patent something that trivial. Do you think Apple does it exactly as you described - with a PWM signal? My Dell laptop has what appears to be the same type of power indicator.

Their patent describes using PWM, which is kinda surprisingly specific. LadyAda recorded the waveform (which is here but didn't show up for me properly) showing how they vary the brightness.

As I say, it's a pretty easy trick to pull off with a microcontroller's PWM output. And presumably they had a microcontroller in charge of the power light anyway. Really the hardest part is figuring out what sort of waveform you want the brightness to follow in order to look "natural".
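For illustration, one way to get a natural-looking fade - this is a guess at the general approach, not Apple's actual waveform - is a raised-cosine ramp with a gamma term to compensate for the eye's nonlinear brightness response:

```python
import math

def breathing_duty(t: float, period: float = 4.0, gamma: float = 2.2) -> float:
    """PWM duty cycle for a 'breathing' power LED at time t seconds.

    The raised cosine gives a smooth 0 -> 1 -> 0 ramp each period.
    The gamma exponent (2.2 is an assumed, screen-like value) shapes
    duty cycle so the fade *looks* even, since the eye's response
    to light intensity is far from linear.
    """
    phase = (1.0 - math.cos(2.0 * math.pi * t / period)) / 2.0  # 0..1..0
    return phase ** gamma

# Fully off at the start of the cycle, fully on at mid-period.
print(breathing_duty(0.0), breathing_duty(2.0))  # -> 0.0 1.0
```

Feeding each sample into a hardware PWM register on a timer tick is all the microcontroller then has to do.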
 
Originally posted by: Special K
Originally posted by: bobsmith1492
Originally posted by: Special K
Originally posted by: ScottMac
The other thing that can make a DC connected LED flicker (usually as part of a sign or array of LEDs) is that the array is multiplexed such that only a small number are actually active / on at a time. LED signs are usually done like this, even some multi-segment displays.

Muxing the LEDs reduces the necessary power (light up 7 instead of 700); the sweep frequency of the array is fast enough that it looks like all are lit.

FWIW

Good point. I remember doing a lab experiment where we swept the frequency at which the segments of a 7-segment LED display were multiplexed. We found that flicker fusion, the frequency at which the flickering of the lights becomes imperceptible, occurred at only 30 Hz or so.

But, then try moving the LEDs around. As they move, each moment they turn on they leave an imprint on your retina. Effectively, you see a row of dots instead of continuous movement. We just finished a device at work that has 6 LEDs for button illumination and at a 120Hz PWM rate you can easily tell they are flashing when you move the device.

You can see the same effect with an AC (60Hz) fan and traditional fluorescent bulbs (that also run at 60Hz); look through the fan at the light and it will appear to not move. The motion and the light are synchronized since they run from the same signal so the light always strikes the blades when they are in the same positions and you can't see them when they are in between those positions.

Did you mean to say "traditional incandescent bulbs"? I thought fluorescent bulbs used a much higher frequency via their internal ballasts.

Also, what would cause an incandescent bulb to exhibit visible flickering? Normally you don't perceive the 60 Hz flicker of an incandescent bulb, but occasionally I will see one, usually in an industrial setting, that is visibly flickering at a frequency far lower than 60 Hz.

Fluorescents with electronic ballasts (most modern ones and all compact ones) run at high frequencies. Traditional fluorescents run at 60Hz and the 60Hz flicker is painfully obvious (to some at least, like me).

For a flickering incandescent, who knows? In an industrial setting there could be fluctuations on the supply line due to heavy equipment. It could also be the filament physically moving due to low-frequency vibration from said equipment.
 
Originally posted by: ICRS
What is this lighting for if you don't mind me asking?

Also I wonder why MH (Metal Halide) lights aren't used to light homes and such; their efficiency rating is as high as or higher than any fluorescent's. I think they give a nicer light too.
MH is expensive. But there are other problems with MH as well.

Flicker - They suffer badly from 100Hz/120Hz flicker - worse than fluorescent.
Expensive - Lamps are expensive, and fittings are expensive too. The fitting has to be able to withstand catastrophic arc tube failure by containing high-velocity lumps of white-hot shrapnel.
Ballast requirements - Magnetic ballasts for MH are much bigger and heavier than those for fluorescent, so they are more costly and less convenient to site. They may also buzz more loudly.
Environmental - MH bulbs contain substantial amounts of mercury - tens of times as much as CFLs - as well as other ecological atrocities (such as thallium, caesium and radioisotopes of krypton).
Long warm-up and cool-down - Five minutes to come up to full brightness and a 15-minute restrike time make them totally unsuitable anywhere they need to be turned on and off regularly. They're great for offices and other public spaces, whereas in a home, where light may not be required all day, this really is a show stopper.

That said, their long life and high efficiency make them good for certain markets. Round here, all the street lights are progressively being changed from sodium (both low and high pressure) to MH. Ever been to a parking lot lit by low-pressure sodium? Trying to find my car was hilarious.

I light (some of) my home with MH. Personally, I use Iwasaki Color Arc Renaissance 6500K 150W lamps. The lovely cool color temperature and a color rendering index of 96 make them acceptable for color craft work (though still not perfect; reds are a bit undersaturated). I have one of these on my bedside table, connected to a timer. It's lovely waking up at 6 am on a northern winter morning and feeling as if you're lying on a Hawaiian beach in the mid-day sun.
 
Originally posted by: Mark R
Originally posted by: ICRS
What is this lighting for if you don't mind me asking?

Also I wonder why MH (Metal Halide) lights aren't used to light homes and such; their efficiency rating is as high as or higher than any fluorescent's. I think they give a nicer light too.
MH is expensive. But there are other problems with MH as well.

Flicker - They suffer badly from 100Hz/120Hz flicker - worse than fluorescent.
Expensive - Lamps are expensive, and fittings are expensive too. The fitting has to be able to withstand catastrophic arc tube failure by containing high-velocity lumps of white-hot shrapnel.
Ballast requirements - Magnetic ballasts for MH are much bigger and heavier than those for fluorescent, so they are more costly and less convenient to site. They may also buzz more loudly.
Environmental - MH bulbs contain substantial amounts of mercury - tens of times as much as CFLs - as well as other ecological atrocities (such as thallium, caesium and radioisotopes of krypton).
Long warm-up and cool-down - Five minutes to come up to full brightness and a 15-minute restrike time make them totally unsuitable anywhere they need to be turned on and off regularly. They're great for offices and other public spaces, whereas in a home, where light may not be required all day, this really is a show stopper.

That said, their long life and high efficiency make them good for certain markets. Round here, all the street lights are progressively being changed from sodium (both low and high pressure) to MH. Ever been to a parking lot lit by low-pressure sodium? Trying to find my car was hilarious.

I light (some of) my home with MH. Personally, I use Iwasaki Color Arc Renaissance 6500K 150W lamps. The lovely cool color temperature and a color rendering index of 96 make them acceptable for color craft work (though still not perfect; reds are a bit undersaturated). I have one of these on my bedside table, connected to a timer. It's lovely waking up at 6 am on a northern winter morning and feeling as if you're lying on a Hawaiian beach in the mid-day sun.
I like 5500K CFLs, but most other people seem to find them obnoxious... 😕

 
Originally posted by: Eli
Originally posted by: Mark R
Originally posted by: ICRS
What is this lighting for if you don't mind me asking?

Also I wonder why MH (Metal Halide) lights aren't used to light homes and such; their efficiency rating is as high as or higher than any fluorescent's. I think they give a nicer light too.
MH is expensive. But there are other problems with MH as well.

Flicker - They suffer badly from 100Hz/120Hz flicker - worse than fluorescent.
Expensive - Lamps are expensive, and fittings are expensive too. The fitting has to be able to withstand catastrophic arc tube failure by containing high-velocity lumps of white-hot shrapnel.
Ballast requirements - Magnetic ballasts for MH are much bigger and heavier than those for fluorescent, so they are more costly and less convenient to site. They may also buzz more loudly.
Environmental - MH bulbs contain substantial amounts of mercury - tens of times as much as CFLs - as well as other ecological atrocities (such as thallium, caesium and radioisotopes of krypton).
Long warm-up and cool-down - Five minutes to come up to full brightness and a 15-minute restrike time make them totally unsuitable anywhere they need to be turned on and off regularly. They're great for offices and other public spaces, whereas in a home, where light may not be required all day, this really is a show stopper.

That said, their long life and high efficiency make them good for certain markets. Round here, all the street lights are progressively being changed from sodium (both low and high pressure) to MH. Ever been to a parking lot lit by low-pressure sodium? Trying to find my car was hilarious.

I light (some of) my home with MH. Personally, I use Iwasaki Color Arc Renaissance 6500K 150W lamps. The lovely cool color temperature and a color rendering index of 96 make them acceptable for color craft work (though still not perfect; reds are a bit undersaturated). I have one of these on my bedside table, connected to a timer. It's lovely waking up at 6 am on a northern winter morning and feeling as if you're lying on a Hawaiian beach in the mid-day sun.
I like 5500K CFLs, but most other people seem to find them obnoxious... 😕

They remind me too much of working in the lab, which is not something I want to be reminded of when I am at home 😉
 
Originally posted by: Jeff7
Rubycon - are PWM drivers cheap to make? I built a small circuit based on a chip that Linear makes. They call it a "buck-boost" driver.

The circuit as I made it is able to convert a 3-18VDC input to 35.9VDC, and I can't see any flickering at all in the output.

I do see that "stroboscopic effect" in cars a lot though. Is that simply because lower-frequency drivers are cheaper, and thus more prevalent?



Always nice of you to drop by in these threads. 🙂

All a buck-boost does is pulse a switch on and off into an inductor. Pulsating current in an inductor leads to voltage spikes. Constant switching of current leads to the higher voltage (this is all governed by the equation of an inductor). This is how a switching power supply works. Yes, it is stable on the output end... until you put a load on it (power in equals power out).

PWM drivers are not cheaper, but they are more reliable. They are used in many voltage step-up applications, but that is not how they are used on LEDs. Basically, you use a pulse (you can create it with a 555 timer chip) and you control the high time with resistors and caps.
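For reference, the ideal transfer function of the buck-boost topology Jeff7 mentions is |Vout| = Vin * D / (1 - D), with D the switch duty cycle. A quick Python sketch of the lossless model; real parts lose a volt or two to switch and diode drops, so treat it as an upper bound:

```python
def ideal_buck_boost_vout(vin: float, duty: float) -> float:
    """Ideal buck-boost output magnitude for a given duty cycle.

    |Vout| = Vin * D / (1 - D): below 50% duty the converter
    steps the input down; above 50% it steps it up. The model
    assumes a lossless converter in continuous conduction.
    """
    if not 0.0 < duty < 1.0:
        raise ValueError("duty must be strictly between 0 and 1")
    return vin * duty / (1.0 - duty)

# Stepping a 12V input up to ~36V needs roughly 75% duty;
# 25% duty instead steps the same input down to 4V.
print(ideal_buck_boost_vout(12.0, 0.75), ideal_buck_boost_vout(12.0, 0.25))
```

This is why the output stays regulated over a wide input range: the controller just trims D as the input or load changes.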


 
Originally posted by: Special K
Originally posted by: Aluvus
Originally posted by: Special K

My laptop's power LED will cycle between fully lit and fully off when my laptop is in standby mode. The increase/decrease in brightness of the LED seems linear to me. Given the steep IV curve of diodes, how is it possible to make an LED that behaves this way? Are they using a high-frequency PWM here, and the duty cycle determines how bright the LED appears to be? Did they fabricate an LED with a linear IV curve?

This is normally done with PWM at a high enough frequency that you can't detect it. Here is a project that produces the same effect using PWM on a microcontroller. If you have a micro with good PWM functionality, it is a very simple trick. I've implemented it on PICs, where it was a piece of cake.

Apple has a patent on this, for laptop power indicators.

Interesting, I didn't realize you could patent something that trivial. Do you think Apple does it exactly as you described - with a PWM signal? My Dell laptop has what appears to be the same type of power indicator.


There are so many different ways to implement it. Apple probably wanted to cut costs, so they implemented a cheaper solution on a chip that was not supposed to be used that way.

 
Anyone know why 10,000+K lights aren't common for the home? I have some and they seem to give a nice bluish white light. Mine are 14000K.
 
Originally posted by: ICRS
Anyone know why 10,000+K lights aren't common for the home? I have some and they seem to give a nice bluish white light. Mine are 14000K.

Efficacy is too low (you're giving up a lot of usable lumens for those high CT's) AND if they're really that high the light is going to be really cold unless it's compensated with another source of red.

As far as flicker goes, the key thing to remember is that if the emitter's attack and decay times are very fast, the device's output can reflect subtle aberrations in the incoming power, making them noticeable. This can be buffered with a well-planned PWM design, however.

Driving an LED directly off AC will result in a flicker that's as noticeable as the flicker of a NE2 glow lamp. :Q
 
Originally posted by: Rubycon
Originally posted by: ICRS
Anyone know why 10,000+K lights aren't common for the home? I have some and they seem to give a nice bluish white light. Mine are 14000K.

Efficacy is too low (you're giving up a lot of usable lumens for those high CT's) AND if they're really that high the light is going to be really cold unless it's compensated with another source of red.

As far as flicker goes, the key thing to remember is that if the emitter's attack and decay times are very fast, the device's output can reflect subtle aberrations in the incoming power, making them noticeable. This can be buffered with a well-planned PWM design, however.

Driving an LED directly off AC will result in a flicker that's as noticeable as the flicker of a NE2 glow lamp. :Q

My 13000 lumens 14000K light seems bright to me.
 
I have various HID studio lights at 50W. The 4100K bulbs are a LOT brighter than the 7000K ones. The 7000K ones are very cold and make the reds look really lifeless. If the camera white balance is not adjusted correctly the pictures look incredibly cold.
 
Originally posted by: Rubycon
I have various HID studio lights at 50W. The 4100K bulbs are a LOT brighter than the 7000K ones. The 7000K ones are very cold and make the reds look really lifeless. If the camera white balance is not adjusted correctly the pictures look incredibly cold.

Ah, so even though my lights seem bright now, if they were 5000K instead of 14000K they would look even brighter.
 