gar
Senior Member
- Location
- Ann Arbor, Michigan
- Occupation
- EE
181124-0723 EST
codetalker:
You are correct that, if the manufacturer is to produce an electrically efficient small current source to drive the LED chips, then a higher-than-normal AC line frequency will be required.
Flicker has more than one meaning. One is periodic and another is random.
Periodic flicker occurs when a movie projector is operated at a low frame rate. I believe one movie standard was 24 frames per second, and this seemed to work well. But when television started, 30 frames per second was the logical value, synced to line frequency, and this produced too much flicker, so the interlacing concept was invented. Interlacing raised the field rate to 60 fields per second without increasing the bandwidth for the same resolution.
Random flicker is what one sees with a candle flame or a loose, intermittent electrical connection.
Both these types of flicker occur with some LED bulbs.
I am not sure that it is necessary to go to pure DC at the LED to eliminate flicker, for two reasons.
First, illumination-type LEDs use a short-wavelength LED (the LED itself is very fast) to excite phosphors that emit various colors across the visible spectrum. Phosphors have moderately long time constants compared to the exciting LED; they act as a low-pass filter. If the excitation pulse rate is high enough, then the phosphor itself should provide the filtering toward DC, and that should take care of periodic flicker. Further, the eye-brain combination is itself a low-pass filter.
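The low-pass idea above can be sketched numerically. This is only an illustrative model, not anything from a datasheet: the phosphor is treated as a simple first-order low-pass with an assumed 1 ms time constant (real phosphor decay times vary widely), driven by a 50% duty square pulse train, and the steady-state peak-to-peak ripple is compared at a line-related pulse rate versus a high switching rate.

```python
import math

def phosphor_ripple(f_pulse_hz, tau_s, cycles=200, steps_per_cycle=1000):
    """Simulate a first-order low-pass 'phosphor' driven by a 50% duty
    square pulse train (0 or 1) and return the steady-state peak-to-peak
    output ripple as a fraction of the mean output."""
    period = 1.0 / f_pulse_hz
    dt = period / steps_per_cycle
    y = 0.0
    last_cycle = []
    total_steps = cycles * steps_per_cycle
    for n in range(total_steps):
        t = n * dt
        x = 1.0 if (t % period) < period / 2.0 else 0.0
        # Euler step of dy/dt = (x - y) / tau
        y += (x - y) * dt / tau_s
        # Record only the final cycle, after settling.
        if n >= total_steps - steps_per_cycle:
            last_cycle.append(y)
    mean = sum(last_cycle) / len(last_cycle)
    return (max(last_cycle) - min(last_cycle)) / mean

tau = 1e-3  # assumed 1 ms phosphor time constant, for illustration only
print(phosphor_ripple(120.0, tau))     # 120 Hz rectified-line rate: large ripple
print(phosphor_ripple(20000.0, tau))   # 20 kHz switching rate: small ripple
```

With these assumed numbers the 120 Hz drive leaves the phosphor output swinging nearly full scale, while at 20 kHz the ripple drops to a few percent, which is the point: a fast enough pulse rate lets the phosphor do the DC filtering.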
Second, random flicker is probably the result of problems other than pulse rate, and going to pure DC does not solve it. Possibilities include a loose heat sink (a very long time constant, possibly minutes) or simply poor electrical design.
.