Quote Originally Posted by jsmalone1 View Post
To keep it simple, I'm 5 years in on many LED installs, 3 years in with my homes and many other customers & all my friends now that the price point is reasonable. I had 1 (one) LED light fixture fail at install and that's it. I believe everyone is entitled to their opinion and so am I. In my opinion 60-80% energy savings is not snake oil, that's what my meter consistently shows. Or better, for the negative comments I have attracted here, it's math--60 watts or 9 watts? 85%? I no longer repair or replace any fluorescent, HID, incandescent fixtures or bulbs, etc. I have 60 watt LED bulbs approaching 15,000 hours of use and not a single failure.
The power going in is only one side of the story, and equivalency is not applied consistently. A 400W MH lamp is around 42,000 lumens when new, while a GE 165W LED retrofit for 400W MH is listed at 20,000 initial lumens. That 20,000 lm figure is probably a reasonable estimate of what the MH puts out moments before it fails.
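To put rough numbers on that mismatch, here is a minimal sketch. The 42,000 lm and 20,000 lm figures are from the post; the 0.65 mean-lumen factor for MH is an assumed ballpark, not a figure from the post:

```python
# Comparing a 400W metal halide to a 165W LED retrofit on more than initial lumens.
mh_initial_lm = 42_000    # 400W MH when new (from the post)
led_initial_lm = 20_000   # GE 165W LED retrofit, initial lumens (from the post)

# Assumed: MH mean lumens run roughly 65% of initial over life (ballpark only).
mh_mean_lm = mh_initial_lm * 0.65

print(f"MH mean lumens over life: {mh_mean_lm:,.0f} lm")   # 27,300 lm
print(f"LED initial lumens:       {led_initial_lm:,} lm")
print(f"LED initial vs MH initial: {led_initial_lm / mh_initial_lm:.0%}")  # 48%
```

Even against an assumed mid-life MH output, the retrofit starts well behind; it only looks equivalent next to an MH lamp at the end of its life.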

The theory behind L70 is that a 30% loss is roughly the point where we start to notice the drop. Today's LED lamps are rated at 10,000 to 15,000 hours to 30% decay, but inappropriate use will cause light-level loss to accumulate with each generation of retrofit.

A 100W legacy light bulb is 1,600 lumens. Comparing to this is a moot point, since incandescent lamps are only rated for 750 to 1,000 hours and the traditional wattages are no longer produced. LED lamps with 1,500 to 1,600 initial lumens are sold as 100W equivalent. After 30% degradation, output is closer to 1,100 lm, which is equivalent to the presently legal 53W halogen. Filament lamps require frequent replacement, but their light output loss is negligible, so 1,100 lumens is fairly well maintained until failure. The 53W halogen is not really equivalent to a 100W 1,600 lm lamp, but it is equivalent to a 1,600 lm LED at end of life. An LLD of 0.70 is the current IESNA design recommendation for LED, and for a 10,000 hr lamp that burns thousands of hours a year I strongly agree. Personally I would split it down the middle for a lamp with an L70 of 50,000 hours, but the IES recommendation is still to use 0.70.
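The arithmetic above can be sketched directly; all figures here are from the post, applied with the LLD 0.70 design factor:

```python
# Applying the IESNA LLD 0.70 design factor to a "100W equivalent" LED lamp.
led_initial_lm = 1_600          # sold as 100W equivalent (100W incandescent ~ 1,600 lm)
lld = 0.70                      # IESNA recommended light loss factor for LED

maintained_lm = led_initial_lm * lld
print(f"Maintained output: {maintained_lm:,.0f} lm")   # 1,120 lm, near a 53W halogen's ~1,100 lm

# To actually deliver 1,600 lm at end of life, the retrofit
# would need to be sized for a higher initial output:
required_initial_lm = 1_600 / lld
print(f"Required initial:  {required_initial_lm:,.0f} lm")  # 2,286 lm
```

In other words, a 1,600 lm LED designed with LLD 0.70 should be treated as a 53W-halogen replacement, not a 100W replacement.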

The 53W halogen yields almost half the saving in power, and the 30% reduction in output will only be perceived as about a 17% drop. Evaluating retrofit equivalency subjectively on the day of installation, matching the head of the LED curve to the tail of the HID curve in a bid to sell watt cuts, will result in an accumulation of lumen loss.
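The "30% measured, roughly 17% perceived" figure is consistent with the common square-root rule of thumb for perceived brightness. The post does not name the model, so the formula here is an assumption:

```python
# Square-root rule of thumb: perceived brightness ~ sqrt(measured output).
measured_fraction = 0.70                     # at L70, 70% of initial output remains
perceived_fraction = measured_fraction ** 0.5

print(f"Perceived output: {perceived_fraction:.0%}")       # 84%
print(f"Perceived drop:   {1 - perceived_fraction:.0%}")   # 16%, close to the post's ~17%
```

This is why a degraded installation can pass a casual eyeball check even though the meter shows it well below its rated output.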

My justification for matching the terminal lumens of LEDs to the initial lumens of halogen is that halogen has negligible light loss as well as negligible ambient-related variation. I'm in no way arguing these halogens are anywhere near as efficacious as LEDs, but you can see what's going on here: LEDs are often pushed way too aggressively to cut initial cost and wattage and crank out ROI on paper, without producing a comparable replacement.
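The two sizing approaches can be sketched side by side. The lamp figures are from the post; the selection logic itself is illustrative:

```python
# Sizing an LED retrofit against a 53W halogen whose output is essentially
# flat (about 1,100 lm) until failure.
halogen_lm = 1_100        # halogen output, roughly constant over its life
lld = 0.70                # LED depreciation factor to L70

# Aggressive sizing: match initial lumens (equivalent on day one only).
aggressive_initial_lm = halogen_lm
# Conservative sizing: match terminal lumens (still equivalent at L70).
conservative_initial_lm = halogen_lm / lld

print(f"Day-one match:  {aggressive_initial_lm:,} lm initial, "
      f"{aggressive_initial_lm * lld:,.0f} lm at L70")       # 770 lm at end of life
print(f"Terminal match: {conservative_initial_lm:,.0f} lm initial, "
      f"{conservative_initial_lm * lld:,.0f} lm at L70")     # 1,100 lm maintained
```

The day-one match is the version that looks good in a bid; the terminal match is the one that is still a real replacement years later.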

Quote Originally Posted by jsmalone1 View Post
Plus there is so much more: no apparent cycling effects, a huge operating-temperature range with no change in light output, and of course a very long life span. BUT, I have had many people insist on fluorescent, and I just tell them I'm sure they are on sale, good luck. Or even better, ask my wife and all my friends' wives. They all love LED and have all asked why we didn't do this sooner: price.
LEDs are quite tolerant on the lower end of the temperature range, although HIDs and incandescents are also ambient-independent. HID warm-up time is either a show killer or irrelevant. Integral-ballast LEDs and CFLs both suffer loss of useful life at high operating temperatures.

Quote Originally Posted by winnie View Post
One of the key points which electric-light brings up is that LEDs are often designed in ways where they will get dimmer over time (still using the same power) without outright failing.
Which is a trait also held by mercury vapor lamps; they were phased out in favor of metal halide lamps that do burn out. The general practice of matching on initial lumens and presuming the same performance lasts until the "L70" value needs to stop. It is not appropriate except when comparing to MH, and even then values more reasonable than end-of-life versus initial should be used.