LED Amperage Drops with time?


TonyVision

Member
Location
Texas
Hello,
I've been a lurker for a long time. I have learned a bunch here. Thank you to everyone who contributes.

I am a stage lighting designer for corporate and rock'n'roll. In our world of temporary installations, there is so much wrong information that gets handed down. You all have helped make me and my crews so much smarter and safer.

Today I was testing the accuracy of a new Excel calculator I made for unbalanced loads at mixed voltages on a 120/208V portable power distro, and I noticed something funny.

I had an LED fixture whose specs call for a max of 150W @ 120V. When measured for wattage and VA, the fixture was using 120W/122VA on startup. In the course of two minutes, the watt usage dropped to 80W/83VA. After another 5 minutes, as the fixture heated up, the usage dropped to 50W/52VA.

Can anyone help explain why the wattage being used dropped?
 

LarryFine

Master Electrician Electric Contractor Richmond VA
Location
Henrico County, VA
Occupation
Electrical Contractor
My semi-educated guess is that LEDs have a temperature coefficient (I think that's the correct term, but I don't remember whether it's positive or negative) which allows greater current with temperature rise.

The power supply apparently compensates with current regulation, and the input power drawn actually decreases in response, which is something switching power supplies do. That's my best guess.
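
As a rough illustration of that guess, here's a sketch of how the input power would drop if the driver holds the current constant while the LED forward voltage falls with temperature. Every number in it (forward voltage, tempco, string size, drive current, efficiency) is made up for illustration, not taken from the fixture's datasheet:

```python
# Illustrative only -- the forward voltage, tempco, string size, drive current,
# and efficiency are assumptions, not values from this fixture's datasheet.

VF_25C = 3.0        # forward voltage per LED at a 25 C junction temp (V), assumed
TEMPCO = -0.004     # forward-voltage temperature coefficient (V per deg C), assumed
N_LEDS = 36         # LEDs in series, assumed
I_DRIVE = 1.0       # regulated (constant) drive current (A), assumed
EFFICIENCY = 0.90   # driver efficiency, input power to LED power, assumed

def input_power(junction_temp_c):
    """Watts drawn from the mains at a given junction temperature."""
    vf_per_led = VF_25C + TEMPCO * (junction_temp_c - 25.0)
    led_power = N_LEDS * vf_per_led * I_DRIVE   # watts delivered to the string
    return led_power / EFFICIENCY               # watts drawn at the wall

for temp in (25, 60, 100):
    print(f"Tj = {temp:3d} C -> input power ~ {input_power(temp):5.1f} W")
```

With typical tempco values this effect alone only amounts to a few percent, so it may not be the whole story behind a 120W-to-50W drop.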
 

TonyVision

Member
Location
Texas
Thank you, Larry and gar. I had an inkling this was related to temperature and the switching supply, but I just couldn't find the right direction to start looking for information on it.

I was really surprised to see such a large drop. It was screwing up my early readings and I was confused.

I set up a 120/208V distro with 12 identical LED fixtures spread evenly across the phases so I could measure amps on the three hots and the neutral of the feeder to a cam service:
6 @ 208V
6 @ 120V

I unbalanced the load in a dozen or so combinations and took readings, then entered the readings into the new calc I made to test its accuracy. It did pretty well. I'll make another thread about that after some more testing.
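
For anyone curious, here is a rough sketch of the phasor math a calc like that has to do. The fixture wattages and how they're split across the phases below are placeholders for illustration, not my actual readings; the point is that each 120V fixture loads one hot plus the neutral, while each 208V fixture loads two hots and cancels out of the neutral:

```python
# Rough sketch of the 120/208V feeder math -- example loads only, not my readings.
import cmath, math

V_LN = 120.0                                    # line-to-neutral voltage
ANGLES = {"A": 0.0, "B": -120.0, "C": 120.0}    # phase angles in degrees

def phase_voltage(ph):
    return cmath.rect(V_LN, math.radians(ANGLES[ph]))

line_currents = {"A": 0j, "B": 0j, "C": 0j}

def add_120v_load(phase, watts):
    """Line-to-neutral load: current phasor aligned with its phase voltage (PF = 1 assumed)."""
    v = phase_voltage(phase)
    i = (watts / abs(v)) * (v / abs(v))
    line_currents[phase] += i

def add_208v_load(ph1, ph2, watts):
    """Line-to-line load: current flows out on one hot and back on the other."""
    v_ll = phase_voltage(ph1) - phase_voltage(ph2)   # ~208 V phasor
    i = (watts / abs(v_ll)) * (v_ll / abs(v_ll))
    line_currents[ph1] += i
    line_currents[ph2] -= i

# Placeholder imbalance: 3 of the 120V fixtures on A, 2 on B, 1 on C at 50 W each.
for ph, count in (("A", 3), ("B", 2), ("C", 1)):
    for _ in range(count):
        add_120v_load(ph, 50)

# Placeholder 208V fixtures: two each across A-B, B-C, and C-A at 50 W each.
for pair in (("A", "B"), ("B", "C"), ("C", "A")):
    for _ in range(2):
        add_208v_load(*pair, 50)

neutral = -(line_currents["A"] + line_currents["B"] + line_currents["C"])
for ph, i in line_currents.items():
    print(f"I_{ph} = {abs(i):5.2f} A")
print(f"I_N = {abs(neutral):5.2f} A")
```

Using complex phasors means the hot and neutral currents fall out of a straight vector sum, instead of juggling the textbook square-root formula for each case.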
 

synchro

Senior Member
Location
Chicago, IL
Occupation
EE
One possibility is that the LED driver is reducing its output current to keep the LED semiconductor junctions from getting too hot and reducing their lifetime:

https://www.digikey.com/en/articles...l-led-temperature-to-solve-ssl-thermal-issues

Was the light fixture measured under conditions where the LEDs may have gotten hotter than normal (e.g., hot ambient temperature, fixture not oriented as intended during usage, restricted air circulation, etc.)?
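
To give a feel for what that kind of thermal foldback does, here's a rough sketch. The threshold temperature, shutdown temperature, and derating slope below are assumptions for illustration, not values from the linked article or any particular driver:

```python
# Illustrative thermal-foldback curve -- all thresholds and slopes are assumptions,
# not specs for the fixture in question.

T_FOLDBACK = 70.0      # junction temp where derating starts (deg C), assumed
T_SHUTDOWN = 110.0     # junction temp where output is cut entirely (deg C), assumed
I_FULL = 1.0           # programmed full output current (A), assumed

def drive_current(junction_temp_c):
    """Output current after linear derating between T_FOLDBACK and T_SHUTDOWN."""
    if junction_temp_c <= T_FOLDBACK:
        return I_FULL
    if junction_temp_c >= T_SHUTDOWN:
        return 0.0
    frac = (T_SHUTDOWN - junction_temp_c) / (T_SHUTDOWN - T_FOLDBACK)
    return I_FULL * frac

for temp in (25, 70, 85, 100, 110):
    print(f"Tj = {temp:3d} C -> drive current ~ {drive_current(temp):.2f} A")
```

Above the foldback point the programmed current gets rolled back, so a meter on the input would see the wattage sag over the first several minutes as the fixture comes up to temperature.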
 

SceneryDriver

Senior Member
Location
NJ
Occupation
Electrical and Automation Designer
An LED's forward voltage drop decreases with increasing junction temperature. The driver is indeed reducing drive current to the LED strings (by lowering the voltage) as the LEDs heat up. If the driver weren't constant-current, the LED strings would pull more and more current as they heated up, until they let the magic smoke out. Simple circuits just use a resistor in series with the LED to limit current; fancier designs (like your fixtures) use current-limiting driver circuitry to avoid the power wasted in those resistors as dissipated heat.
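
To put rough numbers on that resistor-vs-driver comparison, here's a sketch. The supply voltage, string size, forward voltage, tempco, and resistor value are all assumed for illustration, not from any real fixture: with only a series resistor the current creeps up as the forward voltage falls with heat, while a constant-current driver holds the current and lets the delivered power fall instead.

```python
# Comparison sketch -- supply voltage, string size, Vf, tempco, and resistor value
# are assumptions chosen so the resistor-only string draws ~1 A when cool.

V_SUPPLY = 48.0     # DC rail feeding the string (V), assumed
N_LEDS = 12         # LEDs in series, assumed
VF_25C = 3.2        # forward voltage per LED at 25 C (V), assumed
TEMPCO = -0.004     # forward-voltage tempco (V per deg C), assumed
R_SERIES = 9.6      # ballast resistor sized for ~1 A at 25 C (ohms), assumed
I_REGULATED = 1.0   # what a constant-current driver would hold (A), assumed

def vf_string(temp_c):
    """Total forward voltage of the series string at a given junction temperature."""
    return N_LEDS * (VF_25C + TEMPCO * (temp_c - 25.0))

for temp in (25, 60, 100):
    i_resistor = (V_SUPPLY - vf_string(temp)) / R_SERIES   # rises as Vf falls
    p_resistor = V_SUPPLY * i_resistor                      # total power, LEDs plus resistor
    p_regulated = vf_string(temp) * I_REGULATED             # falls as Vf falls
    print(f"Tj = {temp:3d} C: resistor-limited I ~ {i_resistor:.2f} A "
          f"(P ~ {p_resistor:.1f} W), constant-current P ~ {p_regulated:.1f} W")
```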

I've had several instances on outdoor gigs where LED fixtures pulled more current than expected because they were used in cold conditions - holiday displays and the like in winter. It appeared that the drivers in the fixtures hit their current limit and happily chugged along drawing max current without it dropping as the fixtures warmed up; it was -5°F that day, so the lights were happy. The stagehands, on the other hand, were not.


 