Benefits of switching lighting from 120V to 277V


VoltageHz (Member, NJ)
A while back a coworker and I were on a large job in which one of the tasks was to replace about 1,000 high hats. We replaced the old 120V incandescent lights with new HID lights with smart ballasts; the idea was to keep running them at 120V until all the lights were finished, then switch all the lighting circuits over to 277V at the end.

I was responsible for changing the circuits over. I did so in the contactor panels by pulling out the old wiring coming from the 120/208V panels and replacing it with wiring from the 277/480V panels. I did not change anything else; the lamps in all the lights stayed the same, of course.

My coworker's assessment was that switching the lights over to the higher voltage would lower the electricity bill because somehow it would use less power. After further investigation he now believes it will use the same amount of power, but that somehow the power factor will change, and he still stands by his assumption of a lower electricity bill.

My assumption is that the costs will stay the same; the only real benefit is the ability to put more devices on the same circuit. However, since the circuits had already been run (the old wiring was reused), the only benefit left was to free up many panels. Other than that, I'm not seeing any real benefit to changing those lights over from 120V to 277V.

So what are the benefits, why did we change over?
 
Watts is watts, so to a crude approximation the power costs will be the same.

However:
1) The ballasts themselves may be more or less efficient at 277V.
2) The reduced current and higher voltage means lower losses in the wiring.
3) Possibly other factors.

-Jon
 
That's what I originally thought.

1) The smart ballasts had the current draw written on them for each voltage. When I multiplied that current draw times the voltage for both 120V and 277V, it came out to about the same number of watts. So I believe the difference in efficiency at either voltage is negligible. Would that be a correct assumption?

2) In my scenario we used the old wiring when we switched from incan to the HIDs. The new HIDs use MUCH less power. When I amp probed the circuits with the old incan lights they were averaging 14 amps. After installing the new HID lights (still at 120V) they were averaging about 2 amps. When I finally switched it over to 277V I put about 10 old circuits on each of the 277V circuits. Knowing that, I'd assume that the loss issues are also negligible, correct?
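The nameplate check in point 1 can be sketched in a few lines. The current values below are made up for illustration, not taken from an actual ballast label:

```python
# Nameplate sanity check: V * A at each voltage should give about the same watts.
# The currents below are hypothetical, not copied from a real ballast label.
def watts(volts, amps):
    return volts * amps

p_120 = watts(120, 0.90)    # hypothetical 120 V nameplate draw
p_277 = watts(277, 0.39)    # hypothetical 277 V nameplate draw

print(p_120, p_277)         # both come out near 108 W
```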
 
1) The smart ballasts had the current draw written on them for each voltage. When I multiplied that current draw times the voltage for both 120V and 277V, it came out to about the same number of watts. So I believe the difference in efficiency at either voltage is negligible. Would that be a correct assumption?

That is probably correct. I could still imagine ways that small differences could creep in (different power factor, different proportions of loss in ballast versus power delivered to the lamp, etc), but I'd go with 'same efficiency' unless something indicated a difference.

2) In my scenario we used the old wiring when we switched from incan to the HIDs. The new HIDs use MUCH less power. When I amp probed the circuits with the old incan lights they were averaging 14 amps. After installing the new HID lights (still at 120V) they were averaging about 2 amps. When I finally switched it over to 277V I put about 10 old circuits on each of the 277V circuits. Knowing that, I'd assume that the loss issues are also negligible, correct?

Did you actually pull out circuit conductors so that you had more lamps on a single conductor, or did you combine circuits at the contactor/breakers? If your old wiring (originally carrying 14A at 120V) is still being used (now carrying 1A at 277V) then you will be seeing much less loss in the circuit conductors.

-Jon
 
I combined the circuits inside of the contactors. Basically, each circuit I ran out of the 277V panel was jumped across the line side of 10 contactor poles.

I see what you're saying about the loss.
 
Probably a good move.
Losses in the wire are not as significant as you might think, but they do add up when you have 1,000 fixtures. The only time it usually makes much of a difference is if the wiring is undersized to the point where it causes a severe voltage drop (VD). When that happens, the wire heats up and the circuit resistance increases. By increasing the voltage and dropping the current, you are no longer seeing that VD, and the additional losses go away. The VD is essentially an added parasitic resistance load.

However, normal wire-resistance losses are related only to the load at any given point on the string. Since you know what the current draw was and is now on each fixture, you can calculate normal losses based on the circuit resistance at any given fixture on each string. Let's take a for-instance on one fixture.

Assuming 12 ga wire, 1,000 feet to the ballast farthest away from the controller:
12 ga ≈ 0.00187 ohms per foot, so 1.87 ohms total circuit resistance. P = I² × R, so the power dissipated at 120V and 2A is 7.48W in total circuit losses. At 277V and 1A, the losses become 1.87W. So although it looks like a large percentage, it is a very small number: you have saved 5.61W for that farthest fixture. On that fixture, assuming 15 cents per kWh and running 10 hrs/day, 5 days/week, 52 weeks/year, you saved $2.19 per year in wire-resistance losses. Keep in mind that would be the BEST-performing savings of all of them; each fixture closer to the source saves less. If the first one is, say, 25 ft from the contactor, you save roughly 5 cents per year. To figure the total savings you would need the run lengths of each fixture in each of the 10 strings, but suffice it to say it only adds up because you have so many fixtures. I'd guess just under $1,000 per year. Not bad for what was probably a couple of hours to change the circuit?
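The worked numbers above drop straight into a short script, using the post's assumptions (12 AWG at 0.00187 Ω/ft, a 1,000 ft run, $0.15/kWh):

```python
# I^2 * R wire-loss comparison for the farthest fixture, per the numbers above.
OHMS_PER_FT = 0.00187            # 12 AWG copper (the post's approximate figure)
RUN_FT = 1000                    # contactor to farthest ballast

r = OHMS_PER_FT * RUN_FT         # 1.87 ohms circuit resistance

loss_120 = 2.0 ** 2 * r          # 2 A at 120 V -> 7.48 W
loss_277 = 1.0 ** 2 * r          # 1 A at 277 V -> 1.87 W
saved_w = loss_120 - loss_277    # 5.61 W saved

hours_per_year = 10 * 5 * 52     # 10 h/day, 5 days/wk, 52 wk/yr
dollars = saved_w / 1000 * hours_per_year * 0.15   # at $0.15/kWh
print(round(dollars, 2))         # -> 2.19
```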

The other savings you have is in transformer losses. Since you apparently have 480Y277 as a service drop, you must have a 480-208Y120 transformer somewhere. By now tapping directly off that 480Y system, you have eliminated the losses in that 208Y120 transformer, as well as the slight circuit losses feeding it. Bonus!

By the way, power factor would be the same, watts the same, efficiency the same, the only savings is going to be in that circuit resistance as mentioned.
 
I would have inspected every inch of that wiring and opened up every JB to make sure nothing else was tied into it, that someone didn't steal a neutral, etc.
 
In what way is 277V more efficient?
Less power will be expended as voltage drop in the conductors, but that wouldn't matter unless higher-power lighting were used to compensate.

In something like electric heating, that actually could make a difference, because the heat would run a bit longer (unless the wires were in the heated space.)
 
Converting an existing installation from 120 volts to 277 volts should be slightly more efficient, but only slightly.
The existing wiring has been reused and is therefore now carrying a much lower current than was originally the case. This will substantially reduce the losses in the wiring, BUT if the original wiring was correctly specced, those losses were small to begin with.
As an example, presume that the voltage drop in the original wiring was 3 volts (2.5%). When the wiring is reused at 277 volts, the reduced current will result in a voltage drop of about 1.3 volts, which is under 0.5%. That suggests a saving of about 2% on the power bill; not much, but worth having on a large installation.
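A quick sketch of that estimate, assuming the same wattage so the current scales down with the voltage ratio:

```python
# Voltage-drop comparison when 120 V wiring is reused at 277 V, same wattage.
vd_120 = 3.0                        # assumed original drop (2.5% of 120 V)
current_ratio = 120 / 277           # same watts -> proportionally less current
vd_277 = vd_120 * current_ratio     # about 1.3 V

pct_120 = vd_120 / 120 * 100        # 2.5 %
pct_277 = vd_277 / 277 * 100        # about 0.47 %
print(round(pct_120 - pct_277, 1))  # -> 2.0 (rough % of the bill recovered)
```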

I would also expect the ballasts to be slightly more efficient on 277 volts than 120 volts. All electronic ballasts use DC internally, with the supply being rectified. The losses in the rectifier are the product of the supply current and a fixed voltage drop in the diodes, and are therefore much lower at 277 volts than at 120 volts. I would expect a saving of about 1 to 1.5%.
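A rough illustration of the rectifier-loss argument, assuming a hypothetical 100W ballast and about 1.4V total forward drop across the two conducting bridge diodes (both figures are assumptions, not measurements):

```python
# Rough bridge-rectifier loss at each supply voltage.
# P_FIXTURE and V_DIODES are illustrative assumptions, not measured values.
P_FIXTURE = 100.0   # hypothetical ballast input, watts
V_DIODES = 1.4      # approx. total forward drop, two conducting diodes

losses = {v: V_DIODES * (P_FIXTURE / v) for v in (120, 277)}

for v, loss in losses.items():
    print(v, round(loss, 2), "W,", round(loss / P_FIXTURE * 100, 1), "% of input")
# 120 V: ~1.17 W (1.2 %); 277 V: ~0.51 W (0.5 %)
```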

Also it is probable that the building has a 277/480 volt service with 120/208 or 120/240 being obtained from a transformer, in which case running the lighting at 277 volts will avoid transformer losses of a few % in addition.

In total a saving of about 5% might be achieved; not worth the trouble for a few lamps, but worthwhile for 1,000.

Freeing up transformer capacity and breakers in the 120 volt boards would be incidental advantages.
 
Sometimes there's also a very small benefit of saving 1 or 2 watts per ballast by operating at 277V instead of 120V, but this is dependent on the ballast and lamp combination. One combination from one manufacturer may show a 1-watt savings while another manufacturer's doesn't. Here's an example of one ballast; it shows a small savings for some lamps when operated at 277V.
http://www.unvlt.com/productLines/ap_sheets/ELFB/B232PUNVHP-A.pdf

If you calculated the VA from the fixture and it was the same at each voltage, then I think the only savings you would have are the reduced wire losses, as others stated, and freeing up some 120V circuit breakers.

Just to give perspective, a 1W savings for each of 1,000 fixtures operated 10 hours a day for 250 days a year at $0.08/kWh would save the building owner $200 per year.
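The arithmetic behind that $200/year figure:

```python
# The $200/year figure: 1 W saved per fixture across 1,000 fixtures.
fixtures = 1000
watts_saved = 1.0                # per fixture
hours_per_year = 10 * 250        # 10 h/day, 250 days/yr
rate = 0.08                      # $/kWh

kwh = fixtures * watts_saved * hours_per_year / 1000   # 2500 kWh/yr
print(round(kwh * rate, 2))      # -> 200.0
```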
 