My statement made me curious whether, due to increased losses, the system then has to supply additional power.
For instance, let's say we had a 100W light at 100V. The current in the line would then be 1A. But let's say that, due to the impedance in the line, we had a 10W loss through I^2R losses. So does this mean the light only receives 90W at its input, or does it increase its current to draw the rated 100W and thus cause the system to supply more than 100W, covering both the losses and the light?
The light, as a linear load, would see only the voltage across it, reducing the light output.
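To put rough numbers on that, here's a minimal sketch treating the bulb as a fixed resistance (an assumption; a real filament's resistance rises with temperature, so the exact figures shift, but the direction is the same):

```python
# Bulb rated 100 W at 100 V, modeled as a fixed linear resistance.
V_SUPPLY = 100.0                  # source voltage, volts
R_BULB = 100.0**2 / 100.0         # R = V^2 / P = 100 ohms
R_LINE = 10.0                     # line resistance giving 10 W loss at 1 A

# Series circuit: the bulb and the line share one current.
i = V_SUPPLY / (R_BULB + R_LINE)  # ~0.909 A, less than the rated 1 A

p_bulb = i**2 * R_BULB            # ~82.6 W delivered to the bulb
p_line = i**2 * R_LINE            # ~8.3 W lost in the line
p_total = p_bulb + p_line         # ~90.9 W supplied in total

print(f"I = {i:.3f} A, bulb = {p_bulb:.1f} W, "
      f"line = {p_line:.1f} W, total = {p_total:.1f} W")
```

So the bulb neither receives a clean 90W nor pulls extra current to make up the difference; the current and the light output both drop.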
That would only occur if the supply voltage was increased as compensation. That also means the terminal voltage would rise above nominal if the load was decreased.
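A quick sketch of that compensated case, assuming the source is raised until the bulb terminals sit at the rated 100V (idealized regulation, same bulb and line as above):

```python
# Source raised so the bulb terminals see the full rated 100 V.
R_BULB = 100.0
R_LINE = 10.0
V_BULB = 100.0                   # regulated terminal voltage (assumed)

i = V_BULB / R_BULB              # 1.0 A, the rated current
v_supply = V_BULB + i * R_LINE   # 110 V required at the source
p_bulb = V_BULB * i              # 100 W in the bulb
p_line = i**2 * R_LINE           # 10 W lost in the line

print(f"Source must supply {v_supply:.0f} V and {p_bulb + p_line:.0f} W")
```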
Yes, the system will have to provide any losses in addition to the load requirements, all else being equal.
The light bulb will still draw 100W (for the same voltage at its terminals), and the wire losses of 10W will be supplied on top of that, so that the total power delivered is 110W.
I disagree.
How can adding resistance increase the power? Please show me the math. If what you were saying were correct, putting two 100-watt bulbs in series would give 200 watts of power. (Remember, the conductors are in series with the load, not parallel; that's an important difference.)
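Here's the math on the series-bulb example, under the same fixed-resistance assumption as above:

```python
# Two identical 100 W / 100 V bulbs in series across a 100 V source.
# Adding series resistance cuts the current, so total power falls.
V_SUPPLY = 100.0
R_BULB = 100.0               # each bulb: V^2 / P = 100 ohms

i = V_SUPPLY / (2 * R_BULB)  # 0.5 A through both bulbs
p_each = i**2 * R_BULB       # 25 W per bulb
p_total = 2 * p_each         # 50 W total, not 200 W

print(f"I = {i:.2f} A, each bulb = {p_each:.0f} W, total = {p_total:.0f} W")
```

Doubling the series resistance halves the current, so the total power is cut to 50 watts rather than doubled.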
If the system is designed properly (i.e., to the NEC), these losses won't add up to anything significant.
But who owns the transformer? For the most part, the transformer losses are absorbed by the PoCo, because you are metered off of the secondary side. That's partially why they assess PF penalties on large users: they have to size the transformer to supply all of the kVA, but they can only bill you for the kW you use from it, and since the losses in the transformer are based on the kVA, they eat them too.

However, I would say there is some potential cost savings from the reduction in transformer losses by shifting the lighting and motor loads to the 480/277V side.
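As a back-of-the-envelope illustration of that billing mismatch (all numbers hypothetical, and the loss model is the simple one where transformer load loss scales with the square of the kVA loading):

```python
# Hypothetical plant: the utility bills the kW, but must size the
# transformer (and eat its load losses) based on the kVA.
p_kw = 800.0             # billed real power, kW (hypothetical)
pf = 0.8                 # lagging power factor (hypothetical)

s_kva = p_kw / pf        # 1000 kVA the transformer must carry

# Simple copper-loss model: load loss scales with (loading)^2.
RATED_KVA = 1000.0
FULL_LOAD_LOSS_KW = 10.0  # assumed load loss at rated kVA

loss_kw = FULL_LOAD_LOSS_KW * (s_kva / RATED_KVA) ** 2

print(f"Billed: {p_kw:.0f} kW; transformer carries {s_kva:.0f} kVA; "
      f"load loss ~{loss_kw:.1f} kW")
```

At unity PF the same 800 kW would be only 800 kVA, and this model puts the load loss near 6.4 kW instead of 10 kW, which is the loss the PoCo eats when the PF is poor.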
I can't see even the transformer losses being worth it, i.e., switching from 120V lighting to 277V lighting, or replacing a 230V motor with a 480V motor.
Oh wait, I see what you mean. If you have 480/277 and you have a transformer to give you 120V, YOU own those transformer losses. Point taken.
Still, is the ROI there to justify rewiring everything and replacing all the ballasts? If it were a new installation, the point would be more valid, because the initial cost is probably about equal. But the OP was about retrofitting.