iwire said:
I understood your point, I just don't agree. :smile:
IMO there is no instance where using larger supply conductors will not result in less energy usage, regardless of the load. I agree that different loads will result in a greater or lesser percentage of savings, but it will never swing the other way, with larger conductors resulting in higher energy costs.
Well, allow me (as he cracks his knuckles):
Let's take a hypothetical 1200 watt resistive load. With a 120v rating, the resistance should be 12 ohms, and the current 10a.
Now let's introduce 2 ohms of series supply resistance. The total circuit resistance becomes 14 ohms, the current is roughly 8.57a, and the total power drawn is about 1029w.
The intended load receives about 102.9v, and the line voltage drop is about 17.1v. The load's power is roughly 882w, and the line's power loss is roughly 147w.
(Figures are rounded.)
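For anyone who wants to check the arithmetic, here's a quick Python sketch of the voltage-divider math above (the 120v source, 12-ohm load, and 2-ohm line are the same figures from the example):

```python
# Voltage-divider check for the example above:
# 120 V source, 12-ohm load (1200 W nominal), 2 ohms of supply-wire resistance.

V_SOURCE = 120.0   # volts
R_LOAD = 12.0      # ohms (1200 W @ 120 V)
R_LINE = 2.0       # ohms of series supply resistance

current = V_SOURCE / (R_LOAD + R_LINE)   # ~8.57 A
v_load = current * R_LOAD                # ~102.9 V at the load
v_drop = current * R_LINE                # ~17.1 V lost in the wire
p_load = current ** 2 * R_LOAD           # ~882 W delivered to the load
p_line = current ** 2 * R_LINE           # ~147 W heating the wire
p_total = p_load + p_line                # ~1029 W seen by the meter

print(f"current:      {current:6.2f} A")
print(f"load voltage: {v_load:6.1f} V   line drop: {v_drop:5.1f} V")
print(f"load power:   {p_load:6.0f} W   line loss: {p_line:5.0f} W")
print(f"metered:      {p_total:6.0f} W  (vs. 1200 W with no line resistance)")
```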
The electric meter will run slower with the resistive supply, so, with nothing done to compensate for the reduced power, the cost per hour is lower with the poorer supply system.
If this were a heating load, minimizing voltage drop would definitely save money, because the heater would have to run longer in order to deliver the same heating energy.
The loss would be twofold: the heating element receives less voltage and has to run longer to deliver a given number of kWh, and the energy lost in the line can never be recovered.
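To put rough dollars on the heating case, here's a sketch; the $0.15/kWh rate and the 10 kWh of heat called for are my own illustrative assumptions, not figures from this thread:

```python
# Heating case: the thermostat calls for a fixed amount of heat (kWh delivered
# to the element), so the weaker supply runs longer and also pays for the
# line loss. The rate and heat target below are illustrative assumptions.

RATE = 0.15          # $/kWh (assumed)
HEAT_NEEDED = 10.0   # kWh of heat the thermostat calls for (assumed)

def cost_to_deliver(heat_kwh, p_load_w, p_line_w, rate):
    """Runtime (hours) and metered cost ($) to deliver heat_kwh to the load."""
    hours = heat_kwh / (p_load_w / 1000.0)                # lower load power -> longer runtime
    metered_kwh = hours * (p_load_w + p_line_w) / 1000.0  # the meter bills load + line loss
    return hours, metered_kwh * rate

# Heavy wire (negligible line resistance): the full 1200w reaches the element.
h_big, cost_big = cost_to_deliver(HEAT_NEEDED, 1200.0, 0.0, RATE)

# 2 ohms of line resistance: ~882w reaches the element, ~147w heats the wire.
h_small, cost_small = cost_to_deliver(HEAT_NEEDED, 882.0, 147.0, RATE)

print(f"heavy wire: {h_big:5.2f} h of runtime, ${cost_big:.2f}")
print(f"small wire: {h_small:5.2f} h of runtime, ${cost_small:.2f}")
```

Either way the same 10 kWh of heat gets delivered; the difference is the extra runtime plus the ~147w riding along in the wire the whole time.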
However, if this were a lighting load, the lights would be dimmer, but unless higher-wattage bulbs are used to compensate for the reduced voltage (which would exacerbate the problem and draw even more current), the power consumption would drop.
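Same kind of sketch for the lighting case, where the lamps simply run for a fixed number of hours at whatever voltage they get (again, the $0.15/kWh rate and 5 hours of use are assumptions for illustration):

```python
# Lighting case: the lamps run for a fixed time regardless of voltage, so the
# bill is just metered power x hours. Rate and hours are illustrative assumptions.

RATE = 0.15    # $/kWh (assumed)
HOURS = 5.0    # hours the lights are on (assumed)

cost_heavy_wire = 1.200 * HOURS * RATE   # 1200w metered with negligible line drop
cost_small_wire = 1.029 * HOURS * RATE   # ~1029w metered with 2 ohms in the line

print(f"heavy wire: ${cost_heavy_wire:.2f} for {HOURS:.0f} hours (full brightness)")
print(f"small wire: ${cost_small_wire:.2f} for {HOURS:.0f} hours (dimmer light)")
```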
Result: larger wire = higher cost.
That said, I will still be using 14 AWG. :smile:
Same here.