zazmat said:
Larry,
You need to look at power supplied not delivered. The delivered power is less as a result of heat loss.
No argument. However, in order to determine the amount of power lost to heat, as well as that consumed by (i.e., delivered to) the load, you first have to calculate the total kVA supplied. That requires knowing the source voltage and the entire circuit impedance.
The load's rated kVA consumption assumes that the design voltage is actually being delivered to its terminals. The best we can do is calculate the load's impedance from the nameplate data, then add to it the entire service-, feeder-, and branch-circuit conductor impedances.
That total impedance, along with the known or measured source voltage, gives us the actual current, with which we can calculate the voltage across each segment of the circuit, and thus the power lost to heat as well as the power ultimately delivered to the load.
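The arithmetic described above can be sketched in a few lines. This is only an illustration with made-up numbers: a hypothetical 240 V, 4.8 kVA resistive load (unity power factor assumed, so impedance is pure resistance) fed through conductors with an assumed total round-trip resistance of 0.5 ohms.

```python
V_source = 240.0       # known/measured source voltage (V)
S_nameplate = 4800.0   # load nameplate rating (VA) -- hypothetical value
V_rated = 240.0        # load design (nameplate) voltage (V)

# Load impedance from nameplate data (resistive load assumed): Z = V^2 / S
Z_load = V_rated ** 2 / S_nameplate        # 12.0 ohms

# Assumed total conductor impedance: service + feeder + branch, round trip
Z_wire = 0.5                               # ohms -- hypothetical value

# Actual current from source voltage and total circuit impedance
I = V_source / (Z_load + Z_wire)           # 19.2 A

# Voltage across each segment of the circuit
V_drop = I * Z_wire                        # 9.6 V lost across the conductors
V_load = I * Z_load                        # 230.4 V at the load terminals

# Power split: supplied = heat in conductors + delivered to load
P_heat = I ** 2 * Z_wire                   # ~184 W lost to conductor heat
P_load = I ** 2 * Z_load                   # ~4424 W delivered to the load
P_supplied = V_source * I                  # ~4608 W supplied by the source

print(f"I = {I:.1f} A, conductor drop = {V_drop:.1f} V")
print(f"supplied {P_supplied:.0f} W = heat {P_heat:.0f} W + load {P_load:.0f} W")
```

Note that the load only receives about 4,424 W, not its 4,800 W rating: the conductor impedance reduces both the delivered power and the total supplied power, which is the point being made here.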
What's the point? Yeah, what was the point? Oh, yeah, I remember. The whole point is that voltage drop due to circuit conductor impedance causes an overall reduction in the power supplied as well as the power delivered.
Only if a load is upsized to compensate for voltage drop, in an attempt to maintain load power (which would actually increase the voltage drop, unless the conductors are also upsized), would it be possible to maintain the supplied power.
Remember, most loads are not constant-power loads, which draw proportionately more current as the voltage drops in order to hold their power consumption constant. A buck-boost transformer used in boost mode would be an example of this done by design. Either way, you can't get more power out of a system than you put in.
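The contrast above can be shown with a quick comparison, again using hypothetical numbers: a 4.8 kW rating at 240 V, with the terminal voltage sagging to 220 V. A constant-impedance load draws less current as the voltage falls; a constant-power load draws more.

```python
P_rated = 4800.0             # hypothetical load rating (W)
V_rated = 240.0              # design voltage (V)
Z = V_rated ** 2 / P_rated   # 12 ohms for the constant-impedance case

for V in (240.0, 220.0):
    I_const_z = V / Z        # constant impedance: current falls with voltage
    I_const_p = P_rated / V  # constant power: current rises to hold P fixed
    print(f"{V:.0f} V: constant-Z draws {I_const_z:.2f} A, "
          f"constant-P draws {I_const_p:.2f} A")
```

At 220 V the constant-impedance load draws about 18.3 A while the constant-power load draws about 21.8 A, so the constant-power load worsens the very voltage drop it is fighting, which is why compensating without upsizing conductors is a losing game.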