winnie
Senior Member
- Location
- Springfield, MA, USA
- Occupation
- Electric motor research
Transformer impedance is a measure of the voltage drop 'built in' to the transformer.
You expect voltage drop in a wire run; when you have no load on the wire, the voltage at the end of the wire is the same as the voltage on the supply side. As you increase the load and draw more current, more and more voltage gets 'used up' in the wire, and the voltage at the load side goes down.
A similar effect is seen in the transformer itself. The voltage on the output terminals goes down as the load increases. Different transformers have different impedance ratings. The impedance rating on a transformer is simply the % voltage drop at full load versus no load.
In your installation, you are using 75kVA transformers at about 60kVA, i.e. roughly 80% of rated load. If these transformers each had a 3% impedance rating, each one would drop roughly 3% x 0.8 = 2.4%, so the string of 3 transformers would result in about a 7.2% voltage drop versus no load, in addition to any voltage drop in the conductors themselves. This number should give you an idea of the scale of the issue you are dealing with, but it is not a hard number by any means.
This answer is approximate for several reasons: I am ignoring things such as the power factor of the load and of the long line, and I am only guessing at the transformer impedance ratings. You can shop for different impedance ratings; just don't ignore this rating and buy 75kVA transformers on price alone. If you oversize the transformers, the 'voltage drop' at any given load will be reduced.
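The arithmetic above can be sketched in a few lines of Python. This is a rough estimate only, under the same assumptions as the post: it ignores power factor and conductor drop, treats the nameplate %Z as a simple proportional drop, and the 3% impedance figure is a guess, not a nameplate value.

```python
def pct_drop(kva_load, kva_rated, pct_impedance):
    """Approximate % voltage drop of one transformer at a given load."""
    return pct_impedance * (kva_load / kva_rated)

def string_drop(n_transformers, kva_load, kva_rated, pct_impedance):
    """Drops add (approximately) for transformers in series."""
    return n_transformers * pct_drop(kva_load, kva_rated, pct_impedance)

# 3 x 75 kVA units carrying 60 kVA, assuming 3% impedance each:
print(string_drop(3, 60, 75, 3.0))     # 7.2 (%)

# Oversizing (say, to 112.5 kVA units) cuts the drop at the same load:
print(string_drop(3, 60, 112.5, 3.0))  # 4.8 (%)
```

The second call illustrates the oversizing point: at the same 60kVA load, a larger transformer runs at a smaller fraction of its rating, so its proportional drop shrinks.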
I am not qualified to answer your question about the difference in drop between single conductors and cable. I would expect a difference in cable inductance and thus drop associated with power factor, but I don't know the details.
-Jon
