Okay, first I'll try to give an outline of the principles involved. We have to agree on these first.
% voltage rise = (kVAR of capacitor × % impedance of transformer) / kVA of transformer
The % impedance of a transformer may be 2 to 9%, and depending on the low lagging power factor of the load, the capacitor kVAR may equal the transformer kVA, so a voltage rise of up to 9% can exist. By selecting a cable with a suitable voltage drop per metre, the entire 9% rise can be dropped along the cable connecting the load, so that rated voltage is available at the load terminals. But if the permissible voltage variation at the load terminals is 5%, it may be exceeded at very light load.
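To make the arithmetic concrete, here is a minimal sketch of the rule of thumb above. The function name and the numbers (a 1000 kVA transformer at 9% impedance with capacitor kVAR equal to the transformer kVA) are illustrative assumptions, not values from any real installation.

```python
# Rule-of-thumb estimate from the formula above:
#   % voltage rise = kVAR of capacitor x % impedance of transformer / kVA of transformer
# All values below are assumed for illustration.

def percent_voltage_rise(cap_kvar, tx_impedance_pct, tx_kva):
    """Approximate % voltage rise at the transformer secondary from a capacitor bank."""
    return cap_kvar * tx_impedance_pct / tx_kva

# Worst case described in the text: capacitor kVAR equal to transformer kVA,
# transformer impedance at the high end (9%).
rise = percent_voltage_rise(cap_kvar=1000, tx_impedance_pct=9.0, tx_kva=1000)
print(rise)  # 9.0 -> would exceed a 5% terminal-voltage limit at very light load
```

A smaller bank on the same transformer scales linearly: 500 kVAR at 4% impedance gives a 2% rise.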
Bit of a muddle, old chap.
This:
" %voltage increase = KVAR of capacitor x% impedance of transformer
............................. KVA of Transformer"
might have been taken as a rule of thumb from anywhere. It takes no account of the transformer impedance components, supply impedance or other connected loads. I prefer to work from actual values.
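Working from actual values might look like the sketch below, using the standard per-unit approximation that the voltage change across an impedance is roughly P·R + Q·X (Q negative for a capacitive, leading load, giving a rise). The base kVA, the 6% impedance, and the X/R ratio of 5 are all assumed for illustration.

```python
# Sketch: voltage change from actual per-unit impedance components,
# rather than the rule of thumb. Approximation:
#   dV (pu) ~= P(pu) * R(pu) + Q(pu) * X(pu)
# Positive dV is a drop; negative dV is a rise. All numbers are assumed.

def voltage_change_pu(p_pu, q_pu, r_pu, x_pu):
    """Approximate per-unit voltage change across an impedance (negative = rise)."""
    return p_pu * r_pu + q_pu * x_pu

# Assumed 1000 kVA base; 6% transformer impedance with X/R ~ 5
# gives roughly R = 0.012 pu, X = 0.059 pu.
# Capacitors only, no load: P = 0, Q = -0.5 pu (500 kVAR of capacitors).
dv = voltage_change_pu(p_pu=0.0, q_pu=-0.5, r_pu=0.012, x_pu=0.059)
print(dv)  # about -0.03 pu, i.e. roughly a 3% voltage rise, not 9%
```

Splitting the impedance into R and X like this is why the rule of thumb overstates the rise: only the reactive component interacts with the capacitor current.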
This:
"% impedance of transformer may be 2 to 9% and depending on the low lagging power factor of load"
Transformer impedance is what it is. It's usually stamped on the nameplate; it doesn't depend on loading.
And this:
"So a voltage rise of 9% can exist. By selection of cable with suitable voltage drop/meter, the entire voltage increase of 9% may be dropped along the cable connecting the load"
Not in the real world.