I am trying to size the circuit for equipment that consists of compressors, heaters, and a transformer. I want to prove that what I see at 480 V will be my worst-case current draw because of the heater load, or at least show that the amps are close enough not to be a concern.
I know the compressors will increase their amp draw when the voltage drops by anything other than in step with the V/Hz ratio. So if I go from 460 V/60 Hz to 380 V/50 Hz, the V/Hz ratio is essentially unchanged and I should not see a change. But if I go from 460 V to 415 V at the same frequency, there would definitely be an increase.
The heater power will increase/decrease with the square of the voltage, and the heater current changes linearly with it (I = V/R). So there is definitely a change between 460 V and 415 V: the current will decrease.
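To sanity-check that, here is a rough Python sketch using a made-up 10 kW / 460 V heater element (not my actual rating), treating it as a fixed resistance and using single-phase arithmetic for simplicity:

```python
# Sketch: heater current vs. supply voltage for a fixed-resistance element.
# Assumes a hypothetical 10 kW element rated at 460 V; single-phase arithmetic.

def heater_current(v_supply, p_rated=10_000.0, v_rated=460.0):
    """Current drawn by a fixed resistance rated p_rated watts at v_rated volts."""
    r = v_rated ** 2 / p_rated          # element resistance stays (roughly) constant
    return v_supply / r                  # I = V / R, so current scales linearly with V

for v in (380, 400, 415, 460, 480):
    i = heater_current(v)
    p = v * i
    print(f"{v:3d} V -> {i:5.1f} A, {p/1000:5.2f} kW")
```

With these made-up numbers the current goes from about 21.7 A at 460 V down to about 19.6 A at 415 V, while the power falls with the square of the voltage.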
I have a 3 kVA transformer with primary taps for 380/400/460/480 V and a 110 V secondary. With the correct primary tap selected, I get the full 3 kVA (nominal) regardless of whether the supply is 380, 400, 460, or 480 V.
I know 3000 VA / 380 V = 7.9 A,
and 3000 VA / 460 V = 6.52 A.
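Those numbers just come from I = S / V; here is the same calculation for every primary tap, assuming the transformer really runs at its full 3 kVA:

```python
# Sketch: full-load primary current of the 3 kVA transformer on each primary tap.
# Simply I = S / V (single-phase control transformer assumed).

S = 3000.0  # VA
for v_tap in (380, 400, 460, 480):
    print(f"{v_tap} V tap: {S / v_tap:.2f} A full-load primary current")
# 380 V tap: 7.89 A ... 480 V tap: 6.25 A
```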
So initially, on startup, the inrush and magnetizing current will be higher at 380 V. After the initial inrush, would the difference in primary current draw only be the difference in the current drawn by the loads?
Actually, it would still be scaled by the primary-to-secondary ratio, right? If I am stepping down from 380 V to 110 V, then going from the secondary back to the primary I would step the current down by about 3.5 times. So if my current increases by 10 A on the secondary, it would only increase by about 2.9 A on the primary side.
Looking at 480 V stepping down to 110 V, the ratio is 4.36, so an increase of 10 A on my secondary would only be an increase of about 2.3 A on the primary side.
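Here is a quick sketch of that reflection, treating the transformer as ideal and ignoring magnetizing current:

```python
# Sketch: reflecting a change in secondary current back to the primary
# through the turns ratio (ideal transformer, magnetizing current ignored).

def primary_delta(delta_i_secondary, v_primary, v_secondary=110.0):
    ratio = v_primary / v_secondary      # turns ratio n = Vp / Vs
    return delta_i_secondary / ratio     # primary current change = secondary change / n

for vp in (380.0, 480.0):
    print(f"{vp:.0f}:110 (ratio {vp/110:.2f}): "
          f"+10 A on the secondary -> +{primary_delta(10.0, vp):.1f} A on the primary")
# 380:110 (ratio 3.45): +10 A on the secondary -> +2.9 A on the primary
# 480:110 (ratio 4.36): +10 A on the secondary -> +2.3 A on the primary
```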
If anyone could help me analyze this, I would appreciate it. I want to convince myself that if I size the circuit for 460 V ±5%, I can use the same size circuit for 400 V ±5%.
So, at least for the transformer, it is still more net amps on the primary at 380 V.
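To pull the three load types together, here is a very rough comparison I could run once I plug in real nameplate values. The 40 A compressor and 20 A heater figures below are purely made-up placeholders; the sketch assumes the compressors stay constant-current when the V/Hz ratio is kept, the heaters are plain resistances, and the transformer is fully loaded:

```python
# Sketch: very rough total line-current comparison at 460 V vs. 400 V,
# using hypothetical nameplate values (NOT real equipment data):
#   - compressors: 40 A, assumed unchanged because the V/Hz ratio is kept
#   - heaters: 20 A at 460 V, current scales linearly with voltage
#   - 3 kVA transformer: assumed fully loaded, so primary amps = 3000 / V

def total_amps(v, comp_a=40.0, heater_a_460=20.0, xfmr_va=3000.0):
    compressor = comp_a                   # constant-current assumption (same V/Hz)
    heater = heater_a_460 * v / 460.0     # resistive load: I proportional to V
    transformer = xfmr_va / v             # constant-VA load: I inversely proportional to V
    return compressor + heater + transformer

for v in (400, 460):
    print(f"{v} V: {total_amps(v):.1f} A total")
# With these made-up numbers: 400 V -> ~64.9 A, 460 V -> ~66.5 A
```

Whether the 380/400 V case or the 460/480 V case draws more total amps depends on the mix of constant-power loads (compressors, transformer) versus resistive heater load, so the real nameplate numbers are what decide it.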