Ahmed Abdelfattah
Member
Location: Edmonton, Alberta, Canada
Greetings,
I have a project that is fed from a 37.5kVA transformer that has a primary voltage of 7.2kV and secondary single-phase output of 240/120V. The transformer appears to be center-tapped. The impedance of the transformer is specified as 2.2%.
When calculating the maximum fault current that will be let through the transformer, I found that different scenarios lead to different fault currents (summarized below, with a quick calculation sketch after the list):
Scenario #1: Assuming the impedance is specified across the 240V terminals:
Scenario 1A: The maximum fault current based on 240V is 37.5x10^3/(240*0.022)=7,102.3A
Scenario 1B: The maximum fault current based on 120V is 37.5x10^3/(120*0.011)=28,409.1A
Scenario #2: Assuming the impedance is specified across the 120V terminals:
Scenario 2A: The maximum fault current based on 240V is 37.5x10^3/(240*0.044)=3,551.1A
Scenario 2B: The maximum fault current based on 120V is 37.5x10^3/(120*0.022)=14,204.5A
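For clarity, here is a quick Python sketch of the arithmetic I used for the four cases, i.e. I_fault = S / (V x Z_pu), which is the full-load current at the chosen voltage divided by the per-unit impedance. Note that the 0.011 and 0.044 values are just my assumed rescaling of the 2.2% nameplate impedance, which may be the very thing I have wrong; the sketch also assumes an infinite primary source, so only the transformer impedance limits the fault.

```python
# Sketch of the four scenarios above; values reproduce my own assumptions, not the nameplate.
S_VA = 37.5e3  # transformer rating, VA

def fault_current(voltage_v, z_pu):
    """Bolted-fault current with an infinite primary source (transformer impedance only)."""
    return S_VA / (voltage_v * z_pu)

scenarios = {
    "1A: Z assumed across 240V, fault at 240V": (240, 0.022),
    "1B: Z assumed across 240V, fault at 120V": (120, 0.011),
    "2A: Z assumed across 120V, fault at 240V": (240, 0.044),
    "2B: Z assumed across 120V, fault at 120V": (120, 0.022),
}

for label, (voltage, z) in scenarios.items():
    print(f"{label}: {fault_current(voltage, z):,.1f} A")
```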
I would appreciate it if someone could guide me to the correct way of calculating the maximum transformer fault current, and explain which voltage level and impedance to use, and why.
Thank you