TransistorGeek
Greetings!
I have a purely resistive DC load of 325 A at 10 V in an industrial application. The load is approximately 50 feet from the power supply, and I'm not sure how to select an appropriate wire for this current at this distance. The standard AWG table doesn't appear to go up to this current, and the 12.4/0 size that seems to fit the amperage at a maximum 3% voltage drop is too large and unwieldy.
Does the NEC guidance [FPNs to 210-19(a), 215-2(b), and 310-15] of "...a maximum of 3% voltage drop for branch circuits, a maximum of 3% voltage drop for feeders, but a maximum of 5% voltage drop overall for branch circuits and feeders combined" even apply to industrial equipment?
Here are my calculations...
Two-wire DC load (single-phase formula):
Wire Circular Mils = [(Conductor Resistivity) × 2 × (Amps) × (Distance in Feet)] / (Allowable Voltage Drop)
Conductor Resistivity (K, copper) = 11.2 ohm-cmil/ft
Amps = 325A + (325A*0.2) = 390A
Distance in Feet = 50 ft.
Allowable Voltage Drop = (0.03)*(10V) = 0.3V
Thus,
(11.2 × 2 × 390 × 50) / 0.3 = 1,456,000 circular mils = 1,456 MCM
1,456 MCM ~ 12.4/0 wire (extrapolating the AWG series beyond 4/0)
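Here's a quick Python sketch of the same formula so the numbers are easy to replay (the function name and layout are just for illustration):

# Circular-mil voltage-drop formula:
#   CM = (K * 2 * I * D) / V_drop
# K is the copper constant in ohm-cmil/ft, the factor of 2 covers the
# round trip (out and back), and D is the one-way run in feet.

K_COPPER = 11.2   # ohm-cmil/ft, the value used above
DIST_FT = 50      # one-way distance, feet

def required_cmil(amps, v_drop, k=K_COPPER, d_ft=DIST_FT):
    """Minimum conductor area in circular mils for a given allowable drop."""
    return k * 2 * amps * d_ft / v_drop

print(required_cmil(390, 0.3))  # ~1,456,000 cmil = 1,456 MCM
print(required_cmil(325, 0.3))  # ~1,213,000 cmil
print(required_cmil(325, 5.0))  # ~72,800 cmil

The last line is the 5 V scenario discussed below: relaxing the allowable drop to 5 V brings the requirement to about 72,800 cmil, under the ~105,600 cmil of 1/0 AWG.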
This wire is far too large to bend and fit in our application. Even if I drop the 20% margin and use 325 A in my calculation instead of 390 A, I still get approximately 1,213 MCM, which is also too large. So my question is whether the 3% maximum voltage drop really applies in this industrial application, and how to calculate an acceptable voltage drop.
If, for example, a 5 V drop were acceptable, we could simply size our power supply for the extra 5 V and go with 1/0 wire, which is far more manageable. Here is my estimate of the heat generated by the power loss:
P_Loss = (I^2)*R
R = p(L/A) where p = resistivity of copper, L= length (m), A = area (m^2)
so,
Resistivity of copper = 1.69×10^-8 ohm-m
Length = 2 × 50 ft (round trip) = 30.48 meters
Area (1/0 AWG) = 53.5 mm^2 = 5.35×10^-5 m^2
(1.69×10^-8) × [(30.48)/(5.35×10^-5)] = ~9.63 mOhms (or 9.63×10^-3 ohms)
Therefore,
(325 A)^2 × (9.63 mOhms) = ~1,017 W (power lost as heat along the run)
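For reference, the same estimate as a small Python sketch in SI units (the 53.5 mm^2 figure is the nominal 1/0 AWG cross-section; variable names are just for illustration):

# I^2 * R loss for a 1/0 AWG run, in SI units.
# Note the unit trap: 53.5 mm^2 = 53.5e-6 m^2 (mm^2 -> m^2 is a 1e-6
# factor), and a DC circuit dissipates over the round trip, so the
# length is doubled.

RHO_CU = 1.69e-8             # resistivity of copper, ohm-m
LENGTH_M = 2 * 50 * 0.3048   # round trip: 2 x 50 ft ~ 30.48 m
AREA_M2 = 53.5e-6            # 1/0 AWG cross-section, ~53.5 mm^2
I_LOAD = 325.0               # load current, amps

r_loop = RHO_CU * LENGTH_M / AREA_M2   # ~9.6 milliohms round trip
p_loss = I_LOAD**2 * r_loop            # ~1,017 W dissipated along the cable
v_drop = I_LOAD * r_loop               # ~3.1 V drop at full load

print(f"R = {r_loop*1e3:.2f} mOhm, P = {p_loss:.0f} W, Vdrop = {v_drop:.2f} V")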
So would this scenario, with a voltage drop of up to 50% of the load voltage, be acceptable, or would it violate the NEC, UL, or IEC standard(s)? If it would be a violation, why? And how can we calculate the expected conductor temperature as a function of time, given ambient air conditions, from this power dissipated as heat?
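For the temperature-versus-time part, the first cut I had in mind is a lumped-capacitance model, T(t) = T_amb + P×R_th×(1 − e^(−t/(R_th×C_th))). The R_th and C_th values below are placeholders, not measured numbers; real values depend on insulation, bundling, and airflow:

import math

# First-order lumped-capacitance estimate of conductor temperature rise.
# R_TH and C_TH are illustrative placeholders only; use datasheet or
# measured values for a real design.

T_AMB = 30.0      # ambient temperature, deg C (assumed)
P_LOSS = 1017.0   # W, from the I^2*R estimate above
R_TH = 0.05       # K/W, cable-to-ambient thermal resistance (placeholder)
C_TH = 8000.0     # J/K, thermal mass of ~30 m of 1/0 cable (placeholder)

def conductor_temp(t_s):
    """Temperature (deg C) after t_s seconds of constant dissipation."""
    tau = R_TH * C_TH                   # thermal time constant, seconds
    return T_AMB + P_LOSS * R_TH * (1.0 - math.exp(-t_s / tau))

for t in (60, 600, 3600):
    print(f"t = {t:5d} s: T ~ {conductor_temp(t):.1f} degC")
# Steady state is T_AMB + P_LOSS*R_TH, ~81 degC with these placeholders.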
Some might suggest running a much higher voltage from the power supply and stepping it down at the load (for DC, that would mean a DC-DC converter rather than a transformer), but we don't have sufficient space at the load for this to be feasible.
My primary goal here is to find a proper solution, but I'd also like to understand the reasoning behind it, so please don't be shy about going into detail. The more information you can share, the better! Thank you all in advance for your assistance!
