megloff11x
Senior Member
My understanding of the drill, when the terminals and conductors have different temperature ratings, is as follows:
If the terminal is rated at a lower temperature than the conductors, your minimum conductor size is based on the ampacity at the lower rating. For example, if I have 100A into a 60C rated terminal, I must use a minimum of 1 AWG wire, which is rated 110A at 30C in the 60C column of Table 310.16.
If I use higher-temperature-rated wire, it must still be a minimum of 1 AWG to draw heat off the terminal.
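To make that lookup concrete, here's a rough Python sketch of the terminal-limited check. The table values are a hand-copied subset of the 30C copper ampacities from Table 310.16 and the function name is just mine for illustration, so verify against the real table:

```python
# Hand-copied subset of Table 310.16 copper ampacities, 60C column, 30C ambient
# (illustrative values only -- always verify against the actual table).
AMPACITY_60C = {"3 AWG": 85, "2 AWG": 95, "1 AWG": 110, "1/0 AWG": 125}

def min_size_for_terminal(load_amps, column=AMPACITY_60C):
    """Smallest conductor whose nominal 30C table ampacity, taken from the
    terminal's temperature column, is at least the load current."""
    for size, amps in column.items():  # dict is ordered smallest to largest
        if amps >= load_amps:
            return size
    raise ValueError("load exceeds this table subset")

print(min_size_for_terminal(100))  # -> "1 AWG" (110A >= 100A in the 60C column)
```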
Now, if things get warmer, say a 40C ambient, and I was using 60C rated conductors, I would apply a correction factor of 0.82 and need a conductor with an effective 30C ampacity of 122A (100A / 0.82). Thus I'd need 1/0 AWG, which is good for 125A.
However, if I was using wire rated at 75C (THWN, for example), my correction factor is 0.88, for an effective required ampacity of about 114A (100A / 0.88). The 75C column gives 2 AWG at 115A, but my terminals still need 1 AWG because they're rated 60C.
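Here's the same arithmetic as a sketch, continuing the illustrative table subset above. The 0.82 and 0.88 factors are the 36-40C ambient correction factors for the 60C and 75C columns; everything else (names, structure) is my own:

```python
# Illustrative 30C copper ampacities (60C and 75C columns of Table 310.16).
TABLE_310_16 = {
    "60C": {"2 AWG": 95, "1 AWG": 110, "1/0 AWG": 125},
    "75C": {"2 AWG": 115, "1 AWG": 130, "1/0 AWG": 150},
}

def min_size_with_ambient(load_amps, column, correction):
    """Divide the load by the ambient correction factor, then pick the
    smallest conductor meeting that effective 30C ampacity."""
    required = load_amps / correction
    for size, amps in TABLE_310_16[column].items():
        if amps >= required:
            return size, round(required, 1)
    raise ValueError("load exceeds this table subset")

print(min_size_with_ambient(100, "60C", 0.82))  # -> ('1/0 AWG', 122.0)
print(min_size_with_ambient(100, "75C", 0.88))  # -> ('2 AWG', 113.6)
```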
If these conductors were bundled in a conduit run, I would need to apply the adjustment factors for that as well. In any case, I have to use the larger of two sizes: what the 60C terminal needs at the nominal 30C table values regardless of conditions, or what the conductor needs after the ambient-temperature and bundling adjustments. Using a higher-temperature-rated conductor lets you keep a smaller conductor at higher ambients and in bigger bundles, as long as it's still big enough for the terminal at 30C.
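Putting the two checks together, the way I'm doing it amounts to taking the larger of the two answers. A rough sketch of that rule, with an explicit size ordering since AWG names don't sort numerically (the table subset, names, and ordering are mine, for illustration only):

```python
# Illustrative 30C copper ampacities (Table 310.16 subset) and size ordering.
TABLE_310_16 = {
    "60C": {"2 AWG": 95, "1 AWG": 110, "1/0 AWG": 125, "2/0 AWG": 145},
    "75C": {"2 AWG": 115, "1 AWG": 130, "1/0 AWG": 150, "2/0 AWG": 175},
}
SIZE_ORDER = ["2 AWG", "1 AWG", "1/0 AWG", "2/0 AWG"]  # smallest to largest

def pick_size(load, column, factor=1.0):
    """Smallest conductor in the given column whose 30C ampacity covers
    load / factor (factor = ambient correction x bundling adjustment)."""
    required = load / factor
    for size in SIZE_ORDER:
        if TABLE_310_16[column][size] >= required:
            return size
    raise ValueError("load exceeds this table subset")

def final_size(load, terminal_col, conductor_col, correction, adjustment=1.0):
    """Larger of: the terminal check at nominal 30C values, and the
    conductor check with ambient and bundling factors applied."""
    a = pick_size(load, terminal_col)                           # terminal-limited
    b = pick_size(load, conductor_col, correction * adjustment) # conditions of use
    return max(a, b, key=SIZE_ORDER.index)

# 100A load, 60C terminals, 75C conductors, 40C ambient, no bundling:
print(final_size(100, "60C", "75C", 0.88))  # -> "1 AWG" (the terminal governs)
```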
1. Does the terminal's minimum conductor size ever depend on the actual temperature, or do we always calculate that size at 30C (nominal table values)? At some point I would think a warmer terminal might want an even fatter conductor to help remove heat, but if I read the current method correctly, we don't do this. Do we ever consider bundling's effect on heat removal at the terminal as well?
2. Is this the correct way to do this? I prefer to divide the calculated load amperes by the derating factors and look the result up against the nominal 30C values in Table 310.16, rather than multiply the whole 30C table by the factors and hunt for the load there. It seems to go quicker my way; a quick check of both is sketched below.
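For what it's worth, here's how I convinced myself the two bookkeeping methods land on the same conductor for the 75C example above (my own sketch, same illustrative table subset). Since the factor is positive, "table ampacity >= load / factor" and "table ampacity x factor >= load" are the same test:

```python
# 75C copper column subset (Table 310.16, 30C ambient), illustrative values.
COL_75C = {"2 AWG": 115, "1 AWG": 130, "1/0 AWG": 150}
LOAD, FACTOR = 100, 0.88  # 100A load, 36-40C ambient correction factor

# Method 1: divide the load by the factor, compare to the raw table values.
by_division = next(s for s, a in COL_75C.items() if a >= LOAD / FACTOR)

# Method 2: multiply each table value by the factor, compare to the load.
by_multiplication = next(s for s, a in COL_75C.items() if a * FACTOR >= LOAD)

print(by_division, by_multiplication)  # -> 2 AWG 2 AWG (same answer)
```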
Matt