Re: ambient temp correction factors 310.16
Originally posted by iwire: They do not say minimum ambient either, but by your logic I could use the lowest temperature the conductors will be exposed to and figure from there?
That is not my logic. In fact, I am not relying on logic, but rather on physics. Ampacity limits exist for one purpose only: to protect the cable's insulation system. Too much current means too much heat generated within the wire, with the result that the insulation gets subjected to too high a temperature.
Consider the following as a laboratory experiment (i.e., no load calcs, no 80% rule, no inspections, no NEC; just a lab coat, a test oven, and a thermometer). If you were to run 110 amps continuously (i.e., for decades on end) through a #1 type TW wire in a room that is maintained (throughout all those decades) at a constant 30C, the I²R heat generation would raise the temperature of the wire no more than 30 degrees above the ambient 30, with the result that the insulation system would be subjected to no more than the 60C for which it is rated. You could run the same experiment on a #1 type THW, but this time run 130 amps through it (again, for decades on end). The I²R heat generation would raise the temperature of the wire no more than 45 degrees above the ambient 30, with the result that the insulation system would be subjected to no more than the 75C for which it is rated. Run 150 amps through a #1 type THHN, and its insulation system would be subjected to no more than the 90C for which it is rated. That is the meaning of the ampacity limits given in Table 310.16.
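If you want to see how that physics turns into correction factors, the standard engineering equation scales the ampacity by the square root of the remaining temperature headroom. Here is a quick Python sketch of my own (an illustration, not anything copied out of the NEC), using the #1 copper values above:

[code]
import math

TABLE_AMBIENT_C = 30.0  # Table 310.16 ampacities assume a 30C ambient

def corrected_ampacity(table_amps, rating_c, ambient_c):
    # Allowable I²R heat (and therefore I²) is proportional to the remaining
    # temperature headroom between the ambient and the insulation rating.
    return table_amps * math.sqrt((rating_c - ambient_c) / (rating_c - TABLE_AMBIENT_C))

# #1 copper values quoted above: TW 110 A @ 60C, THW 130 A @ 75C, THHN 150 A @ 90C
for name, amps, rating in [("TW", 110, 60), ("THW", 130, 75), ("THHN", 150, 90)]:
    print(f"{name}: {corrected_ampacity(amps, rating, 40):.1f} A at a 40C ambient")
[/code]

For a 40C ambient that works out to multipliers of about 0.82, 0.88, and 0.91, which matches the correction factors printed below Table 310.16, so the physics and the Code agree here.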
As temperature is raised, the rate at which the insulation system degrades increases "exponentially," a term that I will loosely translate as "faster than linear." Consider, for example, a type TW cable. Suppose for the moment that a manufacturer guarantees that its insulation system will be good for a 40-year life. That means that you could keep it at a constant 60C (through any combination of ambient temperature and I²R heat generation), and it would maintain an adequate insulation resistance for the entire 40 years. Now raise the temperature to 70 degrees. Will it fail instantly? No. But it would only last about 20 years before its insulation system had degraded to the point that it was no longer adequate. Now raise the temperature to 80 degrees. Will it fail instantly? No. But it would only last 10 years at that temperature. Raise the temperature to 90 degrees, and it would last only 5 years. Raise the temperature to 100 degrees, and it would last only 2.5 years. Keep going at higher and higher temperatures, and eventually the insulation's useful life will be limited to a few seconds, as it cooks, cracks, or even catches fire.
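Those numbers follow a simple halving rule: every 10 degrees above the rating cuts the remaining life in half, and every 10 degrees below it doubles it. In sketch form (again my own illustration, with the 40-year guarantee hypothetical):

[code]
def insulation_life_years(temp_c, rating_c=60.0, rated_life=40.0):
    # Every 10C above the rating halves the life; every 10C below doubles it.
    # An illustration of the numbers above, not a manufacturer's curve.
    return rated_life * 2.0 ** ((rating_c - temp_c) / 10.0)

for t in (60, 70, 80, 90, 100):
    print(f"{t}C -> {insulation_life_years(t):g} years")
# prints: 60C -> 40, 70C -> 20, 80C -> 10, 90C -> 5, 100C -> 2.5 years
[/code]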
Now here's the point: If a cable is at 31C (88F) for one summer month, it will degrade faster than it would have at 30C; it will have lost some of its 40-year life. But if you then keep it below 20C (68F) for most of the fall and spring and all of the winter, during those times it will degrade more slowly than it would have at 30C. The result is that its useful life could be extended beyond the original 40 years.
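You can add that trade-off up month by month, the way fatigue engineers do: each month consumes a fraction of the total life equal to the time spent at a temperature divided by the life you would get at that temperature. Here is a sketch with made-up monthly ambients, assuming the fully loaded TW conductor runs 30 degrees above ambient as in the lab example:

[code]
RISE_C = 30.0  # assumed full-load rise for the TW conductor, per the lab example

def insulation_life_years(temp_c):
    # Same halving-rule model as the sketch above (60C rating, 40-year life).
    return 40.0 * 2.0 ** ((60.0 - temp_c) / 10.0)

def life_used(ambient_c, months):
    # Fraction of total life consumed by `months` spent at the given ambient.
    return (months / 12.0) / insulation_life_years(ambient_c + RISE_C)

# Made-up year: one hot month at 31C, eight cool months at 20C, three at 30C
year = life_used(31, 1) + life_used(20, 8) + life_used(30, 3)
print(f"life used this year: {year:.4f} (vs {1/40:.4f} at a constant 30C)")
[/code]

The hot month burns life a little faster than the 30C baseline, but the cool months more than pay it back; that is the averaging argument in numbers.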
So yes, the use of "average ambient" makes sense from a physics perspective. Now I would like the NEC to say that, so that we don't have to guess or argue, and so that there will be no question of non-compliance or enforcement.