I understand the temp ratings of conductor insulation. And I understand that ampacity is limited by the lowest-rated device or conductor in a circuit. What I don't understand is, in these modern times when much (most?) of the wire we work with is THHN rated at 90°C, why are we still constrained by 60°C ampacity calculations?
In other words, why must we use the 60°C ampacity column in Table 310.16 for circuits up to 100A or conductors up to #1? Is it a standard "rule of thumb"? Or are we allowed to use the 75°C ampacity column if we can determine that both the wires and the terminations are rated at least 75°C? And, if so, then how do I determine the rating of terminations? For example, when I look at the product data sheet for this breaker, I see no temperature ratings. I find it hard to believe that devices today are still manufactured to 60°C ratings.
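To make the "weakest link" logic I'm describing concrete, here is a rough sketch in Python of how I understand the rule, using a handful of copper ampacities as I read them off Table 310.16. The sizes, values, and function names are just for illustration and should be checked against the actual table.

```python
# Sketch of the "lowest-rated element wins" rule as I understand it.
# Ampacities below are my reading of NEC Table 310.16 for copper;
# verify against the actual table before relying on them.

# size: (60C, 75C, 90C) ampacity in amperes, copper conductors
TABLE_310_16_CU = {
    "8 AWG": (40, 50, 55),
    "6 AWG": (55, 65, 75),
    "4 AWG": (70, 85, 95),
    "1 AWG": (110, 130, 145),
}

COLUMN = {60: 0, 75: 1, 90: 2}

def allowed_ampacity(size: str, insulation_c: int, termination_c: int) -> int:
    """Read the ampacity from the column of the LOWEST temperature rating
    in the circuit: insulation or termination, whichever is lower."""
    limiting = min(insulation_c, termination_c)
    return TABLE_310_16_CU[size][COLUMN[limiting]]

# 90C THHN landed on 60C terminations vs. 75C terminations:
print(allowed_ampacity("8 AWG", 90, 60))  # -> 40 A
print(allowed_ampacity("8 AWG", 90, 75))  # -> 50 A
```

So if the breaker and lug terminations really are rated 75°C, that's a 10A difference on a #8 conductor, which is why I'd like to know how to confirm the termination rating.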