shespuzzling (Member, New York)
Can somebody please help explain why we are allowed to use the 90°C column when derating conductors? My understanding of Table 310.16 is that, for a given wire size (say 6 AWG), the ampacity associated with each insulation temperature rating is listed in the 60°C, 75°C, and 90°C columns. So a 6 AWG wire with 65A flowing through it at ambient temperature will rise to 75°C.
I will use one of the examples in the 2008 NEC handbook to explain my question:
- Total noncontinuous load = 175A
- Equipment and equipment terminals rated at 75°C
- Ambient temperature = 112 deg
- Ampacity correction factor at 112 deg for 90°C-rated insulation = 0.82
- Conductor: aluminum XHHW-2
This is how the NEC example goes:
The required conductor ampacity is calculated as 175A / 0.82 = 213A. A 250 kcmil 90°C conductor is selected, since it is rated 230A at 90°C. At 75°C, the 250 kcmil conductor is rated 205A, which is greater than the noncontinuous load of 175A, so it is deemed acceptable. But if the corrected load ampacity is 213A, which is greater than 205A, wouldn't the temperature of the conductor be over 75°C and thus not compatible with the terminal ratings?
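If it helps to see the arithmetic laid out, here is a small Python sketch of the handbook's sizing steps as I understand them (the ampacity values are the Table 310.16 numbers for 250 kcmil aluminum XHHW-2 quoted above):

```python
# Sketch of the NEC handbook sizing method described above.
# Table values are for 250 kcmil aluminum XHHW-2 (Table 310.16).

load = 175.0            # noncontinuous load, amps
correction_90c = 0.82   # ambient correction factor, 90 deg C column

# Step 1: divide the load by the correction factor to find the
# minimum table ampacity needed in the 90 deg C column.
required_ampacity = load / correction_90c
print(round(required_ampacity, 1))   # about 213.4 A

# Step 2: 250 kcmil Al XHHW-2 is rated 230 A at 90 deg C, so it qualifies.
ampacity_90c = 230.0
assert ampacity_90c >= required_ampacity

# Step 3: terminal check at 75 deg C -- compare the 75 deg C table
# ampacity (205 A) against the *uncorrected* load.
ampacity_75c = 205.0
assert ampacity_75c >= load   # 205 >= 175, so the selection stands
```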
Alternatively, if you take a 250 kcmil 90°C conductor (230A) and multiply it by the correction factor (0.82), you get 188A. So if you have a 188A load, the temperature of the wire will rise to 90°C. By the same logic, that same wire needs only 205A × 0.82 = 168A before it reaches 75°C. Since the load is 175A, the wire temperature would be greater than the terminal temperature rating.
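The alternative direction (multiplying the table ampacities down instead of dividing the load up) can be sketched the same way; the numbers are the ones in the paragraph above:

```python
# Sketch of the alternative reading: multiply the table ampacities
# by the correction factor to get corrected ampacities at the
# elevated ambient temperature.
correction = 0.82

corrected_90c = 230.0 * correction   # load at which the wire reaches 90 deg C
corrected_75c = 205.0 * correction   # load at which the wire reaches 75 deg C
print(round(corrected_90c, 1))   # 188.6 A
print(round(corrected_75c, 1))   # 168.1 A

# The actual load sits between the two corrected values, which is
# exactly the apparent contradiction in the question:
load = 175.0
assert corrected_75c < load < corrected_90c
```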
Are both of these methods of sizing a derated conductor acceptable? Why is it okay that at full load (175A) the temperature of the wire will be greater than 75°C?