There seems to be so much confusion with derating calculations, I thought I would add to it. My question is: why would we calculate the termination rating at, for example, a circuit breaker based on the wire size used rather than the maximum allowable wire size? For example, I am looking at a 15 amp breaker with a 75°C termination rating; however, based on the torque settings, I am extrapolating that the manufacturer's listing allows up to a #4 AWG conductor to be landed on the lug of this breaker. So could we not use the maximum allowable listed wire size to calculate the ampacity rating of that lug, rather than the wire size actually used to terminate on that breaker?
I will try and answer your question.
To me, heat dissipation, or heat flow in its different applications per the NEC, provides a foundation of protection against fire (overheating). Heat flow from a current-carrying item to the ambient air (from high temperature to low) is a fundamental NEC principle.
In understanding this heat flow, I think you can better reflect upon what the NEC is trying to do.
I believe if you can understand Ohm's law, you can understand its parallel (or analog) in thermodynamics, often called Ohm's law for thermal circuits. So if you can picture electric current flow in your mind, you can use the same technique to understand heat flow.
Here is one of many links:
http://www.egr.msu.edu/~raguin/ME812/FinalProjects/Lindberg_FinalProject.htm
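To make the analogy concrete, here is a minimal sketch of Ohm's law for thermal circuits. The numbers (ambient temperature, thermal resistance) are made up for illustration only; they do not come from any listing or test standard.

```python
# Electrical:  I = V / R          (current = voltage / resistance)
# Thermal:     Q = dT / R_theta   (heat flow = temperature difference
#                                  / thermal resistance)

def heat_flow(t_hot_c, t_ambient_c, r_theta_c_per_w):
    """Heat flow in watts through a thermal resistance given in degC per watt."""
    return (t_hot_c - t_ambient_c) / r_theta_c_per_w

# Example: a termination at its 75 C limit in assumed 30 C ambient air,
# through an assumed 5 C/W thermal resistance to ambient:
q = heat_flow(75.0, 30.0, 5.0)
print(q)  # 9.0 watts must be carried away to hold the lug at 75 C
```

The point of the analogy: temperature difference drives heat flow the way voltage drives current, so anything that lowers the thermal resistance to ambient lets the termination carry more heat away at the same temperature.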
So how does heat move away from a lug on a breaker, for example? A simple description: the breaker is tested with a conductor connected, and that conductor sinks heat away. The lug alone contributes very little to conducting heat to ambient.
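A hypothetical illustration of that last point: treat the bare lug surface and the connected conductor as two parallel thermal paths to ambient, like parallel resistors. Heat divides inversely with thermal resistance, so the low-resistance copper conductor carries most of it. All resistance values below are assumptions for illustration, not test data.

```python
def parallel(r1, r2):
    """Combined thermal resistance of two parallel heat paths (degC/W)."""
    return (r1 * r2) / (r1 + r2)

r_lug_to_air = 50.0      # assumed: bare lug has little surface area to shed heat
r_via_conductor = 5.0    # assumed: attached copper conductor sinks heat well

r_total = parallel(r_lug_to_air, r_via_conductor)
delta_t = 45.0           # 75 C termination minus an assumed 30 C ambient

q_total = delta_t / r_total           # total heat leaving the lug
q_conductor = delta_t / r_via_conductor  # portion leaving via the conductor
share = q_conductor / q_total
print(round(share, 2))  # 0.91: most of the heat leaves through the conductor
```

With these assumed numbers, roughly nine tenths of the heat leaves through the conductor, which is why the breaker is tested with a conductor attached and why the termination rating is tied to the wire actually installed.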