Can someone please explain why we use the 75°C column to size wire? THHN is what we typically use, and it is rated for 90°C. I understand that almost all terminals have a 75°C rating, but how does that affect the ampacity of the wire itself? After all, isn't it the current flowing that causes the heat? So what's the ampacity of the 75°C terminal on a 20 amp receptacle? Presumably that terminal is rated for 20 amps even at 75°C. Could anyone please clear this up for me? Thanks.
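To make my question concrete, here's a little sketch of the rule as I understand it: NEC 110.14(C) says the circuit ampacity is taken from the column matching the lowest temperature rating anywhere in the run, which is usually the 75°C terminations, not the 90°C conductor insulation. The ampacity numbers below are copper values as I read them from NEC Table 310.16; double-check against the current code edition.

```python
# Sketch of the termination-rating rule (NEC 110.14(C)): the usable
# ampacity comes from the column for the LOWEST temperature rating in
# the system -- typically the 75 C terminals, even though THHN
# insulation itself is rated 90 C.
# Values are copper ampacities per NEC Table 310.16 (verify against
# the code edition in force in your jurisdiction).

COPPER_AMPACITY = {  # AWG size -> {temperature column (C): ampacity (A)}
    "14": {60: 15, 75: 20, 90: 25},
    "12": {60: 20, 75: 25, 90: 30},
    "10": {60: 30, 75: 35, 90: 40},
    "8":  {60: 40, 75: 50, 90: 55},
    "6":  {60: 55, 75: 65, 90: 75},
}

def allowable_ampacity(awg: str, insulation_c: int, termination_c: int) -> int:
    """Ampacity limited by the weakest temperature rating in the run."""
    column = min(insulation_c, termination_c)
    return COPPER_AMPACITY[awg][column]

# 12 AWG THHN (90 C insulation) landed on 75 C terminals:
print(allowable_ampacity("12", insulation_c=90, termination_c=75))  # 25, not 30
```

So by that logic the 90°C rating of THHN only buys headroom for derating, not a higher final ampacity, which is exactly the part I'd like someone to confirm or correct.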