On the same topic: does anyone else find the "increased in size" requirement frequently skipped or forgotten on inverter output circuits? I see it all the time in systems we didn't design. Is there some "loophole," perhaps a creative use of ambient temperature on the roof, or bundling? Right now we are doing a 400 kW rooftop and using tray cable. The inverter output is about 35 A, and I can't see how they can use tray cable. I didn't run the numbers, but there are several different sizes, so at least some of it is sized for voltage drop.
NEC 2014 (250.122(B)) specifies that the EGC is upsized only when you increase the ungrounded conductors above "the minimum size that has sufficient ampacity for the intended installation." Previous editions of the NEC were unclear on this point.
In other words, if temperature correction or bundling ampacity adjustment factors are the reason you increase the size of the ungrounded conductors, you don't need to increase the EGC above the default size from Table 250.122. There is a proposal for NEC 2017 to undo this clarification and again require the increase when conductors are upsized because of derating factors.
The main intent of the rule is voltage drop: when conductors are upsized to limit voltage drop over a long run, the EGC must grow proportionately with them so the ground-fault current path remains effective as resistance increases with distance.
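As a rough sketch of the proportional-area math (circular-mil areas from NEC Chapter 9, Table 8; the function and table names here are my own, not anything in the code text):

```python
# Sketch of the 250.122(B) proportional upsizing rule.
# Circular-mil areas for common copper AWG sizes (NEC Ch. 9, Table 8).
CMIL = {"14": 4110, "12": 6530, "10": 10380, "8": 16510,
        "6": 26240, "4": 41740, "3": 52620, "2": 66360, "1": 83690}

def min_egc_cmil(default_egc: str, min_hot: str, installed_hot: str) -> float:
    """Minimum EGC area after upsizing the ungrounded conductors.

    default_egc   -- EGC size from Table 250.122 for the OCPD rating
    min_hot       -- minimum ungrounded conductor with sufficient ampacity
    installed_hot -- conductor actually installed (e.g. upsized for VD)
    """
    # Multiply before dividing so exactly divisible cases stay exact.
    return CMIL[default_egc] * CMIL[installed_hot] / CMIL[min_hot]

def smallest_awg_at_least(cmil_needed: float) -> str:
    """Smallest listed AWG size meeting the required circular-mil area."""
    return min((s for s in CMIL if CMIL[s] >= cmil_needed),
               key=lambda s: CMIL[s])

# 20 A circuit: #12 hots are the minimum, but they're upsized to #6 for
# voltage drop. The #12 default EGC must grow by the same area ratio.
needed = min_egc_cmil("12", "12", "6")   # 6530 * (26240 / 6530) = 26240
print(smallest_awg_at_least(needed))     # -> 6 (a #6 EGC)
```

So a long 20 A run that gets #6 hots for voltage drop ends up with a #6 EGC as well, even though Table 250.122 would only call for #12.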
There are other reasons you might upsize the ungrounded conductors for which I don't think you should have to upsize the EGC. According to the 2014 NEC, however, you do:
1. Using larger wire left over from your previous job.
2. Standardizing on a minimum size, such as #10, and using it even where #14 would suffice, perhaps because in your scope of work you rarely see anything smaller than 30 A.
3. Having to use a larger size than the ampacity calculations require, because the equipment terminals can't accept the minimum size. I've seen equipment with terminals that take #8 minimum, even though the device has a full-load current of 14 A (20 A OCPD, therefore #12 by default).
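For that third case, a literal reading of the 2014 rule works out like this (circular-mil areas again from Chapter 9, Table 8; this is my own sketch of the arithmetic, not a code interpretation):

```python
# 20 A OCPD: #12 hots (6530 cmil) are the minimum with sufficient
# ampacity, and Table 250.122 gives a #12 EGC for a 20 A circuit.
# The terminals force #8 hots (16510 cmil), so read literally,
# 250.122(B) makes the EGC grow by the same circular-mil ratio.
min_hot_cmil = 6530        # #12 AWG, minimum with sufficient ampacity
installed_cmil = 16510     # #8 AWG, forced by the terminal listing
default_egc_cmil = 6530    # #12 AWG per Table 250.122 for 20 A

required_egc = default_egc_cmil * installed_cmil / min_hot_cmil
print(required_egc)        # -> 16510.0 cmil, i.e. a #8 EGC
```

So a 14 A load ends up dragging a #8 EGC along with it, purely because of the terminal size, which is the kind of result that makes me question whether the rule should apply here at all.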