MrMoto
Member
- Location
- Eastlake, Ohio, USA
Greetings. Many appliance cords, extension cords, power strip cords, etc. are not #12 AWG as would be required for 20A service, but are often #16 AWG or even #18 AWG. How do manufacturers of electrically wired products determine what gauge to use? For example, a company I am now contracting at has its control panels protected by 15A single-pole circuit breakers (CB) fed from #14 AWG power cords plugged into commercial 20A receptacles.
However, since the "rule of thumb" is that a wire is to be protected at the point of supply, shouldn't the feeder cable from the receptacle to the line side of the 15A CB be #12 AWG? Case in point: what if a wiring fault occurs between the receptacle and the CB? The affected wire between the plug and the fault is only rated for 15A. Is this really acceptable, and if so, why?
In the above application, it is assumed that #14 AWG is adequate for continuous duty without any faulting upstream of the CB. I am trying to find some guidance from the NEC on the subject, but have not been able to find anything that specifically answers this question. Is this possibly addressed by the tap rule?
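To make the mismatch in the question concrete, here is a rough sketch comparing a copper conductor's gauge against the breaker ahead of it, using the small-conductor overcurrent limits commonly cited from NEC 240.4(D). Note this is only an illustration of the poster's concern, not a code ruling: the limit values, the function name, and the check itself are my own assumptions, and flexible cords are generally governed by their own provisions (NEC 240.5 and the cord ampacity tables of Article 400) rather than 240.4(D).

```python
# Commonly cited maximum overcurrent protection (amps) for small
# copper conductors per NEC 240.4(D). Illustrative values only;
# flexible cords fall under separate rules (NEC 240.5 / Article 400).
MAX_OCPD_BY_AWG = {18: 7, 16: 10, 14: 15, 12: 20, 10: 30}

def conductor_protected(awg: int, upstream_breaker_amps: int) -> bool:
    """True if the upstream breaker rating does not exceed the
    conductor's small-conductor limit (hypothetical helper)."""
    return upstream_breaker_amps <= MAX_OCPD_BY_AWG[awg]

# The scenario in the question: a #14 AWG cord on a 20A circuit,
# with its own 15A breaker only downstream at the panel.
print(conductor_protected(14, 15))  # protected by the panel's own 15A CB
print(conductor_protected(14, 20))  # NOT protected by the 20A branch OCPD
```

The second call captures the poster's point: between the 20A receptacle and the panel's 15A breaker, the only upstream overcurrent device is the 20A branch breaker, which exceeds the #14 AWG small-conductor limit.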