I'm installing some LED strip lighting at my house to become familiar with the install. A 16' strip draws 24 watts at 12 VDC. According to P = I × E, that's 2 amps. On a 15-amp circuit, one can power 180 watts at 12 VDC total, and I understand that according to P = I × E, this is all that's allowed, versus lighting on a 120 VAC circuit, which allows for much more wattage. I'm just finding this hard to believe, considering LED lighting is supposed to be far more efficient and energy saving. Does it truly consume 15 amps quicker than 120 VAC lighting would? Am I to understand that if I were to power a 12 VDC, 30-amp power supply, I need to run 10-gauge wire and protect it with a 30-amp breaker?
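Here's a quick sketch of the numbers I'm working from, assuming P = I × E and ignoring any driver/supply conversion losses:

```python
# Quick check of the numbers above using P = I x E (watts = amps x volts).
# Assumes an ideal supply with no conversion losses -- real drivers lose a few percent.

strip_watts = 24.0        # one 16 ft strip
strip_volts = 12.0
strip_amps = strip_watts / strip_volts
print(f"One strip: {strip_amps:.1f} A at {strip_volts:.0f} V")    # 2.0 A

supply_amps = 15.0        # 15 A available at 12 V DC
max_watts = supply_amps * strip_volts
print(f"Max load at 12 V on 15 A: {max_watts:.0f} W")             # 180 W
```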
I think what you're not taking into account is which side of the driver you're talking about - line vs. load
15 amps on the load side of your driver is not anywhere close to 15 amps at the breaker
Your driver (transformer) is changing the voltage on the load side, not the line side. 180 watts at 12 volts is 15 amps on the load side, but that same 180 watts at 120 volts is only 1.5 amps on the line side. The wiring to the driver is going to be whatever you're running for the rest of your lighting (12 or 14 gauge), but the wire from the driver to the strips needs to be sized for the load at 12 volts.
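A rough sketch of that line-vs-load arithmetic, assuming an ideal (lossless) driver:

```python
# Same 180 W load seen from each side of the driver (lossless driver assumed).
load_watts = 180.0

load_volts = 12.0    # strip (load) side
line_volts = 120.0   # branch-circuit (line) side

load_amps = load_watts / load_volts   # 15.0 A -- size the low-voltage wiring for this
line_amps = load_watts / line_volts   #  1.5 A -- what the breaker actually sees

print(f"Load side: {load_amps:.1f} A at {load_volts:.0f} V")
print(f"Line side: {line_amps:.1f} A at {line_volts:.0f} V")
```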
Just as important as the wattage load is voltage drop.
I have installed LED strip on numerous occasions, most recently in a menu board at a fast-food restaurant. I NEVER run a full strip in a single piece, because by the time you add the cable length plus the length of the strip (16 feet), you have the potential for quite a bit of voltage drop at the end of the strip. I cut strips in half (8 ft. long).
I would not install 7 or 8 strips off a single feed, either. If you ran a circuit of only 35 ft., you would need 2-gauge wire to carry 360 watts (30 amps) with less than 3% voltage drop. Yes, 2-gauge wire.
3% may not seem like much, but at 12 volts that's over a third of a volt, which can adversely affect lighting performance.
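To show where that 2-gauge figure comes from, here's a rough DC voltage-drop sketch in Python; the per-gauge resistances are approximate values for copper at room temperature, so treat the exact percentages as ballpark numbers:

```python
# Rough DC voltage-drop check for a 12 V run.
# Approximate resistance in ohms per 1000 ft of copper at ~20 C.
OHMS_PER_1000FT = {14: 2.53, 12: 1.59, 10: 1.00, 2: 0.156}

def voltage_drop(awg, one_way_ft, amps, supply_volts=12.0):
    r_per_ft = OHMS_PER_1000FT[awg] / 1000.0
    drop = 2 * one_way_ft * r_per_ft * amps    # x2 for the round trip
    return drop, 100.0 * drop / supply_volts

for awg in (14, 12, 10, 2):
    drop, pct = voltage_drop(awg, one_way_ft=35, amps=30)
    print(f"{awg:>2} AWG over 35 ft at 30 A: {drop:.2f} V drop ({pct:.1f}%)")
```

With those numbers, 2 AWG comes out around 0.33 V (roughly 2.7%) over a 35 ft run at 30 amps, while 10 AWG is up around 2 volts of drop, which is why the heavy wire shows up at 12 volts even though the wattage sounds modest.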