mark32
Senior Member
- Location: Currently in NJ
Let's say you are trying to size a transformer and cabling for a low-voltage landscape lighting install. You have 150 watts in bulbs, so 150 W / 12 V = 12.5 A. But look, this transformer has multiple taps, so I can bump it up to 13 V: 150 W / 13 V ≈ 11.5 A. That makes sense, more voltage = less amps. But wait, let's find the resistance in the circuit: 12 V / 12.5 A = 0.96 Ω. Now plug 13 V in here: 13 V / 0.96 Ω ≈ 13.5 A. Why am I getting two different answers (11.5 A vs. 13.5 A) to what is essentially the same question? I could instead use 13 V / 11.5 A ≈ 1.13 Ω, and then 13 V / 1.13 Ω = 11.5 A, but which formula is right? More voltage is going to push more current through the circuit, but why does the resistance change depending on the voltage being applied?
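To show exactly where the two answers come from, here is a quick sketch of the arithmetic above. It just reproduces the two competing assumptions in the post: one calculation treats the load as a fixed 150 W regardless of tap voltage, the other treats it as a fixed 0.96 Ω regardless of tap voltage. The variable names are my own, purely for illustration.

```python
power_w = 150.0   # total lamp wattage from the post
v_tap_12 = 12.0   # 12 V transformer tap
v_tap_13 = 13.0   # 13 V transformer tap

# At the 12 V tap (both assumptions agree here):
i_12 = power_w / v_tap_12        # 12.5 A
r_12 = v_tap_12 / i_12           # 0.96 ohms

# At the 13 V tap, assuming the load stays a constant 150 W:
i_13_const_power = power_w / v_tap_13   # about 11.5 A

# At the 13 V tap, assuming the load stays a constant 0.96 ohms:
i_13_const_r = v_tap_13 / r_12          # about 13.5 A

print(f"constant power:      {i_13_const_power:.2f} A")
print(f"constant resistance: {i_13_const_r:.2f} A")
```

Both lines are correct applications of Ohm's law; they disagree because each one holds a different quantity (power vs. resistance) fixed while the voltage changes.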