Ohm's law proper usage question


mark32

Senior Member
Location
Currently in NJ
Let's say you are trying to size a transformer and cabling for an LV landscape lighting install. You have 150 watts in bulbs, so 150 W / 12 V = 12.5 A. But look, this transformer has multiple taps, so I can bump it up to 13 V: 150 W / 13 V = 11.5 A. So that makes sense, more voltage = less amps, but wait. Let's find the resistance in the circuit: 12 V / 12.5 A = 0.96 Ω. So let's plug 13 V in here: 13 V / 0.96 Ω = 13.5 A. Why am I getting two different answers (11.5 A vs. 13.5 A) to what is essentially the same question? I could use 13 V / 11.5 A = 1.13 Ω and then 13 V / 1.13 Ω = 11.5 A, but which formula is right? More voltage is going to push more current through the circuit, but why does the resistance change with regard to the voltage being applied?
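
To make the two calculations concrete, here is the arithmetic from the question as a short Python sketch:

```python
# Two ways of estimating the current at 13 V, using the numbers above.

P_rated = 150.0   # W, nameplate total of the bulbs (rated at 12 V)
V_nom = 12.0      # V
V_tap = 13.0      # V, higher transformer tap

# Method 1: assume the bulbs still draw their rated 150 W at 13 V
I_power = P_rated / V_tap            # 11.5 A

# Method 2: assume the bulbs keep the resistance they have at 12 V
R = V_nom / (P_rated / V_nom)        # 12 V / 12.5 A = 0.96 ohm
I_ohm = V_tap / R                    # 13.5 A

print(f"constant-power assumption:      {I_power:.1f} A")
print(f"constant-resistance assumption: {I_ohm:.1f} A")
```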
 

infinity

Moderator
Staff member
Location
New Jersey
Occupation
Journeyman Electrician
The problem is that your 150 watts is the rating at 12 volts only; when you use 13 volts the wattage will also be different. You should use the resistance to calculate the current. Also, when you increase the voltage in a resistive circuit you also increase the current.
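
To put numbers on that: if the resistance is held fixed, the power at 13 volts is no longer 150 watts. A rough sketch, ignoring the change of filament resistance with temperature:

```python
R = 0.96          # ohm, from 12 V / 12.5 A
V = 13.0          # V, the higher tap

I = V / R         # 13.5 A
P = V**2 / R      # about 176 W, not the 150 W nameplate figure

print(f"I = {I:.1f} A, P = {P:.0f} W")
```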
 

kwired

Electron manager
Location
NE Nebraska
Think about a dual-voltage transformer. Same windings for either voltage, but by connecting them in series vs. parallel you have changed the total impedance, and the voltage and rated current change in direct proportion.

That same 12 volt segment of the transformer winding is still rated for the same VA even if a different tap is used to supply the load at a different voltage. No matter what output voltage is used, the transformer is loaded to its rating when that 12 volt portion of the winding is carrying 12.5 amps.

If you are supplying 13 volts at 12.5 amps, you are getting a little more out of the transformer, but the transformer rating is based on the load across the 12 volt tap. The idea is to try to maintain 12 volts at the load; the additional losses are in the voltage drop in the conductors.
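
A rough numeric sketch of that point, assuming an ideal transformer whose 12 volt winding section is rated for 12.5 A:

```python
# The winding conductor sets the current limit, regardless of tap.
I_rated = 12.5                    # A, set by the winding

for V_tap in (12.0, 13.0, 14.0):  # hypothetical tap voltages
    print(f"{V_tap:.0f} V tap at {I_rated} A -> {V_tap * I_rated:.1f} VA out")
# 12 V tap: 150 VA; 13 V tap: 162.5 VA ("a little more out")
```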
 

mark32

Senior Member
Location
Currently in NJ
The problem is that your 150 watts is the rating at 12 volts only; when you use 13 volts the wattage will also be different. You should use the resistance to calculate the current. Also, when you increase the voltage in a resistive circuit you also increase the current.

Doh, that's it. I can't believe that didn't come to mind. Thanks Rob and the others for the input.
 

Besoeker

Senior Member
Location
UK
Think about a dual-voltage transformer. Same windings for either voltage, but by connecting them in series vs. parallel you have changed the total impedance, and the voltage and rated current change in direct proportion.

I rather think that the more significant change would be the resistance of the bulb.
Because the resistance changes, the voltage and current won't change in direct proportion.
Increase the volts and that will increase the current, and thus the watts dissipated. The bulb will run hotter, and the resistance will increase.
But Ohm's law is still applicable.
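
To put a rough number on how far an incandescent bulb is from a fixed resistor (the cold-to-hot ratio below is a typical ballpark figure for tungsten filaments, not a measurement of any particular bulb):

```python
# A 100 W / 120 V bulb dissipating rated power has a hot resistance of:
R_hot = 120.0**2 / 100.0      # 144 ohm

# Cold (room temperature) tungsten is typically around a tenth of that:
R_cold = R_hot / 10.0         # ~14 ohm, ballpark only

print(f"hot: {R_hot:.0f} ohm, cold: roughly {R_cold:.0f} ohm")
```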
 

gar

Senior Member
Location
Ann Arbor, Michigan
Occupation
EE
130604-0904 EDT

mark32:

You need to understand Ohm's law to use it.

It says that if you have a resistor whose resistance is constant, independent of the current through it, then the voltage across the resistor can be represented by the equation V = I*R.

In your application you have a circuit analysis problem. This requires more than just Ohm's law; you also need to know that the sum of the voltages around a closed loop is 0 (Kirchhoff's voltage law).

In its simplest form you have a series circuit: an ideal voltage source (constant voltage), an internal transformer impedance (for simplicity assume resistance), a wire resistance from the transformer to the load, and a single load resistance (lump all your individual lamps at one point).

Thus, Vsource = Vinternal + Vwire + Vload.

Assume Vload = 12 V and as you calculated the current is 12.5 A.

Next assume a wire resistance of 0.2 ohms. The voltage drop along the wire is 0.2*12.5 = 2.5 V.

Assume another 1 V drop in the transformer; then the ideal source voltage is 12+2.5+1 = 15.5 V, and this is the open-circuit voltage from the transformer. VA input to the transformer would be slightly greater than 15.5*12.5 = 194 VA. The "slightly greater" allows for transformer core losses.
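
Tying those numbers together in a minimal sketch (the 0.2 Ω wire and 1 V internal drop are the assumptions above):

```python
V_load = 12.0             # V, target at the (lumped) lamps
I = 12.5                  # A, from 150 W / 12 V
R_wire = 0.2              # ohm, assumed wire resistance
V_internal = 1.0          # V, assumed drop inside the transformer

V_wire = R_wire * I                      # 2.5 V lost in the wire
V_source = V_load + V_wire + V_internal  # 15.5 V open-circuit
VA = V_source * I                        # ~194 VA, before core losses

print(f"V_source = {V_source} V, about {VA:.0f} VA")
```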

I lumped all the lamps at one spot, but in reality they are distributed along additional wire. If the 12 V is measured at the input end of the light string, the total current will be somewhat less than your calculated value.

You really would need to analyze the light string with 12 V at the input (or some other value) to determine what the current is to obtain an accurate value. This is probably not necessary. Just the 12 V at 12.5 A is likely good enough.

As an example of how power varies for an incandescent bulb you can use the equation
P = V^1.6/21.217 for a 120 V 100 W bulb. This quite closely approximates the actual curve from 90 to 130 V input.
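
For example, evaluating that approximation at a few voltages (just exercising the formula, not measured data):

```python
# gar's approximation for a 100 W / 120 V bulb, valid ~90-130 V.
for V in (90, 100, 110, 120, 130):
    print(f"{V} V: {V**1.6 / 21.217:.1f} W")
# 120 V gives 100.0 W, as expected; 90 V gives about 63 W.
```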

.
 

GoldDigger

Moderator
Staff member
Location
Placerville, CA, USA
Occupation
Retired PV System Designer
Let's say you are trying to size a transformer and cabling for an LV landscape lighting install. You have 150 watts in bulbs, so 150 W / 12 V = 12.5 A. But look, this transformer has multiple taps, so I can bump it up to 13 V: 150 W / 13 V = 11.5 A. So that makes sense, more voltage = less amps, but wait. Let's find the resistance in the circuit: 12 V / 12.5 A = 0.96 Ω. So let's plug 13 V in here: 13 V / 0.96 Ω = 13.5 A. Why am I getting two different answers (11.5 A vs. 13.5 A) to what is essentially the same question? I could use 13 V / 11.5 A = 1.13 Ω and then 13 V / 1.13 Ω = 11.5 A, but which formula is right? More voltage is going to push more current through the circuit, but why does the resistance change with regard to the voltage being applied?

Now that you have gotten the bulb's current-versus-voltage behavior straightened out, I would just like to comment that there will be resistance in the wires from the transformer to the lights, and the taps on the transformer are there so that you can increase the voltage to make up for that and keep 12 volts at the lamps.
This will not help if one lamp is 10' from the transformer and most are 100' from the transformer; raising the voltage will just burn out the first bulb faster. But it will help if you have all of your bulbs between 100' and 150' from the transformer, for example, or if you run a long wire to a point where you branch out in a star to the individual bulbs.
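
As a back-of-the-envelope sketch of the tap idea (the wire resistance here is a made-up number, and the lamp current is treated as constant for simplicity):

```python
I = 12.5        # A, total lamp current
R_wire = 0.08   # ohm, hypothetical resistance of the run

V_drop = I * R_wire            # 1.0 V lost in the wire
V_tap_needed = 12.0 + V_drop   # pick the 13 V tap to land near 12 V at the lamps

print(f"drop = {V_drop:.1f} V, so use roughly a {V_tap_needed:.0f} V tap")
```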
 

mark32

Senior Member
Location
Currently in NJ
Thanks gar and gold for stopping by. Gar, I think it was a rather amateurish question to ask on my part, but I just needed that bit of info to figure out where I went wrong. I really do enjoy doing calculations, but in this scenario I just wanted a quick and basic means of getting the load on this circuit, which was easily obtained; things just got confusing when I threw in the 13 V, and I knew something didn't seem right. Gold, thanks for the help; I actually never intended to bump the voltage up beyond 12 V, and VD really wasn't a concern here, I was just playing with the numbers.
 

gar

Senior Member
Location
Ann Arbor, Michigan
Occupation
EE
130605-0821 EDT

Some more on a tungsten filament power and current.

Let Pr be the power rating of the bulb, for a bulb of standard color temperature and design at 120 V nominal, over the range 90 to 130 V. Then the power of a bulb that is not 100 W is

P = (Pr*V^1.6)/2121.7

and the current is

I = (Pr*V^0.6)/2121.7

For a 100 W bulb at 120 V the current is 0.8333 A.

If you play with this a little you will find that the current does not drop as rapidly as the voltage.

At 90 V the bulb current is 0.7013 A, a ratio of 0.7013/0.8333 = 0.842.

But the voltage ratio was 90/120 = 0.75; with a constant resistance and Ohm's law, the current ratio would equal that.
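
A quick check of those two ratios with the formulas above:

```python
def current(Pr, V):
    """gar's approximation: I = Pr * V**0.6 / 2121.7 (100 W, 120 V reference)."""
    return Pr * V**0.6 / 2121.7

I_120 = current(100, 120)   # 0.8333 A
I_90 = current(100, 90)     # 0.7013 A

print(f"current ratio: {I_90 / I_120:.3f}")   # 0.842
print(f"voltage ratio: {90 / 120:.3f}")       # 0.750, the constant-R expectation
```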

.
 