VD Calc

In this case, you first have to ask 'how accurate do I need to be?'

The accurate way to do this is to figure out the total current for all of the lights and use it, together with the distance to the first light, to calculate the voltage drop for only the first segment of wire (from the source to the first light). Then you calculate the total current of lights 2...15 and calculate the voltage drop from the first light to the second light. At the second light you have _two_ voltage drops that you have to add up to get the total voltage drop at that point. Then for the third light you calculate the total current of lights 3...15, calculate the voltage drop from light 2 to light 3, and add that in. And so on until the end of the string. If you really want to be precise, you have to remember that the current to each light changes as the applied voltage changes.

The best way to do this is with a spreadsheet, where you stack all of the 15 separate voltage drop calculations.
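
For example, here is a minimal sketch of that stacked calculation in Python. All the numbers are assumptions for illustration: 15 fixtures of 50 W at 120 V with unity power factor, 50 ft between fixtures, and roughly 2 ohms per 1000 ft for #12 copper, doubled for the out-and-back run:

Code:
n_lights = 15
amps_per_light = 50 / 120          # ~0.417 A per fixture, assuming PF = 1.0
spacing_ft = 50                    # assumed distance between fixtures (and to light 1)
ohms_per_ft = 2 * 2.0 / 1000       # assumed #12 Cu, doubled for the round trip

drop = 0.0
for light in range(1, n_lights + 1):
    downstream = n_lights - light + 1    # lights fed through this segment
    drop += downstream * amps_per_light * spacing_ft * ohms_per_ft
    print(f"light {light:2d}: {drop:5.2f} V below the source")

Each pass adds one segment's drop, so the printout is exactly the running total you would build down a spreadsheet column.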

But usually this isn't necessary.

For example, if you assume that all of the current drawn by all of the lights is taken at the end of the run, then you will calculate a much worse voltage drop than reality, but with a simpler calculation. If the voltage drop is acceptable in this simplified calculation, then you don't bother with the more precise calculation.

Slightly more complicated is to assume the total current of all lights is taken at the _average_ distance from the source to the light fixtures.
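
Here is a sketch of both shortcuts, reusing the same assumed numbers as above. Worth noting: for equal, evenly spaced loads, the average-distance shortcut happens to reproduce the exact segment-by-segment answer:

Code:
n_lights, amps_per_light = 15, 50 / 120
spacing_ft, ohms_per_ft = 50, 2 * 2.0 / 1000
total_amps = n_lights * amps_per_light          # 6.25 A
run_ft = n_lights * spacing_ft                  # 750 ft to the last light

worst = total_amps * run_ft * ohms_per_ft       # all load lumped at the far end
avg_ft = spacing_ft * (n_lights + 1) / 2        # mean distance to a fixture
typical = total_amps * avg_ft * ohms_per_ft     # load lumped at the average distance
print(f"worst case {worst:.2f} V, average-distance {typical:.2f} V")

With these numbers the worst case comes out near 18.8 V and the average-distance estimate near 10 V, which matches the exact segment-by-segment total.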

Only if the results of the simple calculation are borderline is it worth doing a more complicated calculation.

-Jonathan
 
I'm a little lost. So if each fixture is 50 watts I'd do what?

First you need to somehow figure amps rather than watts.

This might require looking at the datasheet for the fixture, because 'power factor' mucks up the relationship between supply voltage, watts, and amps. The simplest calculation (watts / volts = amps) assumes perfect power factor, which might be a close enough approximation.
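
A quick sketch of the difference, with an assumed power factor of 0.9 (the real number comes from the fixture's datasheet):

Code:
watts, volts = 50.0, 120.0
power_factor = 0.9                 # assumed; look this up for your fixture

amps_simple = watts / volts                    # assumes perfect (unity) PF
amps_actual = watts / (volts * power_factor)   # input amps are higher at PF < 1
print(f"{amps_simple:.3f} A at unity PF vs {amps_actual:.3f} A at PF 0.9")

That's about 0.417 A vs 0.463 A per fixture, roughly a 10% difference, which is why the datasheet is worth a look.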

Once you have amps per fixture you can proceed with the voltage drop calculation.

Answers #1-#4 (including my own) missed that you might be having trouble going from watts to amps.
 
I've always done what winnie suggests in post #5
For example, if you assume that all of the current drawn by all of the lights is taken at the end of the run, then you will calculate a much worse voltage drop than reality, but with a simpler calculation. If the voltage drop is acceptable in this simplified calculation, then you don't bother with the more precise calculation.

You can play with the numbers on a voltage drop calculator and see how detailed you need to be.
For example, in your case, (15) 50 watt fixtures at 120V would allow you about 250 ft on a #12 if the load was at the end.
That way, if your length is 250 ft or less, you know a #12 would be sufficient without getting into more detail.
(You really need the actual load in amps for each fixture to play with the numbers)
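
A rough check of that 250 ft figure, assuming unity power factor and about 0.0037 volts of drop per amp-foot for #12 (the Square D figure quoted below):

Code:
fixtures, watts_each, volts = 15, 50, 120
total_amps = fixtures * watts_each / volts     # 6.25 A
run_ft = 250
drop = total_amps * run_ft * 0.0037            # all load lumped at the end
print(f"{drop:.1f} V = {100 * drop / volts:.1f}% at {run_ft} ft")

That works out to about 5.8 V, or roughly 5%, so the 250 ft rule of thumb corresponds to about a 5% allowance.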
 
I'm a little lost. So if each fixture is 50 watts I'd do what?
Convert it to amps. Amps = Watts / Volts.
I usually like to start at the last fixture and work my way back to the source;

Amps * distance * "volt drop per amp-foot" for the first light, then add in each segment using the current and length for that segment.

Here is an example: 120 Watt fixtures, 120 Volts each, so 1 amp per light. Assume they are 100' apart with #12 wire. Assume there are 10 lights.

My Square D calculator gives 0.37 volts of drop per amp per 100' for #12, or 0.0037 volts per amp-foot.

1 amp * 100' * .0037 + 2 amps * .0037 * 100' + 3 amps * .0037 * 100'.....

After thinking about it, you will have .37 * (1+2+3+4....10), or 0.37 * 55 ≈ 20.4 volts drop at the last light if I didn't make a mistake.

We would probably want larger wire for that run.
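
The same arithmetic as a short sketch, using the 0.0037 volts per amp-foot figure above:

Code:
v_per_amp_ft = 0.0037              # Square D figure for #12, per the post above
drop = 0.0
for light in range(1, 11):
    downstream_amps = 11 - light   # segment feeding light k carries lights k..10
    drop += downstream_amps * 100 * v_per_amp_ft
print(f"drop at the last light: {drop:.2f} V")   # 0.37 * 55 = 20.35 V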
 
I usually like to start at the last fixture and work my way back to the source...
I think I would start at the source end.
 
I think I would start at the source end.
The answer is the same, so do whichever it's easier to wrap your mind around.

It's (1+2+3...10) vs. (10+9+8...1).

Starting at the load end and adding the currents as the load increases might be a little more straightforward to do in a spreadsheet.
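
A two-line check that the direction doesn't change anything:

Code:
from_source = sum(range(10, 0, -1))    # segment currents walking away from the source
from_load = sum(range(1, 11))          # segment currents walking back from the last light
print(from_source == from_load)        # True: both are 55
print(from_source * 100 * 0.0037)      # 20.35 V either way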
 
Amps * distance * "volt drop per amp-foot" for the first light, then add in each segment using the current and length for that segment.
This works as long as the run is all one conductor size. If you have more than one conductor size in the run, you need to do each segment separately.

Also for the case of one conductor size, you can equivalently add up for each load (amps drawn * distance from the source).

Cheers, Wayne
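
Here is a sketch showing the two bookkeepings agree, reusing the assumed numbers from the 10-light example (one wire size throughout):

Code:
v_per_amp_ft = 0.0037

# (a) per segment: segment k carries the current of every downstream light
per_segment = sum((10 - k) * 100 * v_per_amp_ft for k in range(10))

# (b) per load: each light's amps times its distance from the source
per_load = sum(1 * 100 * (k + 1) * v_per_amp_ft for k in range(10))

print(round(per_segment, 2), round(per_load, 2))   # 20.35 20.35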
 