Understanding voltage drop

123ozzie

Member
Location
Chicago, Illinois
Assuming the max voltage drop allowed is 3% (not here to argue the 3% number), what happens if the source voltage is above or below the "standard" 120 volts?

115v - 3% = 111.6v
128v - 3% = 124.2v

If the source voltage is high, can there be a larger % of voltage drop? 128v at 10% voltage drop is 115.2v, which is more than adequate for any load.
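To put numbers on it, the arithmetic is just the source voltage times the allowed drop (a quick sketch; the voltage/percent pairs are the ones from this post):

```python
# End-of-run voltage for a given source voltage and percent drop.
def voltage_at_load(v_source, drop_pct):
    return v_source * (1 - drop_pct / 100)

for v, pct in [(115, 3), (128, 3), (128, 10)]:
    print(f"{v}v at {pct}% drop -> {voltage_at_load(v, pct):.1f}v at the load")
```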

When plugging numbers into a voltage drop calculator, should you use 120v or the actual metered voltage?

I know that many loads can run on less than 115-120v, just looking to understand the theory behind the 3% number.

Thanks in advance
 

jaggedben

Senior Member
Location
Northern California
Occupation
Solar and Energy Storage Installer
You should use the nominal voltage, because that is what the utility is obligated to supply; anything you meter at a given moment is only good for that moment and is likely to change.

Or, if your equipment has a real problem operating at a lower voltage but not a higher one, plan for the worst case, where the voltage is lower than nominal.
 

123ozzie

Member
Location
Chicago, Illinois
Thanks, I completely understand using the nominal voltage as a constant. Engineers can't go around with a meter to every job they design. I also understand an inspector isn't going to check the voltage of a certain load, but for argument's sake, if I know that the source voltage will always be 125v, can I accept a larger voltage drop?

120v - 3% = 116.4v
125v - 7% = 116.3v

I'm simply arguing the theory here.

Thanks
 

tortuga

Code Historian
Location
Oregon
Occupation
Electrical Design
For the calculation to comply with the State of Illinois code (2021 IECC Section C405.10, Voltage Drop), you use the voltages in NEC 220.5 and an Article 220 load calc as the load.
If you're just doing your own calcs, see ANSI C84.1 for voltage ranges.
 
Location
NE (9.06 miles @5.9 Degrees from Winged Horses)
Occupation
EC - retired
Thanks, I completely understand using the nominal voltage as a constant. Engineers can't go around with a meter to every job they design. I also understand an inspector isn't going to check the voltage of a certain load, but for argument's sake, if I know that the source voltage will always be 125v, can I accept a larger voltage drop?

120v - 3% = 116.4v
125v - 7% = 116.3v

I'm simply arguing the theory here.

Thanks
Are you willing to accept the lower voltage if circumstances change? Roughly another 5 volts.
 

winnie

Senior Member
Location
Springfield, MA, USA
Occupation
Electric motor research
It really depends on _why_ you are limiting voltage drop.

If the goal is meeting energy code or trying to prevent flicker when loads change, then the higher starting voltage doesn't let you use a higher % voltage drop.

On the other hand, if you have a minimum voltage that a load requires and your only goal is meeting that minimum, a higher starting voltage does give you more drop to work with.
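In that second case the available drop falls straight out of the two voltages (a sketch of the arithmetic, using the 125v example from earlier in the thread):

```python
# Largest percent drop that still delivers v_min at the load,
# given a known source voltage.
def max_drop_pct(v_source, v_min):
    return (v_source - v_min) / v_source * 100

# A 125v source feeding a load that needs the 116.4v you'd get
# from 120v nominal minus 3%:
print(round(max_drop_pct(125, 116.4), 1))  # -> 6.9
```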
 

don_resqcapt19

Moderator
Staff member
Location
Illinois
Occupation
retired electrician
Thanks, I completely understand using the nominal voltage as a constant. Engineers can't go around with a meter to every job they design. I also understand an inspector isn't going to check the voltage of a certain load, but for arguments sake, if I know that the source voltage will always be 125v, can I except a larger voltage drop?

120v - 3% = 116.4v
125v - 7% = 116.3v

I'm simply arguing the theory here.

Thanks
Remember that the nominal voltage and the equipment nameplate voltage are not normally the same. Where the system nominal voltage is 120, the nameplate will likely be 115; 208 nominal pairs with a 200 nameplate, 240 with 230, and 480 with 460.
 

jaggedben

Senior Member
Location
Northern California
Occupation
Solar and Energy Storage Installer
Thanks, I completely understand using the nominal voltage as a constant. Engineers can't go around with a meter to every job they design. I also understand an inspector isn't going to check the voltage of a certain load, but for argument's sake, if I know that the source voltage will always be 125v, can I accept a larger voltage drop?

120v - 3% = 116.4v
125v - 7% = 116.3v

I'm simply arguing the theory here.

Thanks

What Jon said. It really depends what your goal is.

It also depends on the type of load. For example, most electronic loads nowadays use switched-mode power supplies that regulate their internal voltage. That means the same load draws *fewer* amps at a higher supply voltage, so the voltage drop is smaller on both counts, volts and amps.

Whereas old-fashioned incandescent lighting draws *more* amps when supplied with a higher voltage, which partially cancels out the gain in voltage. With these loads you can accept more voltage drop if your goal is a certain level of brightness (which is determined by the voltage at the end of the circuit). However, if your goal is energy efficiency, and a 3% drop in brightness is of no consequence, then a higher supply voltage is a *bad* thing because a) the light bulbs consume more power and b) the voltage drop comes out about the same percentage, but since that's a percentage of a higher voltage, more energy is lost to voltage drop.

Those two examples are hardly an exhaustive list of possible load behaviors.
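A toy model of those two behaviors (all the numbers here are made up purely for illustration, not from any code table):

```python
R_WIRE = 0.5  # ohms, round-trip wire resistance (assumed)

def resistive_amps(v_supply, r_load):
    # Incandescent-style load: current rises with supply voltage.
    return v_supply / (r_load + R_WIRE)

def constant_power_amps(v_supply, watts):
    # Switch-mode load: roughly constant power, so current *falls*
    # as supply voltage rises (wire drop ignored for simplicity).
    return watts / v_supply

for v in (115, 125):
    print(v, f"resistive: {resistive_amps(v, 12):.2f}A",
          f"const-power: {constant_power_amps(v, 1200):.2f}A")
```

At 125v the resistive load draws more current than at 115v, while the constant-power load draws less, which is the whole difference being described above.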
 

123ozzie

Member
Location
Chicago, Illinois
Thanks everyone. The reason I'm asking is because my project manager quoted a project in an existing building, using an existing empty pipe. He ran VD calculations based on a 20 amp load, a 120v source, the actual pipe length, and 3% VD. The existing pipe is too small. He's pissed. We started talking about "what if the feed panel meters at 123v, is more than 3% acceptable? What if the load is LED lights that operate on 100-277v? What if we moved the taps on the transformer feeding the source panel?" etc. I was trying to come up with a standard to go by in the future.
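For what it's worth, this is roughly the sizing calc his calculator is doing, using the common single-phase approximation CM = 2·K·I·L / VD (K ≈ 12.9 ohm-cmil/ft for copper; the 150 ft length is an assumed example, not from this job):

```python
K = 12.9          # ohm-cmil/ft, copper (commonly used value)
amps = 20         # load from the quote
v_nominal = 120
length_ft = 150   # one-way run length -- assumed for illustration

vd_allowed = 0.03 * v_nominal               # 3.6 volts at 3%
cm_needed = 2 * K * amps * length_ft / vd_allowed
print(round(cm_needed))                     # -> 21500

# Smallest standard copper size at or above that circular-mil area:
awg_cmil = {"10 AWG": 10380, "8 AWG": 16510, "6 AWG": 26240, "4 AWG": 41740}
size = min((s for s, cm in awg_cmil.items() if cm >= cm_needed),
           key=awg_cmil.get)
print(size)                                 # -> 6 AWG
```

At that assumed length the 3% limit pushes a 20A circuit from the #12 you'd expect up to #6, which is exactly how an existing pipe ends up too small.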

Thanks again
 

123ozzie

Member
Location
Chicago, Illinois
Remember that the nominal voltage and the equipment nameplate voltage are not normally the same. Where the system nominal voltage is 120, the nameplate will likely be 115; 208 nominal pairs with a 200 nameplate, 240 with 230, and 480 with 460.
Good point. Does that mean if the nameplate is 115, and 115 - 3% = 111.6 while 120 - 7% = 111.6, then 7% VD is acceptable? Hypothetically...
 

ggunn

PE (Electrical), NABCEP certified
Location
Austin, TX, USA
Occupation
Consulting Electrical Engineer - Photovoltaic Systems
...if your goal is energy efficiency, and a 3% drop in brightness is of no consequence, then a higher supply voltage is a *bad* thing because a) the light bulbs consume more power and b) the voltage drop comes out about the same percentage...
...and c) the light bulbs won't last as long.
 

AC\DC

Senior Member
Location
Florence, Oregon, Lane
Occupation
EC
Thanks everyone. The reason I'm asking is because my project manager quoted a project in an existing building, using an existing empty pipe. He ran VD calculations based on a 20 amp load, a 120v source, the actual pipe length, and 3% VD. The existing pipe is too small. He's pissed. We started talking about "what if the feed panel meters at 123v, is more than 3% acceptable? What if the load is LED lights that operate on 100-277v? What if we moved the taps on the transformer feeding the source panel?" etc. I was trying to come up with a standard to go by in the future.

Thanks again
If he based the wire on the 75°C column, he can use the 90°C column if he uses 90°C-rated connectors on both ends. Done that a couple times when I may have put in too small of FMC.

If your equipment's range is 100-277v, you're only obligated to bring it within that range, so you may be fine.
 