Voltage Drop: 115V equipment on 120V power

Status
Not open for further replies.

Jon456

Senior Member
Location
Colorado
Service is 208Y/120V. I am doing voltage drop calcs and it seems to me that if the equipment specs are listed for 115V, then I have an extra 5V of voltage drop to work with.

For example, I have a fan with the following manufacturer specs:
  • 115V
  • 1-Phase
  • 11.6 FLA
  • 1 HP

From Table 430.248, the FLC at 115V is 16A.

The one-way circuit distance from the distribution panel to the fan is 100 ft.

So I plug the following into my voltage drop calculator:
  • 100 ft.
  • 120V
  • 16A Current
  • 1 Set of 12AWG conductors

I get a voltage drop of 5.56V (4.63%). But since my service is 120V, I should have 120 - 5.56 = 114.44V actual voltage at the fan. Since the fan is rated at 115V, then my corrected voltage drop is 115 - 114.44 = 0.56V (0.5%).
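The calculation above can be sketched as a simple two-wire resistive drop. Note that the per-1000-ft resistance for 12 AWG copper varies by source and assumed temperature (NEC Chapter 9 Table 8 lists about 1.93 Ω/kft for solid uncoated copper at 75°C), which is why the 5.56 V figure from an online calculator differs slightly from the number this sketch produces:

```python
# Single-phase, two-wire voltage drop (DC-resistance approximation).
# The 1.93 ohm/kft value is the NEC Chapter 9 Table 8 figure for
# 12 AWG solid uncoated copper; online calculators may assume a
# different temperature or resistance, giving slightly different results.

def voltage_drop(one_way_ft, amps, ohms_per_kft):
    """Round-trip (out and back) resistive voltage drop in volts."""
    return 2 * (one_way_ft / 1000) * ohms_per_kft * amps

vd = voltage_drop(100, 16, 1.93)  # the circuit described above
print(f"{vd:.2f} V ({vd / 120:.1%} of 120 V)")
```

With these assumptions the drop comes out near 6.2 V rather than 5.56 V, but the logic of the comparison is unchanged.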

Are my logic and calculations correct?
 

charlie b

Moderator
Staff member
Location
Lockport, IL
Occupation
Retired Electrical Engineer
I didn't check your math. But your logic works for me. That is the reason the mechanical ratings are lower than our ratings (e.g., 460V motors on our 480V system).
 

MAC702

Senior Member
Location
Clark County, NV
I doubt they intended their specifications to be that precise. The worldwide standard is to label equipment for 115 and 230 V. Manufacturers know the former is used on a predominantly 120 V American service, but it keeps the label at exactly half of the 230 V standard they use to label equipment for the entire world, knowing it is used in countries that deliver 220, 230, and 240 V.

Yes, your equipment will work. But in reality, you put way too much thought into it.

You shouldn't plan VD for what will work with your equipment, but for what is an acceptable loss from your service voltage. The rest just works itself out, and will always be good no matter what equipment is used in the future.
 

Jon456

Senior Member
Location
Colorado
You shouldn't plan VD for what will work with your equipment, but for what is an acceptable loss from your service voltage. The rest just works itself out, and will always be good no matter what equipment is used in the future.
Except the VD is dependent not only on distance from the supply, but also on the amount of current being drawn. So you really do have to design for the equipment that is being installed now, unless you're going to size every conductor to the branch circuit's maximum ampacity.
 

MAC702

Senior Member
Location
Clark County, NV
Except the VD is dependent not only on distance from the supply, but also on the amount of current being drawn. So you really do have to design for the equipment that is being installed now, unless you're going to size every conductor to the branch circuit's maximum ampacity.

Yes, but I'm not talking about how to calculate it. I'm saying that whatever your load is, you should strive for a VD goal in relation to your service voltage, not in relation to the voltage tolerance of the currently installed equipment.
 

gar

Senior Member
Location
Ann Arbor, Michigan
Occupation
EE
190704-1040 EDT

Jon456:

I suggest your logic is wrong.

What is the definition of voltage drop? For the purposes here I will suggest that it is the change in voltage at the load end from no load to loaded.

Using this definition, if I have an ideal voltage source of 100 V with an internal impedance of 1 ohm resistance, and I place a 9 ohm resistance at the load end, then voltage drop is 100*(1-9/10) = 10 V. Change the source voltage to 120 V and we have 120*(1-9/10) = 12 V.
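gar's definition amounts to a voltage divider: the drop is the no-load voltage minus the loaded voltage at the load end. A minimal sketch of his two cases:

```python
# Voltage divider illustrating the definition above: drop = no-load
# voltage minus loaded voltage. Source internal resistance 1 ohm,
# load resistance 9 ohm, as in gar's example.

def load_voltage(v_source, r_internal, r_load):
    """Voltage across the load in a simple series circuit."""
    return v_source * r_load / (r_internal + r_load)

for v in (100, 120):
    drop = v - load_voltage(v, 1, 9)
    print(v, drop)  # 100 -> 10.0, 120 -> 12.0
```

The same 10:1 impedance ratio produces a larger absolute drop at the higher source voltage, which is his point: the drop scales with the source, not with the load's nameplate.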

Put inductance in the circuit and it is not so simple. Now you need phasor (vector) analysis. However, in many cases you will ignore inductance.

What is of most importance is that the actual voltage at the load is not below a required minimum, and not above some maximum. Thus, voltage drop alone is not the most important factor. You must also be concerned with where your actual supply voltage sits, and what its internal impedance is.

See some plots of my main panel voltage at
http://beta-a2.com/energy.html
The source data has 1 second resolution.

Today in many areas you may find the normal panel voltage up near 125 V for a nominal 120 V system.

With no load on your panel, your panel voltage approximates the power company's primary supply voltage. On a hot day with many neighboring air conditioners running, that no-load panel voltage may be below 120 V. In other words, it is an indication of what your pole's primary supply voltage is.

If your supply voltage usually runs around 110 V instead of 120 V, then you cannot tolerate as much voltage drop to the load as when the supply is 120 V.

.
 

paulengr

Senior Member
You are confusing distribution and utilization voltages. IEEE, ANSI, and IEC for that matter give, for instance, 120 V as a distribution voltage, so that's what we adjust the transformer to. The equipment is rated about 5% less, so 115 V, or say 480/460. At medium voltage it tightens up to 3%. Either way, voltage tolerance is +10/-15% per UL, NEMA, and IEEE standards, so there is wide tolerance built in.

The dual voltage standard is confusing to many people, though, so I've seen motor labels at 440 (actually another standard voltage used outside North America), 460, and 480, when on North American motors they should all be 460. Control, lighting, and residential should all be 115, but some is labeled 110.

The worst one is the 240 V single-phase Edison/GE-style split-phase voltage. Single-phase is usually marked 220, but 3-phase motors are usually marked 230, and both are so close to 208 that we can often stretch things to work when technically they shouldn't, especially since 208/120 has the wrong phase angle relationship, so it's not technically within the -15% envelope of 220/115 single-phase. Plus, it should be obvious that if the distribution is 240/120 and we call it 220 and not 230 (which a lot of labels do), half of 220 is 110, not 115. Also, the full ANSI standard references some numbers such as 115 AND 110 as part of the standard, but one is recommended and both are accepted.


 

LarryFine

Master Electrician Electric Contractor Richmond VA
Location
Henrico County, VA
Occupation
Electrical Contractor
Over the years, I've seen 110, 112, 115, 117, 118, and 120v ratings on various devices and appliances.
 

Jon456

Senior Member
Location
Colorado
If your supply voltage usually runs around 110 V instead of 120 V, then you cannot tolerate as much voltage drop to the load as when the supply is 120 V.
Doesn't this statement confirm my logic?

Let's look at an exaggerated hypothetical: We have 120V (actual) supply and a 1,000 foot branch circuit of 12 AWG wire. The load is 10A. Voltage drop would be 34.7V. If our load required 120V, then we would need to install 1 AWG wire. But what if our load only required 60V? Wouldn't it be absurd to install 1 AWG wire when 14 AWG would suffice?
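The hypothetical above can be checked quickly with published resistance values. This sketch uses NEC Chapter 9 Table 8 DC-resistance figures (Ω per 1000 ft, uncoated copper); the 34.7 V number from the poster's calculator implies a slightly different resistance assumption, so the results here differ a little but tell the same story:

```python
# Rough check of the 1000-ft hypothetical using NEC Chapter 9 Table 8
# DC-resistance values (ohm/kft, uncoated copper). Exact figures vary
# with temperature and stranding, so treat these as illustrative.

OHMS_PER_KFT = {"14 AWG": 3.07, "12 AWG": 1.93, "1 AWG": 0.154}

def drop(one_way_ft, amps, awg):
    """Round-trip resistive voltage drop for the given wire size."""
    return 2 * (one_way_ft / 1000) * OHMS_PER_KFT[awg] * amps

for awg in OHMS_PER_KFT:
    print(awg, f"{drop(1000, 10, awg):.1f} V")
```

At 1000 ft and 10 A, 12 AWG drops roughly 39 V while 1 AWG drops about 3 V, which is the contrast the hypothetical is driving at.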

The only point I'm still unsure of is, are we to take the manufacturer's 115V specs literally? One would think there's a valid reason some equipment is spec'd at 120V and others at 115V besides sloppy interpretation of US standard voltages.
 

gar

Senior Member
Location
Ann Arbor, Michigan
Occupation
EE
190704-1938 EDT

Jon456:

This was your statement
I get a voltage drop of 5.56V (4.63%). But since my service is 120V, I should have 120 - 5.56 = 114.44V actual voltage at the fan. Since the fan is rated at 115V, then my corrected voltage drop is 115 - 114.44 = 0.56V (0.5%).
The voltage drop definition I defined in my post was based on the change in no load to loaded voltage at the load.

So if your no load voltage is 120,
(you only used 120 V as a nominal voltage, not the measured open-circuit voltage, but I am willing to assume it is close to the actual open-circuit value)
then this is what you reference your voltage drop from.

The term "corrected voltage drop" makes no sense, and is in no way useful.

.
 
Location
NE (9.06 miles @5.9 Degrees from Winged Horses)
Occupation
EC - retired
Doesn't this statement confirm my logic?

Let's look at an exaggerated hypothetical: We have 120V (actual) supply and a 1,000 foot branch circuit of 12 AWG wire. The load is 10A. Voltage drop would be 34.7V. If our load required 120V, then we would need to install 1 AWG wire. But what if our load only required 60V? Wouldn't it be absurd to install 1 AWG wire when 14 AWG would suffice?

The only point I'm still unsure of is, are we to take the manufacturer's 115V specs literally? One would think there's a valid reason some equipment is spec'd at 120V and others at 115V besides sloppy interpretation of US standard voltages.
I don’t take it literally.

We connected some fancy milling equipment at a machine shop that required xxx voltage; that spec we took seriously.
 

Jon456

Senior Member
Location
Colorado
The measured service voltage is 210V phase-to-phase and 121.5V between each phase conductor and the grounded conductor (+/-0.5V on all measurements).
 

ActionDave

Chief Moderator
Staff member
Location
Durango, CO, 10 h 20 min from the winged horses.
Occupation
Licensed Electrician
The only point I'm still unsure of is, are we to take the manufacturer's 115V specs literally? One would think there's a valid reason some equipment is spec'd at 120V and others at 115V besides sloppy interpretation of US standard voltages.

99.99999% of all of the world and 98.999% of all electricians go through every day of their life not worrying about it.
 
Jon, one comment. I would use (or estimate) actual current for my VD calcs. This would mean not using the NEC table value which is artificially high. In fact, I would often use less than nameplate for a motor if it's an application where it will likely never be fully loaded.
 

gar

Senior Member
Location
Ann Arbor, Michigan
Occupation
EE
190704-1958 EDT

Jon456:

Let's look at an exaggerated hypothetical: We have 120V (actual) supply and a 1,000 foot branch circuit of 12 AWG wire. The load is 10A. Voltage drop would be 34.7V. If our load required 120V, then we would need to install 1 AWG wire.
#1 won't solve your problem.

But what if our load only required 60V? Wouldn't it be absurd to install 1 AWG wire when 14 AWG would suffice?
Would you get 60 V at the load if at 60 V the load required 20 A? #14 copper is about 5 ohms for 2000 feet.
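gar's counter-question can be made concrete. A load that needs 60 V at 20 A looks, very roughly, like a 3 Ω resistance (an illustrative assumption, not from the thread); feed it through ~5 Ω of #14 loop resistance from a 120 V source and it never sees 60 V:

```python
# Quick check of the 60 V / 20 A question above. Modeling the load
# as a fixed resistance (60 V / 20 A = 3 ohm) is an assumption for
# illustration; real loads (motors especially) behave differently.

r_wire = 2.525 * 2000 / 1000  # ~5.05 ohm: #14 copper, 2000 ft loop
r_load = 60 / 20              # 3 ohm equivalent load
v_load = 120 * r_load / (r_wire + r_load)
i = 120 / (r_wire + r_load)
print(f"load sees {v_load:.1f} V at {i:.1f} A")  # well short of 60 V
```

Most of the source voltage is lost in the wire, which is why "the load only needs 60 V" does not by itself justify the smaller conductor.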

The only point I'm still unsure of is, are we to take the manufacturer's 115V specs literally? One would think there's a valid reason some equipment is spec'd at 120V and others at 115V besides sloppy interpretation of US standard voltages.
You need to know much more about a load than that it has some nominal voltage rating. Generally I would design equipment to work over the range of 95 to 135 V. Most industrial plants tend to run over nominal voltage in a control cabinet. The overhead main bus bars are usually 480 V ungrounded delta, and not likely to be under voltage. Then the stepdown transformers in the cabinet are usually oversized. Thus, a high nominal 120 V supply, likely 130 V inside the cabinet. Industrial plants need to keep running even if the power company has problems keeping their primary voltage up. Costs for downtime at a large plant may run $500,000 per hour.

.
 

Jon456

Senior Member
Location
Colorado
99.99999% of all of the world and 98.999% of all electricians go through every day of their life not worrying about it.
That's sort of what I've been thinking. In reality, would anyone object to running 110V or 115V or 120V rated equipment on 115V measured voltage?

Mind you, I'm talking about fans and appliances in a commercial setting. Not million dollar lab equipment or multi-axis CNC machines.
 

Jon456

Senior Member
Location
Colorado
#1 won't solve your problem.
As explained, it was an exaggerated hypothetical to illustrate voltage drop in two different cases. Not a real-world example. That said, why wouldn't 1 AWG wire work in the example given?

Would you get 60 V at the load if at 60 V the load required 20 A?
I gave parameters for two different loads; it wasn't intended to be the same load under different conditions. One load was 120V at 10A and the other load was 60V at 10A. Take those as nameplate ratings.
 

Jon456

Senior Member
Location
Colorado
Jon, one comment. I would use (or estimate) actual current for my VD calcs. This would mean not using the NEC table value which is artificially high. In fact, I would often use less than nameplate for a motor if it's an application where it will likely never be fully loaded.
This is helpful advice. But don't you need to account for start-up demand? If the VD is too high during start-up, the motor labors getting up to speed which can be damaging.
 
This is helpful advice. But don't you need to account for start-up demand? If the VD is too high during start-up, the motor labors getting up to speed which can be damaging.

Certainly may be a concern in certain situations, but consider this: I have a rather convoluted electrical system feeding my house, with a step-up then step-down transformer and 1,900 feet of wire between them, plus 60 feet of wire on each side of the transformers. All those drops add up. I get about a 10 volt rise when my PV system puts out 60 amps. I have an air compressor that I used to have a real hard time starting on a 6000-watt generator; I couldn't use any sort of extension cord, and you had to rev up the throttle on the genny a bit. It starts instantly on this system. Just a single anecdote, of course, but usually this is not a problem even with rather high VD.
 

gar

Senior Member
Location
Ann Arbor, Michigan
Occupation
EE
190704-2159 EDT

Jon456:

If your source voltage is 120 V and your load requires 120 V, as you stated, then you cannot tolerate any voltage drop. Any load current will produce some voltage drop, thus there is no wire size that will solve your problem. Create a realistic example.

You did not say 60 V at 10 A. It may have been your assumption, but was not stated.

.
 