Motor Rating: 460 v 480

Status
Not open for further replies.

mityeltu

Senior Member
Location
Tennessee
I want to understand first why motors are rated at 460 if they are seldom on 460v systems.

I then want to understand what impact this has on the actual function of the motor. As I understand it, my 460v RATED motor will pull fewer starting amps at 480v, right? With lower starting current, do I then have lower torque? If so, for a particular given load, would I then need to upsize the motor to start the load?
 

GoldDigger

Moderator
Staff member
Location
Placerville, CA, USA
Occupation
Retired PV System Designer
An increase in voltage will lead to an increase in starting amps and torque, not a decrease.

But once the motor is running at full speed, the amps drawn for a particular power output will go down as the line voltage goes up. Thus leading some to get confused about starting as compared to running conditions.
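For the running condition described above, the point that amps fall as voltage rises (for a fixed power output) follows from the three-phase power formula. A minimal Python sketch; the power factor and efficiency values are assumed for illustration, not taken from any nameplate:

```python
import math

def line_current(p_out_w, v_line, pf=0.85, eff=0.92):
    """Three-phase line current for a given shaft power output.
    pf and eff are illustrative assumptions, not nameplate data."""
    return p_out_w / (math.sqrt(3) * v_line * pf * eff)

i_460 = line_current(37_300, 460)  # ~50 hp load at 460 V
i_480 = line_current(37_300, 480)  # same load at 480 V draws fewer amps
```

Since the power, power factor, and efficiency are held constant, the volt-amperes are the same in both cases and the current simply scales inversely with voltage.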
 

GoldDigger

Moderator
Staff member
Location
Placerville, CA, USA
Occupation
Retired PV System Designer
I want to understand first why motors are rated at 460 if they are seldom on 460v systems.

To some extent, it is just a naming convention and an attempt to "harmonize" motors for both US and non-US system voltages. Your 460 volt motor will have a specified range of allowed input voltages which includes the high side voltage of a 480 volt system tolerance and goes down well below 460 on the low side.
 

mityeltu

Senior Member
Location
Tennessee
If I convert my motor hp to kva and think of it as a black box, then power in must = power out. If voltage increases, current must decrease.

Is this not the case?
 

GoldDigger

Moderator
Staff member
Location
Placerville, CA, USA
Occupation
Retired PV System Designer
If I convert my motor hp to kva and think of it as a black box, then power in must = power out. If voltage increases, current must decrease.

Is this not the case?

Yes, but not in the case of starting, where the power out will NOT be a constant. As was said, the torque will be higher and the acceleration greater with the higher starting voltage. Thus the constant power assumption does not apply.

If you actually change the winding taps on the motor for a different voltage, then the starting current will behave the way you expect. It is just when the motor is unchanged and the voltage varies that the acceleration and maximum output power change.
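The starting-condition behavior above can be put in rough numbers. These are rules of thumb (locked-rotor current scaling about linearly with voltage, starting torque about with the square), and the exact figures depend on the motor design:

```python
v_rated, v_applied = 460.0, 480.0

# Rules of thumb for an induction motor started across the line:
# locked-rotor (starting) current scales roughly linearly with voltage,
# while starting torque scales roughly with the square of voltage.
current_ratio = v_applied / v_rated        # ~1.04: about 4% more starting amps
torque_ratio = (v_applied / v_rated) ** 2  # ~1.09: about 9% more starting torque
```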
 

mityeltu

Senior Member
Location
Tennessee
OK, I can see this. At t=0+ the impedance of the motor is wire impedance (high current). As the magnetic field gets established (t=0++) the impedance increases thereby decreasing the current.

Ok. So, I thought I read in NEMA MG-1 (I don't have it in front of me) that as the voltage decreased (when running), the current also decreased.
 

david luchini

Moderator
Staff member
Location
Connecticut
Occupation
Engineer
If I convert my motor hp to kva and think of it as a black box, then power in must = power out. If voltage increases, current must decrease.

Is this not the case?

No, it is not the case. Motors generally have a tolerance band for higher voltages, but above that, higher voltages will lead to an INCREASE in load current.
 
I want to understand first why motors are rated at 460 if they are seldom on 460v systems.

I then want to understand what impact this has on the actual function of the motor. As I understand it, my 460v RATED motor will pull fewer starting amps at 480v, right? With lower starting current then do I have lower torque? If so, for a particular given load, would I then need to upsize the motor to start the load?

The voltage delivered at the motor terminal is averaged at 460V. The voltage 'delivered' at the distribution point is averaged at 480V. It is impossible to deliver the same voltage to the use point as at the distribution point since there is always some distance involved that results in voltage drop.

It is also impractical - and prohibitively expensive - to require the utility to deliver 480V to your distribution point, no matter what. So there is an operating window of acceptable +/- voltage range that is allowed for the utilities. Following that, electrical equipment, motors included, is designed to operate within a voltage range, usually +10/-15%, and deliver rated performance.
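As a quick sketch of the +10/-15% window mentioned above (the exact limits depend on the applicable standard and motor design, so treat these figures as illustrative), a 460 V rating comfortably covers the high side of a 480 V system:

```python
rated = 460.0

# Assumed +10% / -15% utilization window around the 460 V rating:
v_max = rated * 1.10  # 506 V, above a 480 V system's +5% high end (504 V)
v_min = rated * 0.85  # 391 V, well below 460 V on the low side
```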
 

GoldDigger

Moderator
Staff member
Location
Placerville, CA, USA
Occupation
Retired PV System Designer
No, it is not the case. Motors generally have a tolerance band for higher voltages, but above that, higher voltages will lead to an INCREASE in load current.
The tolerance band gives you the range over which the motor will deliver at least its rated power and not draw more than its rated current figures, but the actual way in which the voltage and current values interact will be just the same outside that band as inside it. But the motor may not be happy.
 

david luchini

Moderator
Staff member
Location
Connecticut
Occupation
Engineer
The tolerance band gives you the range over which the motor will deliver at least its rated power and not draw more than its rated current figures, but the actual way in which the voltage and current values interact will be just the same outside that band as inside it. But the motor may not be happy.

I don't understand what you are saying.

But once the motor is running at full speed, the amps drawn for a particular power output will go down as the line voltage goes up. Thus leading some to get confused about starting as compared to running conditions.

This is not correct. Both undervoltage and overvoltage will lead to an increase in load current.
 

GoldDigger

Moderator
Staff member
Location
Placerville, CA, USA
Occupation
Retired PV System Designer
This is not correct. Both undervoltage and overvoltage will lead to an increase in load current.

The key phrase in my comment was "for a given power output", and assumes that the motor is already running at rated speed.
If the motor is loaded to 1/2 of its rated power, and the load is something like a dynamometer which will keep that constant power load independent of motor speed, then the current will decrease with increasing voltage and increase with decreasing voltage, as long as you are looking only at the resistive part of the current. If the load on the motor is something which is proportional to speed (or speed cubed like a fan or pump load) and the motor starts out with a fairly high slip, then increasing the voltage will increase the speed, which will increase the power by the third power of the speed, which may cause the current to increase.
Another complication to this is that the purely inductive component of the motor current (the zero PF part, as it were) will decrease with decreasing voltage and increase with increasing voltage, assuming a constant motor speed and slip.
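The two load cases above can be sketched numerically. The load power and the 1% speed gain are assumed figures for illustration:

```python
# Constant-power (dynamometer-style) load: the resistive (in-phase)
# component of current falls as voltage rises, since I = P / V.
p_load = 10_000.0                 # W, held constant by the dynamometer
i_440 = p_load / 440.0
i_480 = p_load / 480.0            # lower current at the higher voltage

# Fan/pump load: suppose the higher voltage trims slip so speed rises 1%.
# Affinity laws put power at the cube of speed, so power rises about 3%,
# which can offset or exceed the current reduction from the voltage rise.
speed_gain = 1.01
p_fan = p_load * speed_gain ** 3  # about 3% more power demanded
```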
 

scottmarston

Member
Location
Meridian, ID
The voltage delivered at the motor terminal is averaged at 460V. The voltage 'delivered' at the distribution point is averaged at 480V. It is impossible to deliver the same voltage to the use point as at the distribution point since there is always some distance involved that results in voltage drop.

It is also impractical - and prohibitively expensive - to require the utility to deliver 480V to your distribution point, no matter what. So there is an operating window of acceptable +/- voltage range that is allowed for the utilities. Following that, electrical equipment, motors included, is designed to operate within a voltage range, usually +10/-15%, and deliver rated performance.

+1 on this. You'll never get 480VAC from your distribution out to the motor unless the distance is very very short. So I believe they derate the motor windings to 460VAC as an average to be closer to nominal value for typical installation applications.
 

Besoeker

Senior Member
Location
UK
then increasing the voltage will increase the speed, which will increase the power by the third power of the speed, which may cause the current to increase.
Not by a lot. In normal operation most cage induction motors that I deal with run at about 1% slip. I just happen to have a data sheet for an ABB motor on my desk. It's a 55kW motor and rated speed is 1485 rpm. 1% at 50Hz. Real world data.
Increasing voltage within normal supply limits would have very little impact on speed. It quite obviously can't be more than a 1% increase - the motor can't run above synchronous speed, and at sync speed it produces no torque. So we are looking at a very small speed change. Whether it increases or decreases the current depends on loading. At light loads more volts is generally more amps. At heavy load, it's generally the reverse.
At what point depends on motor design.
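The 1% slip figure from the ABB data sheet quoted above works out as follows (a quick check, assuming a 4-pole machine on a 50 Hz supply):

```python
# Synchronous speed for a 4-pole machine on 50 Hz, and the slip
# implied by the 1485 rpm rated speed on the ABB data sheet.
freq_hz, poles = 50.0, 4
sync_rpm = 120.0 * freq_hz / poles        # 1500 rpm
rated_rpm = 1485.0
slip = (sync_rpm - rated_rpm) / sync_rpm  # 0.01, i.e. 1%
```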
 

robbietan

Senior Member
Location
Antipolo City
The key phrase in my comment was "for a given power output", and assumes that the motor is already running at rated speed.

Another complication to this is that the purely inductive component of the motor current (the zero PF part, as it were) will decrease with decreasing voltage and increase with increasing voltage, assuming a constant motor speed and slip.

At rated speed with a constant-kW load, running the motor at a higher voltage than its nameplate will mean less current. It also means the motor runs cooler due to reduced copper losses. Running the motor at less than rated voltage means higher amps, and a hotter motor due to increased copper losses.
 

jim dungar

Moderator
Staff member
Location
Wisconsin
Occupation
PE (Retired) - Power Systems
I want to understand first why motors are rated at 460 if they are seldom on 460v systems.

The answer is: because we have always done it that way.

Since at least the 1930s, utilization equipment voltage ratings have been based on standard practices within their specific industry organizations. Motor manufacturers decided to rate their equipment for a voltage slightly less than nominal supplied voltages. The difference has been fairly constant: 10V for ratings below 300V and 20V for ratings above 300V. When our nominal systems were 230/460V, back in the late 1940s to early 1950s, motors were listed as 220/440V.
 

kwired

Electron manager
Location
NE Nebraska
As I understand it, my 460v RATED motor will pull fewer starting amps at 480v, right?

An increase in voltage will lead to an increase in starting amps and torque, not a decrease.

This is exactly why we use reduced voltage starting methods sometimes, soft starters, part winding start, wye-delta starting schemes all reduce starting voltage (or at least where it is applied to) as well as the starting current.
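For the wye-delta scheme mentioned above, the reduction can be quantified (a standard textbook result, sketched here as an illustration):

```python
import math

# Wye-delta starting: in the wye connection, each winding sees the
# line voltage divided by sqrt(3). Torque goes with the square of the
# winding voltage, and the line current also drops to 1/3 of the
# direct-on-line (delta) value.
winding_voltage_factor = 1 / math.sqrt(3)          # ~0.577
torque_factor = winding_voltage_factor ** 2        # 1/3 of DOL torque
line_current_factor = winding_voltage_factor ** 2  # 1/3 of DOL line current
```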
 

broadgage

Senior Member
Location
London, England
The use of 460 volt nameplate motors on 480 volt nominal supplies is a USA convention, and rather an odd one IMHO.
It is certainly true that the average voltage at the motor may be nearer 460 than 480, due to voltage drop in the wires, but in that case why not apply similar conventions to other loads?

It is not usual to specify 460 volt ballasts for HID lamps on nominal 480 volt circuits, likewise filament lamps intended for 120 volt circuits are not normally 110 volt, but are 120 volt.

If a 120 volt supply was needed from a nominal 480 volt service, it would be usual to specify a transformer with a 480 volt primary, yet the extract fan in the same room and perhaps on the same panel would be rated 460 volts.
 

kwired

Electron manager
Location
NE Nebraska
The use of 460 volt nameplate motors on 480 volt nominal supplies is a USA convention, and rather an odd one IMHO.
It is certainly true that the average voltage at the motor may be nearer 460 than 480, due to voltage drop in the wires, but in that case why not apply similar conventions to other loads?

It is not usual to specify 460 volt ballasts for HID lamps on nominal 480 volt circuits, likewise filament lamps intended for 120 volt circuits are not normally 110 volt, but are 120 volt.

If a 120 volt supply was needed from a nominal 480 volt service, it would be usual to specify a transformer with a 480 volt primary, yet the extract fan in the same room and perhaps on the same panel would be rated 460 volts.

IMO the nameplate data is only accurate when the applied voltage measures 460 volts and the connected load is such that it draws nameplate amps. Only then should speed, efficiency, power factor, or any other data marked on the nameplate match what is measured.

Same for the lamp. A 100 watt, 120 volt lamp should only draw 100 watts if 120 volts is applied. With the phasing out of incandescent lamps I'm not sure what is still available, but you used to be able to purchase 130 volt rated lamps. You would suffer a little on light output if you only applied 120 volts, but they were preferred because they had a significant increase in lamp life when less than 130 volts was applied.
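The trade-off with 130 volt lamps on a 120 volt supply is often summarized with empirical exponent rules of thumb. The exponents below (about 3.4 for light output, about 13 for lamp life) are commonly quoted approximations, not exact values:

```python
# Classic incandescent rules of thumb (approximate exponents assumed here):
# light output ~ (V / V_rated)^3.4, lamp life ~ (V_rated / V)^13.
v_rated, v_applied = 130.0, 120.0
light_ratio = (v_applied / v_rated) ** 3.4  # ~0.76: noticeably dimmer
life_ratio = (v_rated / v_applied) ** 13    # ~2.8x the rated life
```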
 