Single-Phase Motor Amps vs. Voltage


HVAC Guy

Data plate info on a particular motor shows 208/230 volts, 1.2 FLA. As I understand things, a ±10% variation in voltage is allowed, so the acceptable operating voltage could be anywhere from roughly 187 V to 253 V. I know that, generally speaking, at a particular load the current draw will increase as the applied voltage decreases, to maintain the power (P = VI). By the same reasoning, an increase in voltage would lower the amp draw...
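
A quick back-of-the-envelope sketch (mine, not from the thread) of the arithmetic above, assuming an idealized constant-power load where P = V × I holds exactly; real motor behavior deviates from this, as the replies below explain:

```python
# Idealized constant-power check of the nameplate numbers above.
# Assumption (mine): the load behaves as perfectly constant power,
# which real single-phase motors only approximate.

V_PLATE_LOW, V_PLATE_HIGH = 208, 230   # nameplate "slash" voltages
FLA = 1.2                              # nameplate full-load amps
TOL = 0.10                             # +/-10% utilization tolerance

v_min = V_PLATE_LOW * (1 - TOL)        # 187.2 V
v_max = V_PLATE_HIGH * (1 + TOL)       # 253.0 V
print(f"Acceptable voltage range: {v_min:.0f}-{v_max:.0f} V")

# Under the constant-power idealization, current scales inversely
# with voltage around the 230 V rating point:
p_ref = V_PLATE_HIGH * FLA             # 276 VA reference
for v in (208, 230, 251):
    print(f"{v} V -> {p_ref / v:.2f} A")
```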

I read somewhere else that "overvoltage" on a motor can/will produce a higher current due to "magnetic saturation" (yeah, sure :huh:), and it seemed to suggest the amount of overvoltage could be within the +10% margin...

That's the question: if the motor is actually operating on a (measured) 251 V single-phase supply, will it pull more or less than the "rated" FLA?

Thanks in advance...:)
 
There is no "one size fits all" answer.
If the motor is operating within the voltage tolerance range it should not saturate.
If the motor is running totally unloaded (yes, I know, not likely), higher voltage usually means higher no-load current.
At rated load it would depend on the motor characteristics.
 
Last edited by a moderator:
There is no "one size fits all" answer.
If the motor is operating within the voltage tolerance range it should saturate.
If the motor is running totally unloaded (yes, I know, not likely), higher voltage usually means higher no-load current.
At rated load it would depend on the motor characteristics.

Did you mean "should not saturate"?

 
When I asked similar questions years ago, what I was told, by more than one person directly involved in the motor design business, is that the "slash ratings" for 208/230V motors are based on a compromise. They base the design on a V/Hz ratio that equates to 220V 60Hz, then tweak the magnetic permeability factors of the core to allow for a higher voltage range without significant saturation. So in essence it is 220V +15%, -10%, giving it an acceptable range of 198V on the 208V side and 253V on the 230V side. They do this knowing that UTILITY voltage can only range ±5%, so a 208V distribution voltage is not supposed to dip below 198V, and a 240V distribution is not supposed to rise above 252V.
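
A small sketch (my arithmetic, not a standards citation) of the tolerance bands described above: a slash-rated motor designed around 220 V with +15%/-10% headroom, compared against utility service held to roughly ±5%:

```python
# Tolerance arithmetic for a 208/230 V "slash" motor designed at 220 V.
# The +15% / -10% figures are the post's claim, not from any standard.

DESIGN_V = 220.0
motor_lo = DESIGN_V * 0.90             # 198 V
motor_hi = DESIGN_V * 1.15             # 253 V
print(f"Motor design band: {motor_lo:.0f}-{motor_hi:.0f} V")

# Utility voltage is held to roughly +/-5% of the nominal system voltage:
for nominal in (208, 240):
    lo, hi = nominal * 0.95, nominal * 1.05
    print(f"{nominal} V system: {lo:.0f}-{hi:.0f} V at the service")
```

Both utility bands land inside (or right at the edge of) the motor's design band, which is the compromise the designers rely on.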

This range is, however, still somewhat compromised in that a TRUE 208V designed motor would be designed for 200V utilization, -10%, so allowing for a voltage drop over distance down to 180V at the motor terminals. You will NOT get that kind of performance from a "slash rated" motor, but they leave that up to you to deal with by supposedly de-rating the motor a bit if you have that extreme situation. They get away with it because they know that MOST installations that use 208V are going to involve commercial or light industrial motor applications, where the motors are typically NOT going to be fully loaded, so the difference is moot.

In your example though, your motor mfr is a bit remiss in not showing that the motor FLC will be 10% higher if connected to a 208V system. Most responsible motor mfrs will show a slash rating on the FLC if they show one on the voltage.
 
Very informative reply, Jraef...thanks. :)

Just to expand on the OP a little... from another forum: an existing OEM motor with the data plate ratings above was replaced due to a homeowner complaint of it intermittently "not running". The service guy measured 1.3 A, concluded the motor was over-amping, and decided that was causing the intermittent issue.

However, on replacing it with an identical OEM motor, the amperage was still 1.3. So the service guy is (no doubt) concerned the initial diagnosis was in error and is looking for explanations. I thought that operating at the high end of the acceptable voltage range might explain the slightly higher amp draw.

Thanks again...
 
Last edited:
Let's not let that one slip by!
Since the motor, whether compressor or blower, is likely operating in a constant-power mode, the current at full load power will be higher if the supply voltage is lower. Even the unloaded current can occasionally be higher at lower voltage, although lower no-load current is the more common result.
One common source of excessive current draw compared to the design value (assuming the equipment manufacturer did not intend to use the motor beyond its rating) is fitting the wrong size pulley on a blower motor. That can increase the load beyond the design value, and the current will go up accordingly.
One way this can happen is if the equipment is designed or initially supplied for 50 Hz, with the pulleys sized accordingly, and the motor is then run on 60 Hz instead, increasing the blower RPM.
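
As a rough illustration of how much an over-sped blower loads the motor, here's a sketch assuming simple fan affinity laws (flow proportional to speed, power proportional to speed cubed); these are idealized fan-law numbers, not measurements from any actual job:

```python
# Fan-law estimate of running a 50 Hz-pulleyed blower on 60 Hz.
# Assumes ideal affinity laws; real systems deviate somewhat.

base_hz, actual_hz = 50, 60
ratio = actual_hz / base_hz            # induction motor speed tracks frequency

print(f"Speed ratio: x{ratio:.2f}")            # x1.20
print(f"Airflow:     x{ratio:.2f}")            # flow ~ speed
print(f"Shaft power: x{ratio ** 3:.2f}")       # power ~ speed^3 -> x1.73
```

A 20% overspeed asking for roughly 73% more shaft power is why the current climbs so quickly in this failure mode.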

As for the symptom of "not running", it is not particularly likely that the branch circuit breaker is cutting out based on the difference between 1.2A and 1.3A. But the motor overloads may be cutting out under the same conditions if set tightly. That also points to the motor being mechanically overloaded for some reason.
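
For scale, here's the measured-vs-nameplate ratio (my arithmetic, using the figures quoted earlier in the thread):

```python
# How far over nameplate FLA is the measured current?
fla, measured = 1.2, 1.3
print(f"Measured = {measured / fla:.1%} of FLA")   # ~108.3%
```

An excess of roughly 8% is well below where a branch circuit breaker would be expected to act, which supports the point that a breaker trip on this difference alone is unlikely; a tightly set overload is the more plausible culprit.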

PS: Oh! When you say FLA, that is taken from the rating plate on the motor?
Is there a Service Factor (SF) number there too?
 
Last edited:

Thanks for your input... The motor application is a direct-drive condenser fan, 60 Hz, and the voltages/FLA were taken from the motor data plate. No mention of an SF, though most I see in the residential arena don't have any design "margins", leastways not noted on the plate.

My question was tangential to what is, or was, actually going on with the service call. I think the longer term result of the "fix" will determine whether or not the original motor was the problem, or something external. :)
 