Can I run a 230 volt motor at 250 volts?


scott657

Member
I have a drill press at work that is labeled 230 volts. I have a single phase 480 to 240, 2 kVA transformer hooked up. I'm getting about 260 volts on the secondaries with the motor not running, and about 250 volts with it running but unloaded. Am I looking at trouble?
 
Scott,
That is an over-voltage.
The question is what is the effect?
Do you like to experiment?
Is usage very short-term?
Is this in an air-conditioned room?
Will running under a physical load bring down the voltage?
 
Does the transformer secondary have taps?

If the transformer has taps, adjust them to correct the over-voltage. Caution: verify the input voltage and make sure it's correct, since you're slightly adjusting the ratio of the transformer. NOTE: most small transformers will not have taps.
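A quick sanity check on the numbers, as a sketch (the tap values below are hypothetical, not from any particular transformer):

```python
# With an ideal 480:240 (2:1) ratio, work out what primary voltage the
# measured 260 V secondary implies.
v_secondary_measured = 260.0
v_primary_implied = v_secondary_measured * (480 / 240)
print(f"Implied primary: {v_primary_implied:.0f} V")  # ~520 V, i.e. the high side is running hot

# If the unit has primary taps (values here are hypothetical), pick the tap
# that brings the secondary closest to 240 V for the actual primary voltage.
v_primary_actual = 500.0                  # assumed measurement, for illustration
for tap in (504, 492, 480, 468, 456):     # hypothetical tap voltages
    v_sec = v_primary_actual * 240 / tap
    print(f"Tap {tap} V -> secondary ~{v_sec:.0f} V")
```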

I might like to know the HP rating of the motor the transformer is expected to run. That would help answer the question about your voltage drop.
 
I have a drill press at work that is labeled 230 volts.

I'm getting about 260 volts on the secondaries with the motor not running, and about 250 volts with it running but unloaded. Am I looking at trouble?
This sounds almost ideal to me. A 230V motor is expected to be supplied from a 240V supply.

The 250V unloaded voltage is not too high, in my opinion; the motor may even run cooler.
 
Aren't most motors built with a voltage tolerance of +/- 10%?

I'd hook it up myself and not worry about it.
 
I have a drill press at work that is labeled 230 volts. I have a single phase 480 to 240, 2 kVA transformer hooked up. I'm getting about 260 volts on the secondaries with the motor not running, and about 250 volts with it running but unloaded. Am I looking at trouble?
Have you checked the frequency given on the motor nameplate? 230V is typical of UK and European equipment, which mostly runs at 50Hz.
 
I am in complete agreement with Cow. Motors are usually rated to run at a voltage +10% or -10% of nameplate. AND... it's better to be 10% over the nameplate than 10% under.
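Running the thread's numbers against that tolerance (a quick sketch, nothing motor-specific assumed):

```python
# Check the measured voltages against a 230 V nameplate with +/-10% tolerance.
nameplate = 230.0
low, high = nameplate * 0.90, nameplate * 1.10     # 207.0 .. 253.0 V
for label, volts in [("not running", 260.0), ("running, no load", 250.0)]:
    verdict = "within" if low <= volts <= high else "outside"
    print(f"{volts:.0f} V ({label}): {verdict} the {low:.0f}-{high:.0f} V band")
# 260 V is just outside the band; 250 V is inside it.
```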
 
The problem with low voltage, as some have pointed out, is that the motor will draw a higher current and will run hotter. The problem with high voltage is that the motor will draw a lower current, and will therefore not be able to develop as much torque. I should think that torque is a critical parameter for a machine like a drill press.

That said, however, if the stated tolerances were not enough to allow the machine to function properly, then the manufacturer would have stated other tolerances. If the motor nameplate has a range of acceptable voltages, then that will give you your answer. If it does not, and if there is no other available source for a range of acceptable voltages, then I would go with plus or minus ten percent. You are within that range when the motor is running. Someone else will have to tell you, as I do not know, whether the range of acceptable voltages is intended to mean the no-load value or the full-load value.
 
Have you checked the frequency given on the motor nameplate? 230V is typical of UK and European equipment, which mostly runs at 50Hz.
Most motor nameplates, in the US, are rated for a nominal utilization voltage and not for the nominal supply voltage. The utilization voltages are 95.83% of the supply:
115(120)
200(208)
230(240)
460(480)
575(600)
I cannot see any valid reason, other than tradition, for having two different nominal voltages.
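A quick sketch of that arithmetic, for anyone who wants to see the ratios:

```python
# Utilization voltage as a fraction of the nominal supply voltage.
for util, supply in [(115, 120), (200, 208), (230, 240), (460, 480), (575, 600)]:
    print(f"{util} / {supply} = {util / supply:.4f}")
# All work out to 0.9583 except 200/208, which is 0.9615.
```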
 
Charlie,

I disagree. Increasing the voltage supplied to an induction motor does not decrease the available torque.

At constant magnetic field strength in the motor, current is proportional to torque. When you raise the applied voltage you raise the magnetic field strength, reducing the current required to produce the same torque. So the reduced current that accompanies a small increase in voltage does not mean reduced torque.

The breakdown torque of a motor scales with the square of the applied voltage. So the _available_ torque will actually increase as the voltage is increased.
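Using the voltages from this thread, that square law works out as follows (a minimal sketch):

```python
# Breakdown torque scales with the square of the applied voltage.
v_nameplate, v_actual = 230.0, 250.0
scale = (v_actual / v_nameplate) ** 2
print(f"Breakdown torque scale factor: {scale:.2f}")  # ~1.18, about 18% more torque
```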

-Jon
 
Most motor nameplates, in the US, are rated for a nominal utilization voltage and not for the nominal supply voltage. The utilization voltages are 95.83% of the supply:
115(120)
200(208)
230(240)
460(480)
575(600)
I cannot see any valid reason, other than tradition, for having two different nominal voltages.
Nor can I.
Your post illustrates a significant difference between UK and USA distribution systems.
For us Brits, yours seems overly complex. For LV we have just 400V 3-phase, 230V phase to neutral.
No mixed voltages, no high leg stuff.
Everything in my house is 230V. Lights, television, washing machine, hob, oven, security lighting....stuff. A lost neutral won't overvolt anything. That's how it is for domestic here.
It just seems simpler.
 
Nor can I.

I don't think you guys are talking about the same thing.:smile:

I took Jim's comment

I cannot see any valid reason, other than tradition, for having two different nominal voltages.

To mean there is no good reason to have both nominal supply voltage and nominal utilization voltage ratings.

i.e., 115 and 120 VAC, etc.
 
When you raise the applied voltage you raise the magnetic field strength, reducing the current required to produce the same torque. So the reduced current that accompanies a small increase in voltage does not mean reduced torque.
That seems counter-intuitive. The magnetic field is created by the presence of current, and the only job the voltage does is to cause the current to flow. So why would magnetic field strength have a direct relationship with voltage? :confused:

p.s. My wife says things all the time that (1) seem counter-intuitive to me, and (2) turn out to be right. Fortunately, I don't make my living by using intuition to play the stock market! ;)
 
I don't think you guys are talking about the same thing.:smile:

I took Jim's comment

I cannot see any valid reason, other than tradition, for having two different nominal voltages.

To mean there is no good reason to have both nominal supply voltage and nominal utilization voltage ratings.

i.e., 115 and 120 VAC, etc.
I agreed with that point.
 
The easiest way to think about this is to separate the 'generation of the magnetic field' from the 'generation of torque'. In many motors these are physically separate processes; for example consider a shunt wound DC motor, where you have a separate field coil that generates the magnetic field and the armature that carries the torque producing current. In such a motor, you actually have separate current flows that you can measure and adjust separately.

In an AC induction motor, the same coil carries a single current that performs both torque production and magnetic field production. Yet this composite current flow can be described in terms of its torque production and magnetic field production.

The torque producing current and the magnetizing current are 90 degrees out of phase; the torque producing current is what causes the _real_ power consumption of the motor, and the magnetizing current is what causes the _reactive_ power consumption of the motor. These two components of current show up as a single current flow at some phase angle relative to the applied voltage.

As you increase the voltage, the magnetizing current flow _increases_. This is simply an AC voltage applied across an inductive reactance; magnetizing current is 90 degrees out of phase with voltage, and proportional to voltage. As the magnetizing current increases, the magnetic field strength increases, and for a given fixed mechanical load the torque producing current drawn by the motor will go down.

The common rule of thumb that increasing voltage will reduce the current flow into a motor is really only true near the normal operating point of the motor. At full load and nominal voltage, the torque producing current is significantly larger than the magnetizing current. Increasing the applied voltage will _increase_ the magnetizing current, decrease the torque producing current, and result in a net decrease in composite current flow.

But if you increase the voltage to the motor enough, I can guarantee that the total current flowing will increase.
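A minimal numeric sketch of that phasor picture (every current value below is made up for illustration, not taken from a real motor):

```python
import math

# Motor current as the phasor sum of a torque-producing (real) component and
# a magnetizing (reactive, 90 degrees out of phase) component.
def total_current(i_torque, i_mag):
    return math.hypot(i_torque, i_mag)

# Assumed full-load operating point at nominal voltage:
i_torque, i_mag = 9.0, 3.0   # amps
print(f"Nominal voltage: {total_current(i_torque, i_mag):.2f} A")

# Raise the voltage ~9% (230 -> 250 V): magnetizing current rises with voltage,
# torque current falls for the same mechanical load (ignoring saturation).
k = 250 / 230
print(f"Raised voltage: {total_current(i_torque / k, i_mag * k):.2f} A")
# The composite current drops, matching the rule of thumb near the normal
# operating point; push the voltage far enough into saturation and the
# magnetizing term dominates instead.
```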

-Jon
 
The reason for the "Utilization voltage" and "Distribution (Nominal) voltage" difference is twofold:
  1. It allows for voltage drop at the utilization point. By NEMA specifying that a motor design voltage is 460V and the distribution is 480V, it allows for the motor to still operate at spec when there is a long distance between the transformer and the motor.
  2. The US is a big place and it does not have a monolithic power utility system. It is made up of a myriad of small and large utilities, many of which generate and distribute slightly different voltages than their next-door neighbors. Although there has been a lot of progress towards unifying standards by requiring new installations to conform, there are a lot of legacy systems still out there, and the cost of forcing everyone to change is prohibitive. So in some places you have 110/220V, in others 115/230V, in others 120/240V, and in still others 125/250VAC in residences. The utilization voltages are often a compromise: if I rate a motor at 230V, +/-10%, it can work on a 220V system as well as on a 250V system, as the sketch below shows. The same is generally true for 3-phase commercial / industrial systems; 440-460-480 or 208-230-240 (although it can get more complicated by transformer configurations as well).
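A quick check of that compromise, using the system voltages from point 2 (a sketch):

```python
# Which nominal systems does a 230 V +/-10% motor rating cover?
rating, tol = 230.0, 0.10
low, high = rating * (1 - tol), rating * (1 + tol)   # 207 .. 253 V
for system in (220, 230, 240, 250):
    verdict = "covered" if low <= system <= high else "not covered"
    print(f"{system} V system: {verdict}")
# All four fall inside the 207-253 V band.
```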
 
There is no reason, other than NEMA standards and tradition, why a manufacturer could not design their motor for a voltage range yet still provide a nameplate showing the nominal supply voltages that have been in use in the US for almost 50 years.

Is it really too confusing to say a motor is rated for 240V +5% -15%?
 
Charlie,

I disagree. Increasing the voltage supplied to an induction motor does not decrease the available torque.

At constant magnetic field strength in the motor, current is proportional to torque. When you raise the applied voltage you raise the magnetic field strength, reducing the current required to produce the same torque. So the reduced current that accompanies a small increase in voltage does not mean reduced torque.

The breakdown torque of a motor scales with the square of the applied voltage. So the _available_ torque will actually increase as the voltage is increased.

-Jon

I tend to agree... if the work is constant, then the power consumed will be constant, and if you raise the voltage (within the motor's ability to accept it), you'll lower the motor's running current. Heat being a function of current flow, the motor's running temperature should go down.
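That intuition follows from I = P / V at constant load; a quick sketch (the load value is assumed, and this ignores the magnetizing-current effects Jon described):

```python
# At constant mechanical load, current falls as voltage rises, so I^2*R
# heating falls too.
p_load = 1500.0   # watts, assumed constant load
for volts in (230.0, 250.0):
    print(f"{volts:.0f} V -> {p_load / volts:.2f} A")
# 230 V -> 6.52 A; 250 V -> 6.00 A
```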


randy
 