current output on a VFD

Status
Not open for further replies.

mull982

Senior Member
I have been trying to learn more about Variable Frequency Drives and had a question regarding how the output of the drive functions. From what I have read, it appears that as the drive changes the speed of the motor by adjusting the frequency, the drive will also adjust the voltage to maintain the appropriate volts/hertz ratio and constant torque. For example, if a 480V motor operating at 60Hz is slowed down to a frequency of 30Hz, the drive will compensate by proportionally dropping the output voltage to 240V.

The question I had regarding all of this was how the output current and torque respond to these changes. I would assume that as you change the frequency and voltage output to the motor, both your impedance (2*pi*f*L) and voltage are changing, so you would expect to see a change in output current? Or would the current not change because the voltage and frequency are changing proportionally, as in the example I gave above?

What happens to the output torque when the frequency of the drive is changing? In my reading I have seen a lot about "constant torque" vs "variable torque". As you change the frequency output of a drive from 60Hz to 30Hz as above, what happens to the output torque value at the motor? Does it change or stay the same? If I understand correctly, torque is a function of power and speed (hp/rpm), so I would guess that the output torque would be changing?

Thanks for all the help
 
mull982 said:
(quoted above)

The current will decrease with the frequency and voltage on variable torque drives. These applications make up the majority of uses, such as centrifugal pumps, fans, and agitators.

The current will remain relatively constant on constant torque drives. These applications can be found on conveyor belts, punch presses, or positive displacement pumps.
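A rough sketch of the distinction above, in an idealized model where magnetizing current is ignored and motor current is taken as simply proportional to load torque (both simplifying assumptions, not exact motor behavior):

```python
# Rough sketch of how load torque (and hence, approximately, motor current)
# scales with speed for the two common load types. Idealized: magnetizing
# current is ignored and current is taken as proportional to torque.

def load_torque_fraction(speed_fraction, load_type):
    """Torque demanded by the load, as a fraction of rated torque."""
    if load_type == "constant_torque":    # conveyor, punch press, PD pump
        return 1.0
    elif load_type == "variable_torque":  # centrifugal pump or fan: T ~ n^2
        return speed_fraction ** 2
    raise ValueError(load_type)

for speed in (1.0, 0.75, 0.5):
    ct = load_torque_fraction(speed, "constant_torque")
    vt = load_torque_fraction(speed, "variable_torque")
    print(f"{speed:4.0%} speed: CT current ~ {ct:.0%}, VT current ~ {vt:.0%}")
```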
 

winnie

Senior Member
Location
Springfield, MA, USA
Occupation
Electric motor research
You need to distinguish between two things: the torque that the motor is supplying to the load, and the torque that the motor is _capable_ of supplying to a load. As an analogy, think about the current that a transformer is supplying to operate a particular load, versus the current that the transformer is capable of supplying without overheating.

The reason that it is important to distinguish between these two is that the current drawn by the motor will depend on the torque that the motor is supplying. An induction motor at no load will spin at full speed, but draw much less current than that same motor at full load. (25%-50%, and almost all reactive, meaning very poor power factor and very little real power being used.)

For a VFD fed induction motor, a very rough approximation is that the current required depends upon the torque output. In other words, to produce full torque at half speed requires the same current as producing full torque at full speed. As you note, to operate at half speed, the voltage is reduced by half; so you have full current at half voltage (or half power) going in, and full torque at half speed (or half power) coming out.
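The half-speed bookkeeping in that paragraph can be sketched with a few lines of arithmetic, using the 480V/60Hz numbers from the original question (losses and magnetizing current ignored, so this is only the rough approximation described above):

```python
# Idealized V/Hz bookkeeping for the half-speed example. Losses and
# magnetizing current are ignored; numbers match the 480 V / 60 Hz motor
# from the original question.

V_RATED, F_RATED = 480.0, 60.0
V_PER_HZ = V_RATED / F_RATED          # 8 V/Hz, held constant below base speed

def drive_voltage(freq_hz):
    """Commanded output voltage below base speed."""
    return V_PER_HZ * freq_hz

for f in (60.0, 30.0):
    v = drive_voltage(f)
    # At full torque the current is roughly unchanged, so electrical power
    # in (~ V * I) and mechanical power out (~ T * speed) both scale with f.
    print(f"{f:>4.0f} Hz -> {v:.0f} V, full-torque power ~ {f / F_RATED:.0%} of rated")
```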

weressl notes different types of load which characteristically have torque that reduces as speed goes down, or other loads where the torque stays constant. If the load torque goes down when the speed goes down, then the current drawn by the motor will go down when the speed goes down.

Finally, just to confuse the issue, the drive system itself will be described in terms of its maximum capability at any given speed. A VFD can supply frequencies both lower and higher than normal. At reduced speed, there is sufficient voltage available to maintain proper V/Hz ratio, so the motor is fully magnetized. The output torque of the motor is thus limited by the output current capability of the inverter. The speed range where the VFD is current limited is called the 'constant torque' range.

At increased frequency, the inverter is not capable of maintaining proper V/Hz ratio, and the output torque capability goes down. Over some frequency range the torque capability will vary in inverse proportion to speed; since the product of torque and speed is power, this is known as the 'constant power' range. Both the constant torque and power ranges, as used to describe a drive system (VFD and motor), are telling you the _maximum_ torque that the system can supply, not the torque actually used by the load.
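The two capability ranges described above can be sketched as a simple envelope, assuming a 60 Hz base frequency (illustrative numbers only, not any particular drive's ratings):

```python
# Sketch of the drive system's *capability* envelope: below base frequency
# the inverter is current-limited (constant torque); above base frequency
# the voltage tops out, so torque capability falls as ~1/f (constant power).

F_BASE = 60.0  # assumed base frequency, Hz

def torque_capability(freq_hz):
    """Max torque as a fraction of rated, for a given output frequency."""
    if freq_hz <= F_BASE:
        return 1.0                # constant-torque range
    return F_BASE / freq_hz       # constant-power range: T ~ 1/f

for f in (30, 60, 90, 120):
    t = torque_capability(f)
    p = t * f / F_BASE            # power fraction = torque * speed
    print(f"{f:>3} Hz: torque cap {t:.2f} x rated, power cap {p:.2f} x rated")
```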

-Jon
 
winnie said:
(quoted above)

Two nuances:

The motor at no load will run FASTER than its full-load speed - as per the nameplate - but still less than synchronous speed.

The torque and current do not have an EXACTLY linear relationship along the full speed range, and the torque is not exactly the same at zero speed and full speed either, although the difference can be ignored in practical terms.
 

masterelect1

Senior Member
Location
Baltimore
Torque parameters

Most of the drives I have dealt with (none recently) have a torque parameter which can be set to compensate for loss of torque at lower freqs.

This parameter is particularly helpful when a slow ramp up time (similar to a soft start) is selected.

I am not sure if all drives have these features but have found multiple torque/ramp/freq ranges on TB Woods, Mitsubishi, & A-B.
 

Jraef

Moderator, OTD
Staff member
Location
San Francisco Bay Area, CA, USA
Occupation
Electrical Engineer
But to specifically answer your question, the torque remains the same in a Constant Torque application. The HP is changing with speed, however. So in your example, at 230V and 30Hz, your motor is outputting the same torque as it would at 460V/60Hz, but because the speed is half, so is the shaft HP. The current however will follow the torque, so it will stay essentially the same (leaving losses out of it for the moment). Think of it this way, just using the NEC motor FLA charts: if I have a 10HP 460V motor, the FLA is 14A. If I turn the speed to 50%, it is now a 5HP 230V motor. What's the FLA of a 5HP 230V motor? 15.2A. The charts are dealing with other issues, but the idea is that the current will remain roughly the same. The POWER has dropped, i.e. the kW (which is the HP), so the energy use (kWh) is going to be less, but of course, that's because you are doing less work.

In a "Variable Torque" application such as a pump or fan, a VT drive is different because they know in advance that the load, that centrifugal pump or fan, is going to need even LESS power as the speed is reduced, because essentially the load on the motor begins to uncouple. Think about a pump turning very, very slowly. Is it still moving a lot of fluid? No, because at some point the impeller just rotates without inducing flow. So as the speed is reduced, the V/Hz pattern can be tweaked in order to reduce the HP output even faster than just the normal linear relationship, i.e. they reduce it by the V/Hz^2 (squared). That way, when the speed is reduced, the HP is reduced even faster, so more energy is saved. Because of that, it will draw LESS current, and because of that, the current carrying components, i.e. the transistors, can be sized lower. That is why you will see that a CT drive of, say, 100HP can drive a VT load of 125HP; the components will be under less thermal stress as the speed is reduced.
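The centrifugal-load behavior described above follows the pump/fan affinity laws, a sketch of which (illustrative only, idealized centrifugal load) shows why the power falls off so quickly:

```python
# Sketch of a centrifugal (variable torque) load, per the affinity laws:
# flow ~ n, torque ~ n^2, power ~ n^3. Idealized, illustrative only.

def vt_load(speed_fraction):
    """Flow, torque, and power fractions for a centrifugal pump/fan."""
    n = speed_fraction
    return {"flow": n, "torque": n ** 2, "power": n ** 3}

for n in (1.0, 0.8, 0.5):
    d = vt_load(n)
    print(f"{n:4.0%} speed: flow {d['flow']:.0%}, "
          f"torque {d['torque']:.0%}, power {d['power']:.1%}")
# At 50% speed the pump needs only ~12.5% of rated power, which is why a
# drive can carry a higher VT horsepower rating than its CT rating.
```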
 