rockman7892
Electrical
- Apr 7, 2008
I am trying to determine why induction motors tend to draw higher current at lower speeds.
For example, with a VFD I have heard that below 10% speed you may need a V/Hz boost in order to increase the torque and reduce the amount of current.
I have also noticed with several drives that throughout most of the drive's speed range the current stayed pretty much the same, but when the motor was run below 10% the current increased dramatically. For example, when stopping a motor by ramping from 60 Hz down to 0 Hz, the current stays about the same all the way down to roughly 10 Hz, then increases over the range from 10 Hz down to 0 Hz before it is cut off to zero.
Does the drive have a hard time producing enough flux in this lower range, and therefore have to draw extra current to produce the required torque?
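To put rough numbers on the flux question, here is a quick back-of-the-envelope sketch (the motor values below are made up for illustration, not from any real nameplate) of how the stator IR drop eats a much larger share of the applied voltage at low frequency under a plain V/Hz law. Since flux is roughly proportional to (air-gap voltage)/frequency, the flux droops at low speed, torque per amp falls, and the drive has to push more current for the same load torque unless a V/Hz boost is applied.

```python
# Rough sketch: effect of the stator IR drop on air-gap flux at low frequency.
# All motor values below are illustrative assumptions, not real nameplate data.
V_RATED = 460.0   # terminal volts at rated frequency (assumed)
F_RATED = 60.0    # rated frequency, Hz
R_STATOR = 0.5    # stator resistance per phase, ohms (assumed)
I_LOAD = 20.0     # load current, amps (assumed constant for simplicity)

for f in (60, 30, 10, 6, 3):
    v_terminal = V_RATED * f / F_RATED        # plain V/Hz law, no boost
    v_airgap = v_terminal - I_LOAD * R_STATOR  # voltage left after the IR drop
    flux_pct = 100.0 * (v_airgap / f) / (V_RATED / F_RATED)
    print(f"{f:>3d} Hz: terminal {v_terminal:6.1f} V, flux ~ {flux_pct:5.1f}% of rated")
```

With these assumed numbers the flux is still near 98% of rated at 60 Hz but falls to roughly 87% at 10 Hz and under 60% at 3 Hz, which lines up with needing the boost (and seeing the current climb) only in that bottom 10% of the speed range.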