Motor Controller Fundamentals
(OP)
Hi everyone, I have what may be a really dumb question. A DC motor controller controls the speed of the motor by adjusting the output voltage, and this voltage is usually linearly related to the speed of the motor. The needed current is also supplied to the motor through the controller (assuming the maximum is not exceeded). So here's where I get confused. If I am using a 24 VDC system with a motor that runs at 2400 rpm @ 24 V, that's 100 rpm/V. The 24 V power source is made up of two 12 V batteries in series. So, if I want to run the motor at 1200 rpm, 12 V is sent to it. Now, let's say that motor requires 50 amps at the given load. That equals 600 watts of power. If I assume 100% efficiency of the system, what is being drawn from the batteries? 12 V and 25 amps each (600 W), 6 V and 50 amps each (600 W), or 12 V and 50 amps each (1200 W)?
That leads me to another question. If I am drawing less than the maximum voltage from the battery, how does that affect its power/reserve rating? Can I "back-calculate" a watt-hour rating from the amp-hour rating and adjust for different voltages? This may be an unnecessary question because I *think* the voltage remains constant and the current adjusts to supply the proper power, but I would like to verify that.
Another issue is that I assumed 100% efficiency in my example. I am sure that is not the case, and I am pretty sure the controller and battery efficiencies fluctuate with output voltage and current. This is probably component-specific, but is it typical to see very large fluctuations?
I hope that makes sense. Thanks for the help!
RE: Motor Controller Fundamentals
Assuming perfect conversion (i.e., no losses in the controller; in reality maybe 75-95% efficiency), the current through your batteries at a 12 V output would be about 25 A rather than 50 A. In the DC case P = VI, and if there is no power loss in the converter then P(24 V) = P(12 V). Since the voltage was cut in half going from 24 V to 12 V, the motor-side current must be double the battery-side current, which makes the battery current 25 A. Since the batteries are in series, the same current flows through both, and assuming they are identical batteries (again P = VI), the power supplied by each would be 12 × 25 = 300 W per battery. Of course this is the ideal case. In reality there are losses due to battery internal resistance, conversion from 24 V to 12 V, line resistance, etc.
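The lossless power balance above can be sketched in a few lines, using the figures from the original question (24 V pack of two 12 V batteries in series, 12 V at the motor, 50 A motor current):

```python
# Ideal (lossless) power balance for the example in this thread.
V_pack = 24.0      # total battery voltage (V), two 12 V batteries in series
V_motor = 12.0     # controller output voltage (V)
I_motor = 50.0     # motor current at the given load (A)

P_motor = V_motor * I_motor             # power delivered to the motor: 600 W
I_pack = P_motor / V_pack               # battery-side current, no losses: 25 A
P_per_battery = (V_pack / 2) * I_pack   # series batteries share one current: 300 W each

print(P_motor, I_pack, P_per_battery)   # 600.0 25.0 300.0
```

So of the three options in the question, it is the first: each battery sees 12 V and 25 A, for 600 W total.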
The battery voltage should remain relatively constant (decreasing as it discharges). The current supplied by the batteries is the variable quantity. How long the battery can sustain a given current is a property of the battery chemistry, size, etc. Typically, though, an amp-hour rating is given for a certain discharge rate, and the battery will exhibit different amp-hour ratings at different currents. The effective amp-hour rating usually drops significantly as the current is increased beyond the current used to compute the rating.
Typically, controllers and batteries both become less efficient as more current is demanded of them. With the controller you will probably see less than a 10% change in efficiency going from the specified minimum to the specified maximum current, but the battery efficiency drop can be quite dramatic depending on the battery chemistry.
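The amp-hour falloff described above is often approximated with Peukert's law. A rough sketch, where the exponent k = 1.2 is only an illustrative lead-acid value (real batteries need a measured exponent) and `runtime_hours` is a hypothetical helper:

```python
# Peukert's-law sketch of capacity falloff at higher discharge currents.
# t = H * (C / (I * H)) ** k, where C is the capacity rated at the H-hour rate.
def runtime_hours(capacity_ah, rated_hours, current_a, k=1.2):
    """Estimated runtime at current_a for a battery rated capacity_ah
    at the rated_hours discharge rate. k = 1.2 is an assumed exponent."""
    return rated_hours * (capacity_ah / (current_a * rated_hours)) ** k

# 100 Ah battery rated at the 20-hour rate (i.e., 5 A):
print(runtime_hours(100, 20, 5))    # 20.0 h at the rated current
print(runtime_hours(100, 20, 25))   # ~2.9 h, well under the naive 100/25 = 4 h
```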
Hope that helps you out.
RE: Motor Controller Fundamentals
I agree with MrBananas.
The current in both batteries is exactly the same, since they are connected in series.
The battery voltage is almost constant over the load capacity range, i.e., over the battery's ampere-hour life.
The controller (or drive) changes the voltage output to the motor, not the power-input voltage.
See sketch below.
RE: Motor Controller Fundamentals
The linear range of motor speed vs. voltage is limited.
Motor windings are optimized for a certain voltage range;
if you pump in too much current, you'll saturate the winding.
This happens when you try to run a high-voltage motor from a low-voltage supply. You need to look at the speed vs. torque (current) curves.
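As a rough illustration of why the speed/voltage relationship is only approximately linear, here is a steady-state brushed-DC-motor sketch (V = Ke*rpm + I*R). The back-EMF constant Ke is taken from the 2400 rpm @ 24 V figure in this thread; the winding resistance R is an assumed value, not from any datasheet:

```python
# Steady-state brushed DC motor model: V = Ke * rpm + I * R.
Ke = 0.01   # back-EMF constant, V per rpm (24 V / 2400 rpm at no load)
R = 0.02    # winding resistance in ohms (assumed for illustration)

def speed_rpm(v_applied, i_load):
    """Steady-state speed: applied voltage minus the IR drop, over Ke."""
    return (v_applied - i_load * R) / Ke

print(speed_rpm(12.0, 0))    # 1200.0 rpm at no load -- the 100 rpm/V rule
print(speed_rpm(12.0, 50))   # 1100.0 rpm -- the IR drop shifts the line under load
```

The 100 rpm/V rule of thumb therefore only holds at no load; under load, the IR drop (and, beyond that, saturation effects this model ignores) pulls the speed below the linear prediction.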
RE: Motor Controller Fundamentals
Consider this: a PWM converter running from 24 V supplies either 24 V or 0 V. It's a switch that is either on (24 V) or off (0 V). Running at a 12 V output, it supplies 24 V for 50% of the time. To get a 50 A average current requires 100 A at a 50% duty cycle. So you have 24 V × 100 A × 50% = 1200 W.
But, depending on the converter and motor, you may also be measuring back-EMF current flowing through a flyback diode during the PWM off-time. In that case you really are getting 50 A flowing continuously, so during the PWM on-time the current is 50 A. So 24 V × 50 A × 50% = 600 W.
So I guess I'm saying you would really have to measure the switch duty cycle and current flow to know. Or you need to measure the current on the battery side of the converter. You really can't rely on the motor-side current and voltage alone.
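The two interpretations above can be put side by side numerically, using the thread's figures (24 V supply, 50% duty cycle, 50 A average motor current):

```python
# The two PWM cases described above, with the figures from this thread.
V_supply = 24.0
duty = 0.5

# Case 1: no freewheel path, so current flows only during the on-time.
# Hitting a 50 A *average* then requires 100 A while the switch is on.
I_avg = 50.0
I_on = I_avg / duty
P_case1 = V_supply * I_on * duty      # 1200.0 W

# Case 2: a flyback diode keeps ~50 A circulating continuously,
# so the switch only ever carries 50 A during the on-time.
I_cont = 50.0
P_case2 = V_supply * I_cont * duty    # 600.0 W

print(P_case1, P_case2)               # 1200.0 600.0
```

The factor-of-two gap between the cases is exactly why the post recommends measuring on the battery side of the converter.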
RE: Motor Controller Fundamentals
1-hour rate = 100 amps
4-hour rate = 25 amps
20-hour rate = 5 amps
Total energy delivered @ the 20-hour rate > energy @ the 4-hour rate > energy @ the 1-hour rate.