Triode
Well-Known Member
I have a lot of questions so I hope it's ok to make a thread to keep them in as I go through this design.
So I'm trying to make a driver for these motors: the RoboWheel Hub Motor for Robotics from skysedge.com, a large low-cogging hub motor with integrated encoder and mounting flange for robots.
The maker of them has a recommended driver but it's out of stock and will be for a year (I contacted her). But I am also doing this so we have our own design to customize and build upon.
First, a data dump
Motor info:
- Approximate diameter of rolling surface: 170mm
- No gears; completely silent
- 27 poles and 1.7Ω winding resistance for low cogging (it's actually 15 pole pairs: it's an outrunner motor with 27 coils and 30 magnets on the outside)
- Integrated 3200 PPR quadrature encoder (no index, just AB, and Hall sensors)
- Operating voltage: 7-42V
- Recommended controller: Pair with the NearZero to command with a few lines of Python on a Raspberry Pi or other common SBC, or use the PWM outputs of an R/C receiver
Power supply:
26.4 to 22V (range over discharge)
up to 25A per motor
Mechanical:
Desired speed: 0.2 - 2.5 m/s
Equivalent motor speed: 0.4 - 4.7 rev/s, 22-280 RPM, 340 - 4200 eRPM
Mass: fully loaded will be up to 64 kg, 140 lbs.
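In case it helps anyone check my numbers, here's the conversion I used for the speed figures above (assuming the 170 mm rolling diameter and 15 pole pairs from the motor specs; a quick sketch, not production code):

```python
import math

# Assumed from the motor specs above
WHEEL_DIAMETER_M = 0.170  # rolling surface diameter
POLE_PAIRS = 15           # 30 magnets -> 15 pole pairs

def speeds(v_ms):
    """Convert linear speed (m/s) to mechanical rev/s, RPM, and electrical RPM."""
    circumference = math.pi * WHEEL_DIAMETER_M  # ~0.534 m traveled per revolution
    rev_s = v_ms / circumference
    rpm = rev_s * 60.0
    erpm = rpm * POLE_PAIRS  # electrical RPM = mechanical RPM x pole pairs
    return rev_s, rpm, erpm

for v in (0.2, 2.5):
    rev_s, rpm, erpm = speeds(v)
    print(f"{v} m/s -> {rev_s:.1f} rev/s, {rpm:.0f} RPM, {erpm:.0f} eRPM")
```

Running this reproduces the 0.4–4.7 rev/s, ~22–280 RPM, ~340–4200 eRPM range above.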
We will need speed control and position reporting, but those are not too hard to add if the driver doesn't already include them.
Design weights:
In general, for component selection I care most about robustness and availability of parts, then ease of construction, ahead of minimizing cost or making it ultra-compact. I am in Washington, USA, which I mention because it relates to parts availability.
Recommended driver/controller info:
Current: 1 A continuous, 3 A peak (seems a bit weak; I've seen these motors take 6 A on a different driver)
7-36V
25kHz pure sine commutation (I am not insisting on this switching speed, if anyone has a reason it should be different let me know)
-----
A few questions (if these seem too noob, I'm a firmware expert, BLDC control is something I have a little experience in but I'm not an expert):
The NZ1 uses an L6234 integrated driver. Is there a reason I shouldn't use a high/low-side FET driver paired with discrete MOSFETs? Given that the L6234 is out of stock everywhere I can find, what would be a suitable substitute?
For MOSFET selection, how much current should I rate them for? As I understand it, given an expected max current, I add some overhead (I've measured these motors at up to 8 A, so maybe aim for 10-14 A) and then find the lowest gate charge I can that meets that spec.
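To make the rating question concrete, here's the back-of-envelope I've been doing. The 8 A figure is my measurement; the Rds(on) and thermal numbers are placeholder values for illustration, not from any specific part:

```python
# Rough conduction-loss sanity check for a candidate FET.
# All part parameters below are placeholders, not from a real datasheet.
I_PEAK = 8.0      # A, measured max phase current
DERATE = 1.5      # headroom factor -> spec FETs for ~12 A
RDS_ON = 0.005    # ohms, assumed hot Rds(on) of the candidate part
R_TH_JA = 50.0    # C/W, assumed junction-to-ambient thermal resistance

i_spec = I_PEAK * DERATE                 # target current rating with overhead
p_cond = I_PEAK ** 2 * RDS_ON            # worst-case conduction loss, FET fully on
t_rise = p_cond * R_TH_JA                # steady-state junction temperature rise

print(f"spec current: {i_spec:.0f} A")
print(f"conduction loss at {I_PEAK:.0f} A: {p_cond:.2f} W -> ~{t_rise:.0f} C rise")
```

With these assumed numbers conduction loss is well under a watt, which is why I suspect gate charge (switching loss), not current rating, is the thing to optimize.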
If I use a high/low-side driver IC, is there any reason not to get one with more current capability than I need? I know that with MOSFETs you don't want to just max out the current rating, not just for cost, but because it generally comes with higher gate charge, which means slower switching, which means more losses.
I know a gate resistor is needed to protect the FETs, prevent ringing, and reduce switching noise. If the firmware switches at 25 kHz, should I pick the resistor best suited for exactly 25 kHz, or something a bit faster? What percentage of time spent switching is ideal?
That is to say: suppose I want 1% of the period spent switching and have a FET with 50 nC gate charge, ×2 for rise and fall (assuming they're symmetric):
(50 nC * 100 * 2) * 25 kHz = 0.25 A
Now, since the FET driver's current is not actually constant, the effective switching time is more like double that, so maybe 0.5 A peak (the average of a linear falloff to zero is half the peak). So do I just spec the resistor for 0.5 A given the driver voltage (10-20 V in most cases), or would best practice be to look at the limits for noise/current/ringing and make the resistance as low as those allow?
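Putting that arithmetic in one place (gate charge and drive voltage are example values, and the ×2 factor for the driver's non-constant current is my own assumption from above):

```python
# Gate-drive current and gate-resistor back-of-envelope.
# QG and V_DRIVE are example values, not from a specific part.
QG = 50e-9              # C, total gate charge of the example FET
F_SW = 25e3             # Hz, PWM switching frequency
SWITCH_FRACTION = 0.01  # target: 1% of each period spent switching
V_DRIVE = 12.0          # V, example gate-driver supply

period = 1.0 / F_SW                       # 40 us
t_edge = SWITCH_FRACTION * period / 2.0   # 200 ns per edge (rise or fall)
i_avg = QG / t_edge                       # average gate current during an edge
i_peak = 2.0 * i_avg                      # ~2x if current ramps linearly to zero
r_gate = V_DRIVE / i_peak                 # crude upper bound; ignores the Vgs plateau

print(f"edge time: {t_edge * 1e9:.0f} ns")
print(f"avg gate current: {i_avg:.2f} A, est. peak: {i_peak:.2f} A")
print(f"gate resistor upper bound at {V_DRIVE:.0f} V drive: ~{r_gate:.0f} ohm")
```

This reproduces the 0.25 A average / 0.5 A peak figures above and turns them into a ballpark resistor value; treating it as an upper bound and then checking ringing on the scope is the part I'm asking about.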
---
I know this is a bit long, and I'm sure I have lots of missed assumptions and wrong understandings; corrections or answers to my questions are much appreciated. I will keep any follow-up questions here to avoid a clutter of threads, and share the designs and firmware (for the driver, not the robot) as I go.
I have most of the general lab equipment you'd expect, a scope, current probes, programmable load, power supply, function generator, so if any measurements would help with an answer let me know.
Thanks!