
DRV8312 getting hot

Other Parts Discussed in Thread: DRV8312, INSTASPIN-BLDC, DRV8332, DRV8302, DRV8301, MOTORWARE, CSD18533Q5A

Please excuse what is probably a complete newbie question, but I'm having another problem with my DRV8312-69M-KIT. I'm trying to run the BLDC_Sensorless project with a small outrunner motor. I'm at build level 5, so it's just running at a fixed duty cycle with closed-loop commutation (zero-crossing BEMF). Even at a duty cycle of 0.1 to 0.2 the DRV8312 gets really quite hot. It's only drawing 0.5A to 1.0A from the 14V power supply. It would appear that most of this is being dissipated in the DRV8312 rather than the motor. Any idea what's going on?

  • 1. you're running BLDC sensorless on the InstaSPIN-FOC and -MOTION kit? Any reason why?

    2. I believe you have a damaged DRV8312 EVM. Are any of the LEDs red or flickering? With a normal board you should have steady green on 12V, 5V, and PVDD. These have been very robust for over 2 years of production, but last week at a workshop one of the students had a fresh kit that clearly had a DRV8312 issue. Not sure if it was poor placement/solder or just a bad chip. In his case the PVDD LED was flickering and the chip got very hot very quickly. You may want to return it.

    Does the chip get hot when the board is powered (no control running at all)?  That would tell you there was damage.


  • 1. I got the 69M kit plus the Stellaris controlCARD so I could try InstaSPIN-BLDC and InstaSPIN-FOC/MOTION out of the box. I'm a software engineer, but I have only limited experience with embedded systems, so I wanted to make life easy for myself. It looks a lot like my Stellaris controlCARD is faulty, so I've gone for the only BLDC project I could find that didn't need porting to the 69M. I also tried to get this motor running under FOC. I got it to spin and the values for Rs and Ls looked vaguely plausible, but I don't think it's commutating properly. It behaves similarly to how it would in open-loop mode with a duty cycle that was too high for the rotation speed. It has a lot of cogging (it's a 16-pole ABC wind rather than the 14-pole dLRK wind that I eventually want to use), and that, combined with the fact that the voltage scaling on the EVM doesn't match my motor, means I decided to give up on it for the time being.

    2. I'm pretty sure the LEDs are OK. I'll check next time I run it up. The chip definitely doesn't get hot when it's not driving the motor. I did notice that 14V seems to be only just enough for the onboard 12V regulator to work, but I figured that as long as the LEDs came on then it probably wasn't a problem. I can supply the 12V externally if need be. If nothing else, I'd be able to see how much current was being drawn from each supply. Do you happen to have figures for what the current draw should be at idle?

     

  • tell me everything about the motor that you know please and I'll try to help.

    The DRV8312 kit won't run from 12V PVdd...it really needs to be 15V minimum and I try not to run it below 18V. 

  • Can you clarify - is it just the on-board derivation of the 12V supply that won't work, or is there some other problem? The datasheet for the DRV8312 suggests PVDD can be 0V to 50V. Either way it's still getting hot running at 18V, and all the lights are definitely steady green.

     

    The motor is the Turnigy 2730-1500. It's the same as the Hextronik 24g: 16 magnet poles, 12 stator teeth wound ABC, 1500 RPM/volt. I'm using 129 mohm and 19.1 uH, and the flux linkage is coming out around 0.00645 Vs.


  • I reduced the flux linkage down to 0.005 Vs and increased the speed reference up to 500 rpm, and it's now spinning quite nicely (albeit with a bit of a high-pitched whine). It's like magic! Is there anything I can do at this point to tune Rs, Ls and flux linkage further, or should I move on to the current and speed loops?


  • These hobby motors typically don't like to run very slowly, though with some of the larger RC motors we have made them crawl with proper sensing.

    While the DRV8312 itself can run at 0-50V, the power supply that feeds the rest of the board cannot. Keep it over 15V, and higher if you don't want your motor pulling it down.

    Once you build a power stage that fits your motor you will REALLY see InstaSPIN-FOC perform!

    If 19uH is accurate, that's pretty low. I'm guessing you don't have a current probe, but you will certainly benefit from increasing the PWM frequency. The higher you can go, the better. 30-60 kHz will probably be necessary.

    Your Rs value is good. Your Ls may still be off, though this is a reasonable R/L and typical of what we see on these motors, so I'd move forward. The flux value is calculated on the fly all the time, so it doesn't really matter if you gave it a new value.
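
    To see why a low inductance pushes you toward a higher PWM frequency, here is a very rough ripple sketch (a single half-bridge, buck-style approximation that ignores back-EMF and the other phases; the 18V bus and 19.1uH are taken from this thread, everything else is illustrative):

    /* Worst-case phase current ripple ~ Vbus / (4 * L * Fpwm) at 50% duty.  */
    /* Crude approximation only -- ignores back-EMF and winding resistance.  */
    #include <stdio.h>

    int main(void)
    {
        const float vbus = 18.0f;        /* V, supply used in this thread     */
        const float L    = 19.1e-6f;     /* H, inductance quoted above        */
        float f_pwm[] = { 15e3f, 30e3f, 60e3f };
        for (int i = 0; i < 3; i++)
            printf("Fpwm = %2.0f kHz -> ripple ~ %.1f App\n",
                   f_pwm[i] / 1e3f, vbus / (4.0f * L * f_pwm[i]));
        return 0;
    }

    The absolute numbers aren't the point; the inverse scaling with PWM frequency is.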

  • Wait, which project are you running? BLDC_sensorless, InstaSPIN-BLDC, or InstaSPIN-FOC?

     

  • Okay, I think I understand - the problem is VR1.

     

    I've actually already increased the PWM to 30 kHz.

     

    I'm not sure about my power stage now. I had planned on the DRV8332, but I'm wondering if it's up to the job now, and the 12V supply thing is a nuisance for battery operation. Maybe I'll wait and see what your DRV8302 + NexFET design looks like.

     

    I thought the flux value was calculated on the fly but for some reason I can't see it being updated at the moment. Maybe it just keeps calculating the same value.

     

    Something that is looking a bit odd at the moment is that the frequency of my voltage waveforms doesn't seem to match the speed in InstaSPIN - it's out by a factor of 1.5. Probably I'm misreading something.

     

  • InstaSPIN-FOC Lab 3a.

  • The DRV8302+NexFETs are in two different boards.

    1. BoosterPack for an F28027 LaunchPad. 8-24V, 14A.  2 shunt (using DRV8302 2x PGAs) only so max duty cycle will be limited to 1.15 SVPWM

    2. 2MTR EVM, which we will package with F28069M controlCARD. I think the voltage bottom will be 8 V, maybe a bit less. Max of 24V with 14A. 3 shunts using external OPA with enhanced layout to improve upon the DRV8301 EVM board.  This REALLY shows off what InstaSPIN-FOC can do with 12-24V <14A motors!

     

  • Sam,

    Jumping in here a little late, but I wanted to comment on the DRV8312 driver itself. The WHOLE device can run on a single 12V rail; you would just connect GVDD/VDD/PVDD together. The limitation is that the gate drive requires a +/-10% tolerance on that rail for the device to operate properly.

    The majority of the heat dissipation will come from the current drawn through PVDD. You can isolate this on the EVM by providing a separate 12V rail just for the gate drive and other circuitry, and a dedicated supply for PVDD. The current in PVDD, dissipated in the output FETs, will drive the heat you are seeing. If you can measure your phase current, I would have a better idea of what is going on here. Pdissipated = I^2 * Rdson. The Rdson of the internal FETs is 110 mohm typical.
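
    For a rough feel for the numbers, here is a minimal sketch of that conduction-loss estimate (illustrative only: the 1A phase current is an assumed example value, and switching losses are ignored):

    /* Conduction loss per the formula above: Pdissipated = I^2 * Rdson.      */
    #include <stdio.h>

    int main(void)
    {
        const float rds_on  = 0.110f;              /* Ohms, typical value     */
        const float i_phase = 1.0f;                /* A RMS, example only     */
        float p_fet = i_phase * i_phase * rds_on;  /* ~0.11 W per FET         */
        /* Roughly two FETs conduct at any instant in the three-phase bridge. */
        printf("P per FET ~ %.2f W, bridge ~ %.2f W\n", p_fet, 2.0f * p_fet);
        return 0;
    }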

    As Chris mentioned, if the DRV8312/8332 is not an option due to thermal constraints, then the DRV8302 will certainly do the job, as the heat would be dissipated through external FETs.

  • I'm not 100% sure yet, but I think the very high dissipation in the driver (implied in the title of this topic) is specific to the BLDC_Sensorless project. I don't know if there's a bug in there somewhere or something. As I understand it the driver has internal shoot-through protection, so presumably it's not that...

     

    Either way, thermals are looking much better under FOC, and there's a reasonable chance that the DRV8332 will be good enough. If it's not, then there is the DRV8302 as you say.

  • Sam,

    how did you increase your PWM to 30 KHz?

    If you just changed the PWM to 30 KHz and didn't change any decimation, your controller is missing interrupts and overflowing the control loop.

    We have a new user.h TICK decimation that makes it easier to take advantage of the 280x ePWM feature to set the ADC SOC every 2nd or 3rd event. This lets you decimate w/o any interrupt overhead or getting overlapped on your ADC samples.
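
    For reference, a hedged sketch of what those settings might end up looking like in user.h (the define names are the ones used elsewhere in this thread; the exact names and defaults in your release may differ):

    #define USER_PWM_FREQ_kHz                      (30.0)   // example: raise the PWM carrier
    #define USER_NUM_PWM_TICKS_PER_ISR_TICK        (2)      // example: ADC SOC / ISR on every 2nd PWM event (15 kHz ISR)
    #define USER_NUM_CTRL_TICKS_PER_CURRENT_TICK   (1)      // example: current loop on every control tick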

    I'm trying to get this pushed into the next release (_07), or I can post the updated user.h and drv.c/.h files.

     

  • Well, FOC certainly controls the current switching and ripple much better, so the thermals will look better. As an example, a small refrigerator compressor takes 9A peaks with BLDC, but we can control it to 3.5A with FOC. We can almost get away with using a DRV8312 for a full-load refrigerator compressor...so close!

    But you said it was getting hot without even commanding a speed though, right?

     

  • I increased the PWM frequency and set USER_NUM_CTRL_TICKS_PER_CURRENT_TICK to 2.

     

  • I was running it in duty cycle demand mode under BLDC_Sensorless. It stays cool with no demand and gets hotter with increasing duty cycle. But I only had to get to about 20% duty cycle for it to get really hot, and it was still hardly drawing any power at this point (and it definitely hadn't reached the full speed of the motor).

  • I am definitely struggling with the InstaSPIN speed now. If I demand 500 rpm then I get 100 Hz on the scope. If I demand 1 krpm then I get about 200 Hz. If I demand 2 krpm then I get about 425 Hz. I even tried connecting the shafts of two identical motors and reading the back-EMF off the second motor, just to check I wasn't doing something dumb. If this motor has 16 magnet poles then it should be 500 rpm * 8 pole pairs / 60 s/min = 66.7 Hz, shouldn't it? I also changed the PWM rate back to 15 kHz and that made no difference.

  • Sam,

    I'm going to make a standalone post with the latest files. I've incorporated the new PWM tick. I just tested 3a at 30 KHz PWM with /2 ADCSOC (inner control at 15 KHz) and all was good.

     

  • Okay, I'll try that tomorrow. Thanks.

     

  • I tried the new pre-release package. I copied my current MotorWare directory and pasted the new files into the copy. I then manually copied across my ADC offsets and motor parameters from user.h. Now when I try to spin the motor the red FAULT LED comes on immediately.

  • Sam,

    please copy over the entire \sw directory from my zip

    Please, in lab 3a, set up your user.h for your motor (note the PWM, decimations, and your USER settings in the motor section) and try identifying your motor again. You will also want to double-check the offsets, as they may have changed with the new ADC driver.

    I ran 3a on the DRV8312 kit yesterday - going through ID again - and everything worked fine.

     

  • I still haven't managed to get it going. The full motor identification never worked with this motor and still doesn't, but now Rs identification doesn't work either. Sometimes it comes up with huge numbers, other times tiny numbers.

     

    I'm going to have to carefully compare my two versions to see what's happened. One thing I have noticed (although it shouldn't affect me) is that you've lost a } around line 948 in drv.c for the 8301 kit.

  • Found my problem. Somehow despite having the new drv.c in the 8301 kit directory I still had the old one in the 8312 kit directory. No idea how that happened.

    Having extracted everything again, it's now spinning the motor up. I still have to try the new PWM settings and I need to check whether the speed problem is fixed. But we're heading in the right direction.

    By the way, I stumbled on something about the InstaSPIN quadcopter today. It's quite similar to my project in a lot of ways.

     

  • Woohoo! Glad that worked, you should be able to play with the PWM easily now.

     

    let me check on the 8301 drv file. I know which } you are talking about, but I thought I fixed it.

     

    Yes, the small quadcopter from TI is an internal project one of the apps guys is working on. We have a rotor controlled well; now we just need to put 2 and then 4 together. We'll be releasing everything to do with the project open source so others can share.

  • Thanks for pointing out the missing }; I guess Beyond Compare missed this one as I was merging two different drv.c files into the original drv.c. This is now fixed - I checked the other drv.h as well - and posted on the same thread as a Rev B.

     

  • Bad news I'm afraid. It did work at least once, but most times it's not working. It's not quite the same fault as before - Rs identification is working now. But whenever it goes wrong the red fault light comes on and stays on. It never happens with the old project (even though the code is almost identical). And when it's gone wrong, the only way to get it working again is a complete power cycle and then running the old project.

  • what are your user.h settings? post the file.

    What parameter values does it ID? Does it finish the ID?

     

  • 4452.user.h

     

    It's lab 3a, so it's not trying to ID Ls, but it gets Rs = 0.1261651, which is about right, and the ADC offsets look OK. The flux is also in the right ballpark (around 0.005) but bounces around quite a lot. The main problem is that the speed is bouncing around 0 regardless of the demand. It just can't seem to actually get the motor spinning.

     

    What is the meaning of LED6 being lit? I'm pretty sure that has something to do with it.

     

  • It will certainly ID Ls if you run motor identification. Unless you have loaded your values into user.h and are bypassing motor ID completely.

    LED6 is the fault for the third inverter leg. Are you sure the silver toggle switch is in the MIDDLE position? 

     

  • Your ADC offsets look a little strange. Double-check them; even in lab 3a it will do the offset calculation again and you can view the adcBias results.

    Your user.h has only 15 kHz. I'd up this to 30 kHz immediately.

    Regarding your user parameters:

    Your R/L = 6.7 kHz...what is the expected max speed you will run at?

    Your RES/IND_CURRENT is at 2A...is this a 20A motor? (You only put 5A as the max.) If not, reduce this to ~10% of the current at rated torque. I think this is probably too high, and your Rs should likely ID lower.

    Are you powering this from a 24V bus?

     

     

  • I have loaded my parameters into user.h, and I thought the whole point of lab 3a was that it bypassed Ls identification. gMotorVars.Flag_enableUserParams is 1 on startup for me; I'm not sure where this gets set. I just tried setting it to 0 before running the identification and it came up with 963uH for Ls, which is wrong. I thought we weren't expecting Ls identification to work on motors with R/L > 2000?

     

    The silver toggle switches are all fine. The problem seems to be really intermittent. With gMotorVars.Flag_enableUserParams = 1, I just tried repeatedly flipping Flag_Run_Identify; the first 2 times LED6 came on, and on the 3rd time it stayed off and the motor spun up.

     

  • the point of lab3 is the ability to load from user.h and bypass the entire ID process.

    but you can run ID at any time w/ any of the labs by changing the flag.

    OK, it sounds like you have a current ripple at start-up that is causing the fault sometimes. This is because you are using a 3.5A continuous / 6A peak driver for a motor that can obviously pull more than that, especially at start-up (even if it only draws 1A running).

    You can lower your max current in user.h; this way the Iq controller won't request a current that can fault the system. Lower this to 3A max. I assume you are using a standalone power supply that can supply this amperage? The wall supply in the kit can only provide 2.5A continuous.

    You will also want to increase the PWM; this helps with the current ripple, even on start-up.

    I also think your RES_CURRENT is still too high. If this motor pulls 1A unloaded, then don't use more than 1A for RES_CURRENT.

    And if you enable RsRecalc before you start (which is the default in lab 3a), it should align your stator/rotor fields to minimize that current spike.
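
    Putting those numbers together, a hypothetical user.h snippet for this motor might look like the following (example values only, based on the advice above; the define names match the USER_MOTOR block posted later in this thread):

    #define USER_MOTOR_RES_EST_CURRENT      (1.0)     // example: roughly the current the motor draws unloaded
    #define USER_MOTOR_IND_EST_CURRENT      (-1.0)    // example: negative, just enough to rotate during Ls ID
    #define USER_MOTOR_MAX_CURRENT          (3.0)     // example: keeps the Iq command under the DRV8312 limit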

     

  • Yeah, I've been experimenting with the PWM. Once it actually gets going the higher PWM definitely does make an improvement but it seems to be unrelated to the current intermittent problem.

     

    I'm expecting the motor to spin at about 15,000 rpm (2 kHz) with no load and about 7,500 rpm (1 kHz) under load.

     

    It's a 10A motor. I can certainly turn the RES/IND_CURRENT down a bit.

     

    The ADC offsets I posted are about right for I but are off for V. They're actually reading more like 0.187 (I posted 0.250). But as you say, they are recalibrated in lab 3a, so the values in user.h shouldn't matter.

     

    I'm powering it from an 18V PVDD.

     

  • I've done everything you suggested and it looks like it's done the trick. I think the Ls identification might even be working now (it's coming up with a value very close to my guess, so I just need to make sure it is actually recalculating it).

     

    The really weird thing is that when it's actually running properly I can get 2 krpm out of it with it only drawing 300mA from the supply (which is only about 80mA more than when it's idle). I'm amazed it was ever drawing enough to cause a driver fault, even on startup. I'm using a proper bench PSU and there was no sign of it on the voltage or current readouts. I guess it must just be too slow to react (and I might have had the current limit set too low on it as a result). When you say "if it pulls 1A unloaded then don't use more than 1A for RES_CURRENT", does that mean "if it pulls 250mA unloaded then don't use more than 250mA for RES_CURRENT" too?

     

    Sorry I'm taking up so much of your time but I'm sure I'll have more questions later.


  • good, glad it's working. what kind of Rs and Ls values do you get now?

    There isn't a fixed rule about what current to take an Rs measurement at. In fact, Rs changes under load, and you'll notice that we offer run-time tracking of Rs, called Rs Online, for these occasions. The thing is that Rs has the most effect at low speed and Ls has the most at high speed, so for your application it really won't matter that much in general. But it's better not to overdrive your motor, especially if you are about to do an inductance test during motor ID.

    Ls is also not a constant. In fact, Ls is an average of Ls-d and Ls-q, and this can change a small amount or a great amount over load and speed.

    For your application it's most critical though just to get a good average value.

  • I spoke too soon about the Ls identification. The thing that made me think it was working was that it had changed ever so slightly from my user.h value for some reason. But full identification still can't determine Ls (it normally gets something very small or zero). I'm going to try adding inductance to see if it can get a measurement and then remove it a bit at a time.

     

    The problem converting Hz to RPM is still there. I'm going to check if it's also there with the kit motor. I can't see why it wouldn't be.

     

  • Sam,

    we are running this motor for our quadcopter very well with DRV8312 kit

    http://www.hobbyking.com/hobbyking/store/__6312__hexTronik_5gram_Brushless_Outrunner_2000kv.html

    Because of the kit design we have to run at 16V minimum, so that's what we are doing, but for the actual quadcopter I think we'll be running at 7-8V.

    On the kit we changed out the Voltage sense resistors to scale off of 16.526V to improve our resolution.

    So you have to change in the user.h file

    #define USER_IQ_FULL_SCALE_VOLTAGE_V      (16.0)     

    #define USER_ADC_FULL_SCALE_VOLTAGE_V       (16.526)

    And for this little motor we use 0.75A for RES_CURRENT, -0.5A for IND_CURRENT, and 2A for MAX_CURRENT.

    We also let the motor spin up to 200-400 Hz for inductance estimation.

    I suggest you try something similar for your motor.

    BTW - I can get similar results to what you are seeing by using the higher-current EFLITE 420 on the DRV8312 kit. I'm going to take this over to the 16V-scaled kit and see if it performs better, but it really needs the DRV8301 (rescaled to 16V) and a power supply that can provide up to 25A.

     

  • So you have Ls identification working on the 5g motor? What inductance does it have out of interest?

     

    Yep, I'll certainly look at making those changes and when the BoosterPack is released I'll start using that I think.

     

    When you say that you can get similar results, does that include the Hz-to-RPM scaling issue? It looks to me like it's an error in a conversion ratio somewhere and not a problem driving a particular motor.

     

  • Here are the USER_MOTOR params used for the 5gm motor, with the 16V hardware change. Once running you have to lower the Speed PI controller gains, and as you go faster, lower them even further. We ran this at 60 kHz PWM, 20 kHz control loop.

     

    #elif (USER_MOTOR == My_Motor)                       // Name must match the motor #define

    #define USER_MOTOR_TYPE                 MOTOR_Type_Pm       // Motor_Type_Pm (All Synchronous: BLDC, PMSM, SMPM, IPM) or Motor_Type_Induction (Asynchronous ACI)

    #define USER_MOTOR_NUM_POLE_PAIRS       (2)      //4           // PAIRS, not total poles. Used to calculate user RPM from rotor Hz only

    #define USER_MOTOR_Rr                   (NULL)              // Induction motors only, else NULL

    #define USER_MOTOR_Rs                   (0.8803203)       // Identified phase to neutral in a Y equivalent circuit (Ohms, float)

    #define USER_MOTOR_Ls_d                 (2.28808e-06)    // For PM, Identified average stator inductance  (Henry, float)

    #define USER_MOTOR_Ls_q                 (2.28808e-06)   // For PM, Identified average stator inductance  (Henry, float)

    #define USER_MOTOR_RATED_FLUX           (0.001959046)    // Identified TOTAL flux linkage between the rotor and the stator (Webers = Volts* Seconds)

    #define USER_MOTOR_MAGNETIZING_CURRENT  (NULL)              // Induction motors only, else NULL

    #define USER_MOTOR_RES_EST_CURRENT      (0.75)         // During Motor ID, maximum current (Amperes, float) used for Rs estimation, 10-20% rated current

    #define USER_MOTOR_IND_EST_CURRENT      (-0.25)          // During Motor ID, maximum current (negative Amperes, float) used for Ls estimation, use just enough to enable rotation

    #define USER_MOTOR_MAX_CURRENT          (1.0)            // CRITICAL: Used during ID and run-time, sets a limit on the maximum current command output of the provided Speed PI Controller to the Iq controller

    #define USER_MOTOR_FLUX_EST_FREQ_Hz     (200.0)

     

    don't forget when you change the voltage HW you need to update

    #define USER_IQ_FULL_SCALE_VOLTAGE_V  ()

    #define USER_ADC_FULL_SCALE_VOLTAGE_V       ()    

    #define USER_VOLTAGE_FILTER_POLE_Hz  ()

  • if you are seeing a mismatch between Hz and RPM then

    1) you have the wrong # of poles set in your user.h

    If the motor is clearly running at a different speed than the SpeedEst, then the estimator is completely off, meaning your parameters are incorrect.

     

  • If I look at ADC-Vhb1 (for instance) relative to GND on a scope, measure the frequency, and convert this to krpm using the same values (i.e. multiply by 60 and divide by 8 pole pairs) that are in user.h, then I get a number that is 1.5 times higher than the value in InstaSPIN. Same if I measure the phase-to-phase voltage of an identical motor connected to the shaft. So I think this means the problem isn't the number of pole pairs, because even if it were wrong I'd be using the same wrong number of pole pairs in my calculation.

     

    Maybe it's that the estimator is completely off, but it does appear to be commutating in sync with the motor. Would this be expected?

     

  • Is this with the same user.h you posted earlier, 15 kHz with no decimation?

    RPM = Hz * 120 / poles

    so in your case RPM = Hz * 7.5  if you have 16 poles...does that little motor really have 8 pole pairs?

    if it does, and you command 750 RPM, and aren't seeing 100 Hz on the scope, then something is wrong.
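
    For reference, the same conversion as a tiny sketch (it just applies the RPM = Hz * 120 / poles relation above; the 16-pole numbers match this thread):

    /* RPM <-> electrical Hz, per RPM = Hz * 120 / poles (poles, not pole pairs). */
    #include <stdio.h>

    static float rpm_to_hz(float rpm, int poles) { return rpm * poles / 120.0f; }
    static float hz_to_rpm(float hz, int poles)  { return hz * 120.0f / poles;  }

    int main(void)
    {
        printf("750 RPM, 16 poles -> %.1f Hz\n", rpm_to_hz(750.0f, 16));   /* 100.0 */
        printf("100 Hz, 16 poles  -> %.1f RPM\n", hz_to_rpm(100.0f, 16));  /* 750.0 */
        return 0;
    }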

     

  • 5516.user.h

     

    I'm now using 30 kHz with decimation (see user.h).

     

    If I command 500 RPM I get 100 Hz on the scope. If I command 750 RPM I get 150 Hz on the scope.

     

    The motor has 16 magnets. It has 12 stator teeth with phase A wound on every 3rd tooth as I understand it. How many pole pairs is that?

     

    Is it possible my MCU is running at the wrong clock speed?

     

  • I agree that with a 500 RPM command, that gets converted to a 66.66 Hz command based on your pole setting of 16.

    So for the controller to be running stable at 100 Hz (50% higher) does not seem correct.

    Can you try running 24 KHz PWM with no decimation and see if you still get a mismatch in Hz?

    #define USER_PWM_FREQ_kHz                (24.0)

    #define USER_NUM_PWM_TICKS_PER_ISR_TICK        (1) 

     

  • Yep, exactly the same behaviour.

     

  • Hmmm. We just checked the version _07 that will go out tonight and it is fine: Hz command = Hz on scope = expected RPM. This version _07 doesn't have the additional PWM decimation I posted, so there is a chance there is a bug there....but if you set this to 15-20 kHz with no decimation it shouldn't make any difference.

    I'll try to get on a scope tomorrow and see if I made a mistake with the driver for the decimation, but I'm not seeing any issues when I run with a well-known motor. It's behaving the same.

    BTW - some more data on the hexTronik 5gm motor.

    It is a 4 pole machine

    Rs - 0.88

    Flux = 0.002

    Ls = 16.65uH when ID'd at 16V and 400 Hz (12k RPM)

    Ls = 10uH when ID'd at 16V and 10 Hz (300 RPM)

    PWM at 60 KHz, hardware ADC SOC decimation to 20 KHz inner control

    Must set this #define USER_MAX_ACCEL_rps2                 (8.0)

    to allow it to ramp fast enough during ID if trying to go to 400 Hz

    Can run about 30 kRPM @ 16V.

    We ran it up to 110 kRPM at higher voltage and  the prop cracked :)

     

  • I finally got around to trying it with the kit motor, and the Hz command = Hz on scope (whether I use decimation or not). So I guess it must be a problem with my motor parameters, but I'm really surprised that it runs as well as it does if it's that far out.

  • Sam,

    I just did some extensive testing with the _6b release, and there are some issues, especially with Motor ID.  I deleted the file from e2e.

    The official _07 just got pushed from our side, so it should replicate tonight. It includes the ADC driver fix but not the new  PWM_TICK.  I suggest for now you go back to the _07.  I'm going to do some testing with _07 tomorrow and decimating just in user.h and see what I find.

    I can show you a way to set up the PWM driver to do the ADC SOC event directly, but I was trying to get that abstracted into user.h/drv.c/.h so it's transparent....I'm not sure that worked.

  • Sam,

    Back to your Hz comment: you obviously have set 50% more poles.

    Ex: Using the motor that comes with the DRV8312 kit (which is 8 poles), if I set it up with 12 poles (50% more), then when I command 1000 RPM the controller will see: 1000 RPM * 12 poles / 120 = 100 Hz command.

    The voltage waveform should be at 100 Hz, and your actual RPM would be 100 Hz * 120 / 8 poles = 1500 RPM.

    In your example you think you are commanding 100 Hz based on the RPM you input, but you are really commanding 66 Hz.

    Make sense?
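
    The arithmetic from that example, spelled out (the 12-pole setting is hypothetical, as in the illustration above; the kit motor is physically 8 poles):

    /* Shows how a wrong pole count in user.h skews the commanded frequency.   */
    #include <stdio.h>

    int main(void)
    {
        const int poles_cfg  = 12;   /* what user.h (hypothetically) says      */
        const int poles_real = 8;    /* what the motor actually is             */
        float cmd_rpm  = 1000.0f;
        float cmd_hz   = cmd_rpm * poles_cfg / 120.0f;   /* 100 Hz on the scope */
        float real_rpm = cmd_hz * 120.0f / poles_real;   /* 1500 RPM at shaft   */
        printf("%.0f Hz commanded, %.0f RPM actual (%.2fx requested)\n",
               cmd_hz, real_rpm, real_rpm / cmd_rpm);    /* prints 1.50x        */
        return 0;
    }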