Motor startup delay DRV8301

Other Parts Discussed in Thread: INSTASPIN-BLDC, CONTROLSUITE, DRV8301, DRV8302

Hi TI,

I have a question regarding the startup procedure of a sensorless BLDC motor on the DRV8301-HC-EVM kit. It seems to take a while for the motor to start spinning from the point the "enable" signal goes high: roughly 150-200 ms. See attached picture. (The internal F28035 signals (reference, enable, and speed) are sent on the CAN bus and evaluated in a CAN-bus analyzer program. The Hall sensors fitted to our particular motor are used solely for logging the speed, as shown; the control itself is sensorless.)

From what I have found, this period is more or less the same whether you run the velocity, current, or cascaded loop. I have tried to adjust the parameters in the GUI but have not seen any improvement. I have also studied the code in CCS4, "BLDC_Int_GUI_DRV83xx.c", but I cannot see where the time is consumed. I have even configured a CPU timer to measure the time elapsed in the statement in the A1 function starting with

if((Gui.EnableFlag == TRUE) && (RunBLDC_Int == FALSE))

which, from what I can see, is the block of code that runs when the "enable" button is pressed. It takes roughly 1/100th of a second, which is negligible. My guess had been that this was where the time was consumed, since I cannot find any other obvious piece of startup code.
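For reference, a minimal sketch of that kind of cycle-count measurement, assuming the standard CpuTimer0Regs definitions from the C2000 device-support headers (the register usage here is illustrative, not the project's exact code):

// Sketch: count elapsed CPU cycles around a block of code with CPU Timer 0.
// Assumes the DSP2803x device-support headers (CpuTimer0Regs, Uint32).
CpuTimer0Regs.TCR.bit.TSS = 1;           // stop the timer while configuring
CpuTimer0Regs.PRD.all     = 0xFFFFFFFF;  // maximum period
CpuTimer0Regs.TCR.bit.TRB = 1;           // reload the counter from PRD
CpuTimer0Regs.TCR.bit.TSS = 0;           // start counting down at SYSCLKOUT

// ... code under test ...

Uint32 cycles = 0xFFFFFFFF - CpuTimer0Regs.TIM.all;  // cycles consumed
// On a 60 MHz F28035, time in seconds = cycles / 60e6.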

The application I am working on demands a short startup time, preferably well below 100 ms. I have seen sensorless motors/controllers with such startup times, but those lack several of the benefits the TI kit has. That is why I would like to know whether it is possible to decrease the startup time in the existing code.

Best regards

Anton

If this question should have been posted elsewhere, I apologise, and I encourage any moderator to move it to a more suitable location.

  • Most of this delay is because the InstaSPIN-BLDC project runs an ADC offset calibration on every start. In our InstaSPIN-FOC solution we built in the logic to save the current calibration and store it in the user.h file for immediate loading by the project.

    You can do something similar with InstaSPIN-BLDC. To start, you can change the code to skip calibration, but you should capture the actual offset values and hard-code them first (see the sketch after the calibration code below). As you take your software to a product, you may want the calibration function to run on demand (for example, when a button is pushed), store the resulting values in Flash or EEPROM, and load them from there in the future. That way you have a calibration for each unique piece of hardware.

     if(CALIBRATE_FLAG)
     {
         // -------------------------------------------------------------
         // ADC conversion and offset adjustment
         // -------------------------------------------------------------
         iqVaIn  = _IQ15toIQ((AdcResult.ADCRESULT1<<3)) - InstaSPIN_BLDC1.VaOffset;
         iqVbIn  = _IQ15toIQ((AdcResult.ADCRESULT2<<3)) - InstaSPIN_BLDC1.VbOffset;
         iqVcIn  = _IQ15toIQ((AdcResult.ADCRESULT3<<3)) - InstaSPIN_BLDC1.VcOffset;
         iqIA    = (_IQ15toIQ(AdcResult.ADCRESULT4<<3) - IA_offset) << 1;
         IDCfdbk = (_IQ15toIQ(AdcResult.ADCRESULT5<<3) - IDC_offset) << 1;

         // -------------------------------------------------------------
         // LPF to average the calibration offsets.
         // Use the offsets calculated here to initialize BemfA_offset,
         // BemfB_offset and BemfC_offset so that they are used for the
         // remaining build levels.
         // -------------------------------------------------------------
         InstaSPIN_BLDC1.VaOffset = _IQmpy(cal_filt_gain, iqVaIn) + InstaSPIN_BLDC1.VaOffset;
         InstaSPIN_BLDC1.VbOffset = _IQmpy(cal_filt_gain, iqVbIn) + InstaSPIN_BLDC1.VbOffset;
         InstaSPIN_BLDC1.VcOffset = _IQmpy(cal_filt_gain, iqVcIn) + InstaSPIN_BLDC1.VcOffset;
         IA_offset  = _IQmpy(cal_filt_gain, iqIA) + IA_offset;
         IDC_offset = _IQmpy(cal_filt_gain, IDCfdbk) + IDC_offset;

         // -------------------------------------------------------------
         // Force all PWMs to 0% duty cycle
         // -------------------------------------------------------------
         PHASE_A_ON;
         PHASE_B_ON;
         PHASE_C_ON;

         EPwm1Regs.CMPA.half.CMPA = 0;  // PWM 1A - Phase A
         EPwm2Regs.CMPA.half.CMPA = 0;  // PWM 2A - Phase B
         EPwm3Regs.CMPA.half.CMPA = 0;  // PWM 3A - Phase C

         // Count calibration passes; once the increment wraps the counter
         // past CALIBRATE_TIME, the mask clears it to zero and the whole
         // block stops executing.
         CALIBRATE_FLAG++;
         CALIBRATE_FLAG &= CALIBRATE_TIME;
     }
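    A minimal sketch of the skip-calibration approach described above. The function name and numeric offsets are placeholders, not values from this thread; capture your own by letting the calibration loop converge and reading the variables in a debugger. It assumes the project's globals (InstaSPIN_BLDC1, IA_offset, IDC_offset, CALIBRATE_FLAG) are visible at this point.

     void LoadStoredOffsets(void)
     {
         // Placeholder values: replace with the converged offsets measured
         // on this exact board (or read them back from Flash/EEPROM).
         InstaSPIN_BLDC1.VaOffset = _IQ(0.007);
         InstaSPIN_BLDC1.VbOffset = _IQ(0.007);
         InstaSPIN_BLDC1.VcOffset = _IQ(0.007);
         IA_offset                = _IQ(0.500);
         IDC_offset               = _IQ(0.500);

         CALIBRATE_FLAG = 0;  // the ISR guards calibration with
                              // if(CALIBRATE_FLAG), so zero skips it
     }

    Calling this once during initialization removes the calibration passes from the start-up path entirely.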

  • Thank you very much Chris! I will investigate the possibility of storing the offset values and see how it works.

    Thanks once again, very much appreciated.

  • BTW - this is already taken care of and explained if you work through the standard project.

    Ex: C:\ti\controlSUITE\development_kits\DRV8312-C2-KIT_v128\InstaSPIN_BLDC

    It is only in the GUI project that this calibration runs every time from start-up.

  • Hi again,

    I figured out the offset values and assigned 

    InstaSPIN_BLDC1.VaOffset 
    InstaSPIN_BLDC1.VbOffset 
    InstaSPIN_BLDC1.VcOffset 
    IA_offset
    IDC_offset

    with these values statically and commented out the entire if-statement you showed above. Unfortunately it didn't make any major difference; I still have a delay of ~180 ms from the point the enable signal goes high until the motor actually starts spinning. What else is taking time during (or, more specifically, before) the startup?

    Any input is very much appreciated.

    Best regards 
    Anton

  • Where are you starting your measurement from?

    Meaning, are you including the power-up/boot time of the processor, or are you timing just from the enable flag being set to the first PWM pulse?

    It's the time from when the enable flag goes high until the motor actually starts spinning; see the attached picture in my first post. So it does not include the power-up/boot time of the processor, but it does include enabling the DRV8301 driver. There is a short delay after enabling the driver:

    GpioDataRegs.GPBSET.bit.GPIO39 = 1;

    DELAY_US(50000); //delay to allow DRV830x supplies to ramp up

    but from what I can see there is nothing else that should take a substantial amount of time apart from this delay, which is not extremely long in the first place (50 ms).

  • On another note:

    How much of the BLDC code, in terms of actually running the motor, is "under the hood", i.e. unavailable for me to study? For example, I cannot even find where the ADC interrupt is configured.

    BR Anton

  • Only the commutation estimation is in the library. 

    Regarding the timing, my first recommendation is to stop using the GUI project (as I mentioned above).

    That 50 ms delay is for the DRV8301. When the EN_GATE pin is de-asserted, the gate-driver portion of the DRV8301 is basically off, including any of the internal supplies used by the gate-drive stage. When EN_GATE is asserted, the DRV8301's internal supplies need to ramp up before the chip will perform any of its gate-drive functions or respond to SPI commands. Back when we were developing this there was no published spec for power-supply ramp-up time, so we did some testing and settled on 50 ms as a safe delay from the time we enable the chip to when we start sending SPI configuration data. Maybe there is a published spec now, but we haven't checked. Also, I think we were being pretty conservative when we picked 50 ms; someone who watches the DRV8301 supplies come up may be able to find a less conservative delay that works for them.

     

    Once the DRV8301 is powered up there is going to be a little more delay while the SPI commands are sent to configure it. I've never quantified that delay, so I don't know how long it takes. A possible work-around for the delays associated with enabling and configuring the DRV8301 would be to do it just once at power-up. The drawback would be if you needed very low power during the time the motor is off.
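    A sketch of that work-around, assuming the GPIO39/EN_GATE wiring and DELAY_US macro shown earlier in this thread; DRV8301_SPI_Config() is a placeholder for whatever SPI configuration calls the project actually makes:

     extern void DRV8301_SPI_Config(void);  // placeholder for the project's
                                            // actual SPI register writes

     // Pay the DRV8301 enable cost once at power-up, not on every start.
     void DRV8301_EnableOnce(void)
     {
         GpioDataRegs.GPBSET.bit.GPIO39 = 1;  // assert EN_GATE
         DELAY_US(50000);                     // let the internal supplies ramp
         DRV8301_SPI_Config();                // then configure over SPI
     }

    Motor start and stop then only toggle the PWMs and the control loop, so the 50 ms supply-ramp delay is no longer in the start path.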

     

    I'm not sure why commenting out the calibration routine made so little difference. That routine should run for 2047 cycles of the 20 kHz ISR, which works out to about 102 ms.
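    For reference, the 102 ms figure follows from the counter mask in the calibration code, assuming CALIBRATE_TIME is 0x7FF (2047) and the counter starts at 1; the macro's actual value is not shown in this thread:

     // Assuming CALIBRATE_TIME == 0x7FF (2047) and CALIBRATE_FLAG starts at 1:
     //   CALIBRATE_FLAG++;                  counts 1, 2, ..., 2047
     //   CALIBRATE_FLAG &= CALIBRATE_TIME;  (2047 + 1) & 0x7FF == 0
     // Once the mask clears the counter, if(CALIBRATE_FLAG) is false and
     // calibration stops: 2047 passes / 20000 Hz = ~102.4 ms.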

     

    Also, the EnableFlag variable is checked in the A1 background task, which the comment says executes every 1 ms. I've never verified that, but it should be pretty quick. This EVM runs with a Crosshairs GUI, and some Crosshairs code runs in a function called ServiceRoutine(), also as a background function; this takes some time and adds some delays. Again, use the non-GUI project.
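    A sketch of why that polling adds latency (the 1 ms task structure is assumed from the usual TI example framework, not confirmed here):

     // EnableFlag is only polled once per ~1 ms background pass, so in the
     // worst case start-up begins almost a full millisecond after the GUI
     // sets the flag.
     void A1(void)  // reached roughly every 1 ms from the background loop
     {
         if ((Gui.EnableFlag == TRUE) && (RunBLDC_Int == FALSE))
         {
             // motor start-up sequence runs from here
         }
     }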

     

  • I've now configured the non-GUI project the way I want it and changed it so that the DRV8301 is enabled from power-up. The performance is much better; the motor gets up to speed within 60-80 ms, which is acceptable in our application.

    Thanks for your feedback Chris.

    BR Anton

  • Dear Sir

    Does the DRV8302 also need the 50 ms startup delay?

    Regards,

    Hank

  • Not sure; please ask on the motor drivers forum:

    http://e2e.ti.com/support/applications/motor_drivers/default.aspx