CCS/BOOSTXL-DRV8323RH: Changes in Instaspin-FOC to use DRV8323RH?

Part Number: BOOSTXL-DRV8323RH
Other Parts Discussed in Thread: DRV8301, MOTORWARE, DRV8323

Tool/software: Code Composer Studio

Hi,

I am trying to migrate from DRV8301+F28027F to DRV8323RH+boostxldrv8323_package_mw.

I found advice on this forum on how to amend MotorWare with the boostxldrv8323_package_mw.zip files, which I did successfully.

The problem I am facing now is that this package is designed for SPI communication between the MCU and the DRV8323RS.

What changes in hal.* or lab11a.c should I make to use the DRV8323RH?

Currently, InstaSPIN is sort of running; it senses the voltage and accepts the reference speed, but it clamps the ENABLE pin to ground.

The nFAULT pin is HIGH (which is good).

Regards,

Edward.

  • You have to change a number of hardware and software items for the DRV8323RH. Though I have not tested these modifications myself, they should work.

    [BOOSTXL-DRV8323RH]

    1) Populate the following three capacitors for phase voltage feedback if they are not populated on the board.

    -  C9, C10, C11 -> 0.1uF

    2) The DRV8323RH's GAIN pin (32) should be left Hi-Z so that the amplifier gain is 20 V/V, which is the default setting in the DRV8323RS firmware. To do this, carefully cut the trace between the DRV8323RH GAIN pin (32) and J4 pin (18). This trace is on the bottom side of the PCB. Please refer to the picture below.

    [hal.c]

    1) Change the GPIO16 setting from the SPI function to GPIO output in HAL_setupGpios(), so it can drive the DRV8323's MODE pin

     - before: GPIO_setMode(obj->gpioHandle,GPIO_Number_16,GPIO_16_Mode_SPISIMOA);

     - after:

     GPIO_setMode(obj->gpioHandle,GPIO_Number_16,GPIO_16_Mode_GeneralPurpose);

     GPIO_setLow(obj->gpioHandle,GPIO_Number_16);  //default MODE = 6PWM

     GPIO_setDirection(obj->gpioHandle,GPIO_Number_16,GPIO_Direction_Output);

    [hal.h]

    1) Disable the DRV8323_SPI define as follows to remove all of the SPI functions from lab11a.c

     - before: #define DRV8323_SPI

     - after: //#define DRV8323_SPI

    [drv8323.c]

    1) Remove the following lines from the DRV8323_enable() function, because the SPI interface is not available on the DRV8323RH

    while(((DRV8323_readSpi(handle, Address_Status_0) & DRV8323_STATUS00_FAULT_BITS) != 0) && (enableWaitTimeOut < 1000))

     {

         if (++enableWaitTimeOut > 999)

         {

             obj->enableTimeOut = true;

         }

     }
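    To see why that loop must go, consider a small host-side simulation of the same timeout logic (a sketch; the bit-mask value and the stub function name are my own placeholders, not the MotorWare definitions). With no SPI wired up, the status read can never report fault-free, so the loop always spins to the 1000-iteration limit and latches the timeout flag:

    ```c
    #include <stdbool.h>
    #include <stdio.h>

    /* Placeholder mask; the real value lives in drv8323.h */
    #define DRV8323_STATUS00_FAULT_BITS (1 << 10)

    /* Stub: with the RH variant there is no SPI, so model the read as
       permanently reporting a fault. */
    static unsigned int DRV8323_readSpi_stub(void)
    {
        return DRV8323_STATUS00_FAULT_BITS;
    }

    int main(void)
    {
        bool enableTimeOut = false;
        unsigned int enableWaitTimeOut = 0;

        /* Same structure as the loop removed from DRV8323_enable() */
        while (((DRV8323_readSpi_stub() & DRV8323_STATUS00_FAULT_BITS) != 0)
               && (enableWaitTimeOut < 1000))
        {
            if (++enableWaitTimeOut > 999)
            {
                enableTimeOut = true;
            }
        }

        printf("enableWaitTimeOut=%u enableTimeOut=%d\n",
               enableWaitTimeOut, enableTimeOut);
        return 0;
    }
    ```

    Every call to enable the driver would therefore burn the full wait and flag a (spurious) timeout, so removing the loop is the right fix for the RH.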

    [lab11a.c]

    1) Add the following code right before the for(;;) loop to enable the driver

     // turn on the DRV8323 if present

     HAL_enableDrv(halHandle);

      // Begin the background loop

     for(;;)

    {

      // ... background loop body ...

    }

    For reference, you might need to modify the external resistor values on the DRV8323RH BoosterPack if you want to change the IDRIVE and VDS settings. But it should operate with the default settings under normal conditions.

  • Hi Steve,

    Thank you very much. It was exactly what I was looking for, even better.

    I will try it and report back on results with schematics etc.

    Regards,

    Edward.

  • Hi Steve,

    Thank you very much!

    I have made a custom PCB with F28027F and DRV8323RH, performed changes you suggested and it works!

    I still have some questions regarding the tuning.

    I am controlling a relatively small, low-inductance hobby-grade motor (B28-47-26S).

    24V DC power supply.

    Motor works at 5 to 9V AC (phase to phase).

    I could not run lab02c to determine motor parameters.

    lab02b worked. Parameters were determined, different every time, but I found a set that works for me with lab11a.

    The motor gets hot very quickly with no load attached.

    It was not like that with DRV8301.

    What should I look at?

    Reduce the ratio of the voltage dividers for VSENx (and adjust the user.h parameters accordingly)?

    Reduce the DRV8323 output current limits?

    Increase the sense resistors (and adjust the user.h parameters accordingly)?

    Change the PWM frequency (not sure in which direction)?

    Or something else?

    Thank you for your support.

    I hope that when this project takes off, it will make a good showcase ;).

    Regards,

    Edward.

  • Did you check the custom board with lab1b and lab1c? If not, I recommend running those labs first to check the hardware integrity.

    You should carefully design the feedback circuits for low-inductance, high-speed motors like yours. The scale ratios of the voltage and current feedback should be optimized to make full use of the ADC input range.

    The PWM frequency should also be high enough to minimize phase current ripple for low-inductance motors; it is set with the following two parameters.

    #define USER_PWM_FREQ_kHz                (30.0)   //30~45KHz

    #define USER_NUM_PWM_TICKS_PER_ISR_TICK (3) //ISR frequency might be 10~15KHz
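    To make the relationship between those two defines concrete, here is a small sketch (using the example values above) of how MotorWare derives the control ISR frequency: the ISR runs once every USER_NUM_PWM_TICKS_PER_ISR_TICK PWM periods, so the ISR frequency is the PWM frequency divided by that tick count.

    ```c
    #include <stdio.h>

    /* Example values from the defines above */
    #define USER_PWM_FREQ_kHz               (30.0)
    #define USER_NUM_PWM_TICKS_PER_ISR_TICK (3)

    int main(void)
    {
        /* The control ISR fires once per N PWM periods */
        double isrFreq_kHz = USER_PWM_FREQ_kHz / USER_NUM_PWM_TICKS_PER_ISR_TICK;

        printf("ISR frequency = %.1f kHz\n", isrFreq_kHz);
        return 0;
    }
    ```

    With 30 kHz PWM and 3 ticks per ISR, the control loop runs at 10 kHz, which stays inside the 10-15 kHz range noted in the comment.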

    Here is a summary.

    1. Recheck the feedback circuits to confirm the scale ratios are optimized. (Sizing the full-scale range to 120% of the maximum expected value is a good initial design.)

    2. Recheck the user.h parameters related to your hardware, such as the following defines:

      USER_ADC_FULL_SCALE_VOLTAGE_V

      USER_ADC_FULL_SCALE_CURRENT_A

      USER_VOLTAGE_FILTER_POLE_Hz

    3. Set the PWM frequency as high as possible for low inductance. (But make sure the control frequency is limited to <15 kHz on the F2802xF.)

    4. Before closed-loop tests, check hardware integrity with the open-loop voltage test (lab1b).

    5. If the voltage and current feedback are OK in lab1b, current control should also be checked without using the observer (FAST); lab1c does this. For lab1c, you can use the motor parameters from the motor datasheet.

    6. If all the above steps complete without issue, you can run motor identification with lab2c. There are some parameters in user.h for motor identification. Especially for a high-speed motor, USER_MOTOR_FLUX_EST_FREQ_Hz should be high enough to produce measurable BEMF at that frequency.

    7. Do not worry if motor identification does not work properly even though steps 1 to 5 are all good. Motor identification sometimes fails with unusual motors. In that case, if you have the motor datasheet, you can skip identification and run lab10a or lab11a with the motor parameters from the datasheet.
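    As an illustration of steps 1 and 2, here is a sketch of the scaling arithmetic behind those user.h defines. All the hardware numbers here are assumptions for illustration (24 V bus, 3.3 V ADC full scale, 20 V/V amplifier gain per the GAIN pin note above, and a 0.007 Ω shunt); substitute your board's actual values.

    ```c
    #include <stdio.h>

    int main(void)
    {
        /* Assumed hardware values -- replace with your board's actual parts */
        const double vBus_V     = 24.0;   /* DC bus voltage */
        const double headroom   = 1.2;    /* 120% of max, per step 1 */
        const double vAdcFull_V = 3.3;    /* ADC full-scale input */
        const double csaGain_VV = 20.0;   /* amplifier gain (GAIN pin Hi-Z) */
        const double rSense_Ohm = 0.007;  /* assumed shunt value */

        /* Voltage feedback: divider ratio so 120% of Vbus maps to ADC full scale */
        double dividerRatio = vAdcFull_V / (vBus_V * headroom);

        /* Current feedback: the total current span that maps across the
           full ADC range, i.e. USER_ADC_FULL_SCALE_CURRENT_A */
        double fullScaleCurrent_A = vAdcFull_V / (csaGain_VV * rSense_Ohm);

        printf("USER_ADC_FULL_SCALE_VOLTAGE_V = %.1f\n", vBus_V * headroom);
        printf("divider ratio = %.4f\n", dividerRatio);
        printf("USER_ADC_FULL_SCALE_CURRENT_A = %.2f\n", fullScaleCurrent_A);
        return 0;
    }
    ```

    The point of the 120% headroom is that the largest expected signal lands near, but not at, the top of the ADC range, so you keep resolution without clipping.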

    I hope this guide helps to resolve your issues.