DRV8323: Offset calibration issues

Part Number: DRV8323

I am experiencing an issue running offset calibration on the DRV8323S. I pull the CAL pin high, wait around 2 ms for the driver's internal calibration to complete, perform my own calibration by averaging 4096 ADC samples over around 122 ms, and then pull the CAL pin low again. At that point the voltage at the SOx pin changes, so the offset calibration cannot remove the offset. I am using the MOSFET VDS sense mode described on pages 44 and 45 of the datasheet.
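
For context, my calibration sequence looks roughly like the following (a simplified sketch; gpio_write(), adc_read_raw() and delay_ms() are placeholders for my platform-specific HAL, and the pin/channel names are made up):

#include <stdint.h>

#define CAL_SAMPLES 4096u

static uint16_t soa_offset_counts;

// Simplified offset calibration: enter calibration mode via the CAL pin, average
// the SOA readings, then leave calibration mode. SOB/SOC are handled the same way.
static void run_offset_calibration(void)
{
    gpio_write(PIN_DRV_CAL, 1);            // CAL high: CSA inputs shorted internally
    delay_ms(2);                           // wait for the internal calibration to settle

    uint32_t acc = 0;
    for (uint32_t i = 0; i < CAL_SAMPLES; i++) {   // ~122 ms of averaging at my sample rate
        acc += adc_read_raw(ADC_CH_SOA);
    }
    soa_offset_counts = (uint16_t)(acc / CAL_SAMPLES);

    gpio_write(PIN_DRV_CAL, 0);            // CAL low: this is where the SOA voltage drops
}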

I have attached a screenshot showing the CAL pin (CH3) and the SOA voltage (CH1). The CAL pin is pulled high around 12 ms after the device is enabled, the software calibration sequence is performed, and then CAL is pulled low and the low-side MOSFETs are enabled. The problem is that at this point the voltage at the SOA pin drops slightly, which causes an offset in the measured current. The SOA voltage looks "noisy"; this is due to the ADC internal sampling capacitor charging up and is expected.

[Screenshots: the entire waveform, and a zoomed-in view of the point where CAL is pulled low]

I have tried the following steps to resolve the issue, with no luck:

  • Using SPI to set the calibration bits (see the sketch after this list)
  • Reading back the registers over SPI to confirm they were written successfully
  • Setting IDRIVE to the minimum values
  • Performing the calibration with the low-side FETs turned on
  • Performing the calibration with the CAL pin low and all FETs off (the effect is the same as when the CAL pin is high)
  • Calibrating with Rds(on) current sensing disabled and enabling it afterwards
  • Calibrating with and without a motor connected
  • Verifying that the VREF voltage is stable (3.30 V)
  • Verifying that the VCP - VM voltage is stable (11.0 V)
  • Verifying that the SNA - SHA voltage (VDS) is zero (0.0 mV)
  • Testing two different boards; both have the same issue
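
Regarding the first item in the list above, the SPI-based calibration I tried looks roughly like this (a sketch; the CSA_CAL_A/B/C bit positions are taken from my reading of the CSA Control register map, so please correct me if I have them wrong):

// CSA Control register (address 0x06) word used for normal operation
// (same value as in the SPI command list further below).
#define CSA_CTRL_RUN  0b0011011111100000u
// Same word with the CSA_CAL_A/B/C bits (data bits 4:2) set to short the amplifier inputs.
#define CSA_CTRL_CAL  (CSA_CTRL_RUN | (0b111u << 2))

drv8323communicate(CSA_CTRL_CAL, spi_buf);  // enter calibration mode via SPI
// ... average the 4096 ADC samples here ...
drv8323communicate(CSA_CTRL_RUN, spi_buf);  // exit calibration mode (the SOx shift still occurs)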

The shift in SOx voltage after calibration happens on all three of the CSAs.

The following commands are sent to the device via SPI before doing the calibration:

drv8323communicate(0b0001101110001000, spi_buf); // unlock registers, set high-side gate drive current
drv8323communicate(0b0010000010001000, spi_buf); // set low-side gate drive current
drv8323communicate(0b0001000000100001, spi_buf); // set 3x PWM mode, clear fault
drv8323communicate(0b0010100000000000, spi_buf); // OCP deglitch 2 us, OCP level 0.06 V, dead time 50 ns, latched OCP fault
drv8323communicate(0b0011011111100000, spi_buf); // Rds(on) current sensing, 40 V/V CSA gain
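
For reference, each 16-bit word above is packed as a write bit, a 4-bit register address, and 11 data bits; a hypothetical helper equivalent to what I am doing by hand would be:

#include <stdint.h>

// Hypothetical helper: pack a DRV8323 SPI write frame
// (bit 15 = R/W, 0 for a write; bits 14:11 = register address; bits 10:0 = data).
static inline uint16_t drv8323_write_frame(uint8_t addr, uint16_t data)
{
    return (uint16_t)(((uint16_t)(addr & 0x0Fu) << 11) | (data & 0x07FFu));
}

// Example: the CSA Control write above is equivalent to
// drv8323communicate(drv8323_write_frame(0x06, 0b11111100000), spi_buf);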

While trying to find solutions, I also noticed that the CSA calibration has more error if I shorten the 12ms delay between enabling the device and pulling the CAL pin high. Increasing the delay past 12ms does not seem to increase the calibration accuracy.

I am using BSZ0901NS MOSFETs for the Rds(on) current sensing, in case it matters. I have verified that there is no ringing or overshoot on the gate or drain pins of the MOSFETs with the 260/520 mA IDRIVE setting. The Rds(on) of this MOSFET is 1.7 mΩ at 25 °C and 2.7 mΩ at 150 °C, and I am planning to perform software gain compensation based on a temperature sensor.

From the tests I performed, it seems that the internal shorting of the amplifier inputs is not working properly (the shift also occurs when all FETs are off and the amplifier inputs are shorted internally), and this is causing the issue. Unfortunately, the many workarounds I have tried have all failed, and I can't think of any more.

Is this the root cause of the problems I am seeing, and can anyone suggest a way to fix it so that I can properly perform offset calibration?

  • Hi Andrew, 

    Thank you for your question and for using our forum!

    Please let me look into the information you've provided and follow up this week with a further response on the matter.

    Best Regards, 

    -Joshua

  • Hi Joshua,

    Any updates? I am still not able to perform the offset calibration properly.

    Regards,

    Andrew

  • Hi Andrew,  

    I apologize for the long response time, as this week includes a major US holiday. However, I am still looking into this inquiry and will provide feedback by Monday.

    Please look forward to a further response.

    Best Regards, 

    -Joshua 

  • Hi again Andrew,

    Providing a status update to let you know I'm still discussing this internally and aim to follow up before the weekend, especially since it has already been quite a while.

    Please look forward to a response.

    Best Regards, 

    -Joshua

  • Hi Joshua,

    Any progress on this issue?

    Regards,

    Andrew

  • Hi Andrew, sorry for the long delay.

    Please expect my response within the next day.

    Thank you dearly for your patience. 

    Best Regards,

    -Joshua

  • Hi Andrew, 

    I apologize for my leave of absence and wanted to follow up on this inquiry:

    We have discussed offset calibration behavior in this E2E thread: (https://e2e.ti.com/support/motor-drivers-group/motor-drivers/f/motor-drivers-forum/921513/drv8323-drv8323-auto-offset-calibration), and we believe the advice given there, to store this offset in the MCU and correct for any abnormalities, might be the best path forward.

    I would also like to note that offset calibration values will differ from calibration to calibration regardless, and this should not cause significant issues as long as the calibration is performed under unchanging system conditions.

    Best Regards,

    -Joshua

  • Hi Joshua,

    I am already storing the average of the MCU ADC readings taken while in the offset calibration phase, and subtracting this offset afterwards to get the current.

    However, since the output of the CSA on the DRV8323 changes (by around 10 mV, varying from run to run) when calibration mode is exited, even with no current flowing through the phase, this calibration method is not effective at removing the offset and some residual offset remains.

    The system conditions do not change in the tests performed, so this is not the cause of the offset calibration issue.

    Assuming a 10 mV output voltage offset of the CSA after performing calibration, the offset in the measured current with 40 V/V gain is 10 mV / 40 / 1.7 mΩ ≈ 147 mA. I am aiming for less than 100 mA of offset in my application. Is this achievable with the DRV8323? Do you have any advice on how to further reduce the offset?
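
    For clarity, this is the conversion I am using (a minimal helper; the numbers match the example above):

    // Convert a residual CSA output offset [V] into an equivalent current error [A].
    static float residual_current_offset(float v_offset, float csa_gain, float rdson)
    {
        return v_offset / csa_gain / rdson;   // e.g. 0.010f / 40.0f / 0.0017f ≈ 0.147 A
    }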

    Regards,

    Andrew

  • Hi Andrew,

    I apologize for the delay; Joshua is out of the office until the end of next week. He will provide an update once he is back.

    Regards,

    Anthony Lodi

  • Hi Andrew, 

    Thank you for your patience during my leave.

    This is a very interesting circumstance. One additional suggestion I have is to set the CSA_CAL_X bits high and low over SPI instead of toggling the external CAL input (if you are not doing so already), and to try running the calibration for much longer than 12 ms (try 1-2 s to observe whether any change is noticeable).

    The input offset should only be within ±4 mV, so I do wonder whether additional noise is being added through the circuit during the calibration. Is the entire system isolated (only the driver connected/powered) on the board while this calibration is taking place? That could be another potential factor.

    Best Regards, 

    -Joshua

  • Hi Joshua,

    I have tried running the calibration for a longer time (1 to 5 seconds), but the behavior is the same as before: the output voltage of the CSA still shifts when calibration mode is exited. I have tried using the CAL pin, setting the CSA_CAL_X bits in the register, and doing both simultaneously, but none of these has any impact on this behavior.

    Unfortunately it is not possible to fully isolate the driver from all noise sources, but I have tried my best to minimize the number of powered components. The only things connected/powered during the test are the buck converters for the microcontroller, the LDO for the microcontroller's analog supply, and the debugger connecting the microcontroller to my computer. Even if one of these were introducing additional noise into the CSA, it would not explain why the output shifts when calibration mode is exited, because the shift is still visible after averaging out any noise.

    Regards,

    Andrew

  • Hi Andrew, 

    Thank you for your response and clarifications.

    It seems likely that the offset in this system cannot be lowered past this threshold by typical means, and that the best course of action is to have the MCU automatically store the offset measured during each device calibration and shift the CSA readings by that amount, as discussed previously.

    I believe you have tried storing this offset in the MCU before, but what was the lowest offset you were able to achieve previously?

    Is it possible in your code to take an average of the calibration offset over 1 ms or less each cycle and have the MCU adjust the CSA measurements accordingly? While this adds to your total system setup time, it might be the most effective course of action.

    Best Regards,

    -Joshua

  • Hi Joshua,

    Currently, at each startup, I average 4096 samples while in calibration mode (entered by setting CSA_CAL_X; this process takes around 122 ms) and subtract this value from the ADC readings of the CSA to get the current. I am not storing any calibration values between power cycles.

    The average ADC value obtained during the offset calibration process is:

    Chip 1: 2048 (SOA), 2053 (SOB), 2032 (SOC)

    Chip 2: 2056 (SOA), 2047 (SOB), 2061 (SOC)

    Chip 3: 2034 (SOA), 2042 (SOB), 2062 (SOC)

    Chip 4: 2045 (SOA), 2027 (SOB), 2052 (SOC)

    Chip 5: 2064 (SOA), 2051 (SOB), 2052 (SOC)

    Chip 6: 2022 (SOA), 2055 (SOB), 2029 (SOC)

    The residual offset that remains after offset calibration, obtained by subtracting the average value recorded during calibration (above) from the ADC reading taken afterwards:

    Chip 1: -12 (SOA), -15 (SOB), -11 (SOC)

    Chip 2: -16 (SOA), -13 (SOB), -11 (SOC)

    Chip 3: -15 (SOA), -13 (SOB), -14 (SOC)

    Chip 4: -12 (SOA), -14 (SOB), -13 (SOC)

    Chip 5: -9 (SOA), -12 (SOB), -11 (SOC)

    Chip 6: -17 (SOA), -13 (SOB), -12 (SOC)

    Note that the above results are in ADC counts, using a 12-bit ADC with a 3.3 V reference voltage. The residual offset was obtained by averaging 1000 samples. If I power cycle multiple times, the residual offset remains relatively constant, fluctuating within a range of around ±5 counts (±4.02 mV).

    One possible solution might be to hard-code an additional offset of 12 counts after calibration, which should reduce the offset significantly. I will have to verify the stability of this solution over varying temperature and input voltage, but it seems promising so far. The residual offset could also vary between batches of chips, but currently I have no way to test that, as all my chips are from the same batch. Are there any other potential problems you see with this approach?
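
    In code, the correction I have in mind would look roughly like this (a sketch; the 12-count constant is the empirical value from the tables above and still needs validation over temperature and supply voltage):

    #include <stdint.h>

    #define ADC_VREF         3.3f      // ADC reference voltage [V]
    #define ADC_FULL_SCALE   4096.0f   // 12-bit ADC
    #define CSA_GAIN         40.0f     // configured CSA gain [V/V]
    #define RDSON            0.0017f   // BSZ0901NS Rds(on) at 25 degC [ohm]
    #define RESIDUAL_COUNTS  12        // empirically measured shift after leaving calibration mode

    // cal_offset_counts is the per-channel average stored during calibration.
    static float phase_current(uint16_t adc_raw, uint16_t cal_offset_counts)
    {
        int32_t counts = (int32_t)adc_raw - (int32_t)cal_offset_counts + RESIDUAL_COUNTS;
        float v_out    = (float)counts * (ADC_VREF / ADC_FULL_SCALE);  // offset-corrected SOx voltage [V]
        return v_out / CSA_GAIN / RDSON;   // phase current [A]; sign depends on the CSA polarity
    }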

    Regards,

    Andrew

  • Hi Andrew, 

    Thank you for the response.  

    This does look like a promising approach, as an offset of ±4 mV to 5 mV is what we would typically expect at the upper bound.

    As you have already mentioned, the potential cautions with this approach are that batch-to-batch variation (while small) is possible and, more importantly, that temperature variation can more easily affect the calibration and shift the operating point. Overall, though, I believe this to be the best approach moving forward and am not aware of many other drawbacks.

    I look forward to hearing about your further testing and results and hope you are able to move forward quickly.

    Best Regards,

    -Joshua