
[FAQ] ADS131B26-Q1: How do I calculate the achievable accuracy and resolution for my BMS (battery management system) current shunt measurement?

Part Number: ADS131B26-Q1

Hello,

I am developing a battery management system (BMS) using the battery pack monitor, ADS131B26-Q1.
I need to measure the battery pack current using a shunt resistor with very high accuracy and resolution.

My requirements are the following:

  • Shunt value: 50μOhm
  • Update rate: 1ms
  • Resolution: 1mA
  • Noise: 20mARMS
  • Measurement range 0A to 30A: Offset error ≤30mA
  • Measurement range 30A to 1000A: Offset error ≤30mA, relative error ≤0.1% (excluding the error of the shunt)
  • Temperature range: -20°C to +60°C

Could you please help me to analyze if the ADS131B26-Q1 can meet those requirements?

Regards,
Joachim Wuerker

  • Hello,

    All you need for a high-level assessment of whether the ADS131B26-Q1 can meet your current shunt measurement accuracy and resolution requirements are the following parameters from the device datasheet for ADC1y (y = A or B):

    • Offset error, offset drift
    • Gain error, gain drift (including error and drift of the internal voltage reference)
    • Input-referred noise

    First you need to determine which gain to use for ADC1y.
    ADC1y offers a full-scale range FSR = ±VREFy / GAIN1y.
    The ADC1y input will see the largest differential input signal (VADC1y_IN) at the maximum current through the shunt.
    VADC1y_IN (max) = 1000A × 50μOhm = 50mV
    That means the maximum gain that can be used in this application is GAIN1y = 16, which yields an FSR = ±1.25V / 16 = ±78mV.
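
    As a quick cross-check, here is a minimal Python sketch of this gain selection. The list of selectable gain settings is my assumption for illustration; check the available GAIN1y options in the datasheet:

    ```python
    # Sketch only (not TI reference code): pick the largest ADC1y gain whose
    # full-scale range still covers the maximum shunt voltage.
    V_REF = 1.25                     # V, internal reference voltage
    GAIN_OPTIONS = [1, 2, 4, 8, 16]  # assumed selectable GAIN1y settings
    R_SHUNT = 50e-6                  # Ohm, shunt resistor
    I_MAX = 1000.0                   # A, maximum pack current

    v_in_max = I_MAX * R_SHUNT       # largest differential input: 50 mV
    gain = max(g for g in GAIN_OPTIONS if V_REF / g >= v_in_max)
    fsr = V_REF / gain               # one-sided full-scale range (FSR = ±fsr)
    print(f"VADC1y_IN(max) = {v_in_max * 1e3:.0f} mV")
    print(f"GAIN1y = {gain}, FSR = ±{fsr * 1e3:.1f} mV")
    ```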

    Next you can calculate the so-called LSB (least significant bit) size.
    The LSB size tells you how much the input signal needs to change in order to change the ADC output by one code.
    LSB = (2 × VREFy) / (GAIN1y × 2^24) = (2 × 1.25V) / (16 × 2^24) = 9.3nV
    Dividing the LSB size by the shunt value translates that into how much the shunt current needs to change to move the ADC output by one code: 9.3nV / 50μOhm = 0.19mA
    That means every code of ADC1y represents a change of current through the shunt of 0.19mA. This meets your requirement of 1mA resolution.
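
    The same LSB arithmetic as a short sketch, using the values from above:

    ```python
    # Sketch only: LSB size of the 24-bit ADC1y and its current-domain equivalent.
    V_REF = 1.25      # V, internal reference voltage
    GAIN = 16
    R_SHUNT = 50e-6   # Ohm

    lsb_v = (2 * V_REF) / (GAIN * 2**24)  # -> 9.3 nV per code
    lsb_i = lsb_v / R_SHUNT               # -> 0.19 mA per code
    print(f"LSB = {lsb_v * 1e9:.1f} nV = {lsb_i * 1e3:.2f} mA per code")
    ```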

    The LSB size is the smallest signal the ADC could theoretically resolve. However, the actual measurement performance is limited by the ADC noise.
    The input-referred noise of ADC1y in the noise table of the datasheet for gain = 16 and a data rate of 1kSPS is 0.65μVRMS, which is quite a bit larger than the LSB size of 9.3nV.
    That means that, on average, the output of ADC1y varies by 0.65μVRMS / 9.3nV ≈ 70 codes when a noise-free DC input signal is applied.
    You can again translate the 0.65μVRMS into the current measurement domain by dividing the input-referred noise by the shunt value: 0.65μVRMS / 50μOhm = 13mARMS.
    The input-referred noise of 13mARMS meets your noise requirement of 20mARMS.
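
    The corresponding noise arithmetic as a sketch (the 0.65μVRMS figure is the datasheet noise table value quoted above):

    ```python
    # Sketch only: translate the input-referred noise into output codes and
    # into the shunt-current domain.
    NOISE_V_RMS = 0.65e-6  # V RMS, gain = 16 at 1 kSPS (datasheet noise table)
    LSB_V = 9.3e-9         # V per code, from the LSB calculation above
    R_SHUNT = 50e-6        # Ohm

    noise_codes = NOISE_V_RMS / LSB_V  # -> ~70 codes RMS
    noise_i = NOISE_V_RMS / R_SHUNT    # -> 13 mA RMS
    print(f"noise ≈ {noise_codes:.0f} codes RMS = {noise_i * 1e3:.0f} mA RMS")
    ```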

    Next, let's look at your offset error requirements.
    In order to achieve an offset error of 30mA, the ADC needs to achieve an input-referred offset error of 30mA × 50μOhm = 1.5μV.
    This is a very demanding requirement. Fortunately, the ADS131B26-Q1 is capable of achieving such a low offset error.
    In principle, there are two ways to achieve such a low offset error with the ADS131B26-Q1:

    • Implement periodic self-offset calibration of ADC1y
    • Leverage the global-chop mode of ADC1y

    I am using global-chop mode in this discussion because it requires no special interaction between the host and the device and does not interrupt the current measurement.
    With global-chop mode enabled, ADC1y specifies an offset error across temperature of 1.5μV (max), which exactly matches your requirement.
    One downside of global-chop mode is that it reduces the effective output data rate by a factor of ~3, as described in the datasheet. That means, in order to still achieve an update rate of 1ms, you need to reduce the OSR1y setting of ADC1y from 4096 to 1024.
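
    The offset and timing arithmetic as a sketch. Note that the 1kSPS nominal data rate at OSR1y = 4096 is my assumption for illustration, and the factor-of-~3 reduction is the approximate figure quoted above; verify the exact global-chop timing against the datasheet equations:

    ```python
    # Sketch only: offset error in the current domain, plus a first-order check
    # of the update rate with global-chop mode enabled.
    V_OFFSET_MAX = 1.5e-6  # V, max offset across temperature with global-chop
    R_SHUNT = 50e-6        # Ohm
    print(f"offset error ≤ {V_OFFSET_MAX / R_SHUNT * 1e3:.0f} mA")  # 30 mA

    DR_NOMINAL = 1000.0  # SPS at OSR1y = 4096 (assumed for illustration)
    dr_osr1024 = DR_NOMINAL * (4096 / 1024)  # data rate scales inversely with OSR
    dr_chop = dr_osr1024 / 3                 # ~3x reduction from global-chop
    print(f"update period ≈ {1e3 / dr_chop:.2f} ms")  # ~0.75 ms, meets 1 ms
    ```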

    Lastly, you need to check if the ADS131B26-Q1 can meet the relative error requirement of 0.1% (excluding the error introduced by the shunt resistor).
    In most BMS applications, a gain error calibration is performed at least at room temperature at the end of the production line. I am therefore assuming that the initial gain error of the ADC is calibrated to 0%, so only the gain drift of the ADC needs to be considered.
    The gain drift specification of 20ppm/°C (max) in the ADS131B26-Q1 datasheet includes the gain drift of the complete signal chain integrated in the device (i.e. the gain stage, ADC, and voltage reference).
    A gain drift of 20ppm/°C introduces a gain error over your specified temperature range. The worst-case excursion from the 25°C calibration temperature is to -20°C, that is 45°C:
    GE = (25°C - (-20°C)) × 20ppm/°C = 45°C × 20ppm/°C = 900ppm = 0.09%
    This stays within your relative error budget of 0.1%.
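
    The same drift calculation as a sketch, assuming the gain error is calibrated to 0% at 25°C:

    ```python
    # Sketch only: worst-case gain error from drift over the temperature range.
    T_CAL = 25.0                # °C, calibration temperature
    T_MIN, T_MAX = -20.0, 60.0  # °C, specified operating range
    GAIN_DRIFT = 20e-6          # 20 ppm/°C (datasheet max)

    dT = max(T_CAL - T_MIN, T_MAX - T_CAL)  # worst-case excursion: 45 °C
    ge = dT * GAIN_DRIFT                    # -> 900 ppm
    print(f"gain error ≤ {ge * 100:.2f} %")  # 0.09 %, within the 0.1 % budget
    ```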

    Overall, it seems that the ADS131B26-Q1 meets your demanding requirements.

    Regards,
    Joachim Wuerker