TLA2021: how to calculate voltage sensing accuracy

Part Number: TLA2021
Other Parts Discussed in Thread: TLA2024, ADS1015, ADS1115

Hi, Team

My customer is planning to use a low-cost ADC with I2C to sense battery voltage. Their requirements are:

1. Sensing input range: 3.0V ~ 5.0V

2. Output accuracy: +/-5mV over the full input range and full temperature range.

I think the TLA2021 should meet this requirement, but I don't know how to calculate the voltage sensing accuracy.

Could you help to share how to calculate it? Thank you.

  • Dong,


    If you want to measure a voltage from 3.0V to 5.0V, then you would use the ±6.144V full-scale range setting on the ADC.

    The error for the device is primarily determined by the offset and gain error. The typical offset error is 1 LSB, while the typical gain error is 0.05%.

    At a measurement of 0V, there is no gain error, so the typical error is just the 1 LSB offset. This is 3mV for this FSR setting. As the input signal gets larger, the gain error grows. At a measurement of 5V, the typical gain error of 0.05% contributes 2.5mV, and the 3mV offset error is still present as well. Combined, this is a total of 5.5mV of typical error.
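The arithmetic above can be sketched in a few lines of Python. The FSR and resolution come from the TLA2021 datasheet; the 1 LSB offset and 0.05% gain error are the typical values quoted in this thread, so verify them against the datasheet for your conditions.

```python
# Typical error budget for the TLA2021 at the +/-6.144 V FSR setting.
FSR = 6.144                # positive full-scale range, V
BITS = 12                  # TLA2021 resolution
LSB = 2 * FSR / 2**BITS    # code width = 12.288 V / 4096 = 3 mV

OFFSET_ERROR = 1 * LSB     # typical offset error: 1 LSB = 3 mV
GAIN_ERROR = 0.05 / 100    # typical gain error: 0.05%

def typical_error(vin):
    """Straight sum of typical offset and gain error at input vin (volts)."""
    return OFFSET_ERROR + GAIN_ERROR * vin

print(f"LSB = {LSB * 1e3:.1f} mV")                        # 3.0 mV
print(f"Error at 0 V: {typical_error(0.0) * 1e3:.1f} mV") # 3.0 mV
print(f"Error at 5 V: {typical_error(5.0) * 1e3:.1f} mV") # 5.5 mV
```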

    Note that this error is just the typical value, and you may be able to treat it as a standard deviation of the error measurement. The TLA2024 datasheet does not give a maximum error because it is a cost-optimized device. I would note that there are also non-linearity error and gain and offset drift, which will add to the error. These are usually minor contributors compared to the offset and gain error.
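If the typical offset and gain errors are treated as roughly 1-sigma values, as suggested above, and assumed independent, a root-sum-square combination gives a less pessimistic estimate than the straight sum. This is a sketch under those assumptions, reusing the same numbers from this thread; it ignores non-linearity and drift, which would add further (usually small) terms.

```python
import math

LSB = 2 * 6.144 / 2**12        # 3 mV code width at the +/-6.144 V FSR
offset_error = 1 * LSB         # 3 mV typical offset error
gain_error = 0.0005 * 5.0      # 2.5 mV typical gain error at a 5 V input

# Root-sum-square of independent 1-sigma error terms.
rss = math.sqrt(offset_error**2 + gain_error**2)
print(f"RSS error at 5 V: {rss * 1e3:.2f} mV")  # ~3.91 mV vs. 5.5 mV straight sum
```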

    If the customer needs a guaranteed limit on the expected error, the ADS1015 datasheet gives maximum values for the offset and gain error. However, the ADS1015 error is similar in magnitude. If smaller min/max limits are needed, they may need to go with the ADS1115.


    Joseph Wu