
ADS1219: How to calibrate ADS1219?

Part Number: ADS1219
Other Parts Discussed in Thread: OPA4277

I'm using an ADS1219 in a project. I've chosen to use the internal 2.048 V reference, and I'm feeding it a voltage between -0.25 V and 2.0 V (the -0.25 V limit comes from a germanium diode protecting the input from going below -0.3 V).

When I feed the device approximately 0 V (typically around 0.005 V), I get -229 from the ADS1219 (upper 16 bits, i.e. the 24-bit result shifted down 8 bits). When I feed it 2.000 V, I get around 29150; I'm expecting about 32000 for a 2 V input. According to the datasheet, there is a calibration option that connects the AINP and AINN inputs to AVDD/2; however, I suspect this is useless since I'm using the internal reference. Nevertheless, if I read the device 8 times in that mode and average the values (all of them seem to be zero), I get a zero offset to subtract from the result, so that is not helping me at all.
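For reference, the averaging step I'm describing can be sketched like this (plain C; the conversion results are passed in as an array, since the actual I2C read depends on the rest of my driver):

```c
#include <stdint.h>

/* Average several conversion results taken with the offset-calibration
 * mux setting (AINP and AINN shorted) to estimate the offset code.
 * A 64-bit accumulator avoids overflow for any realistic sample count. */
static int32_t average_offset_code(const int32_t *codes, int n)
{
    int64_t sum = 0;
    for (int i = 0; i < n; i++)
        sum += codes[i];
    return (int32_t)(sum / n);
}
```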

I am using a voltage follower (an OPA4277 op amp) to drive the ADC input, and I measure 1.999 to 2.000 V (to four digits) at the input. Any suggestions on why I'm getting 291xx rather than 32000?

  • BTW: 20 SPS, continuous-conversion mode, internal reference, and gain = 1.

  • Hi Randal,

    According to the spec sheet, there is a calibration option that connects the AinP and AinN inputs to AVDD/2; however, I suspect this is useless as I'm using the internal reference. Nevertheless, if I read the device 8 times and average the values I get (all of them seem to be zero), I get a zero offset to subtract from the result. So that is not helping me at all.

    This is the correct method for finding the input voltage offset of the device. In your case it is very close to zero, which is expected: the offset of these devices should be no more than 4 µV, which is significantly smaller than your LSB when only the upper 16 bits are used (for 16 bits with Vref = 2.048 V, 1 LSB = 62.5 µV). So it makes sense that your offset readings are all zeros.
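    As a quick check of that LSB figure (the full-scale input range is ±Vref at gain = 1, spread over 2^16 codes when only the upper 16 bits are kept):

```c
/* LSB weight of the upper 16 bits of the 24-bit result at gain = 1:
 * the bipolar input range spans 2*Vref, divided across 2^16 codes. */
static double lsb_upper16(double vref)
{
    return (2.0 * vref) / 65536.0;   /* 2 * 2.048 / 65536 = 62.5 uV */
}
```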

    You can learn more about the details of offset and gain error calibration for ADC systems from our TI Precision Labs video series: Precision labs series: Analog-to-digital converters (ADCs) | TI.com

    However, I think there is something else going on with your setup. 

    (upper 16 bits, shifted down 8 bits) from the ADS1219. When I feed it 2.000V, I get around 29150. I'm really expecting 32000 for the 2V input.

    Your output code of 29150 (with 1 LSB = 62.5 µV, since you are only using the upper 16 bits) corresponds to a voltage reading of 1.821875 V. That is far too large an error for the device's offset and gain error specifications to account for with a 2 V input, so there is most likely some other issue.

    Do you have a schematic you could share of this setup?

    I am using a voltage follower to supply the ADC input (using an OPA4277 opamp) and I measure 1.999 to 2.000 (to four digits) in the input.

    Is this 1.999 V to 2.000 V measured at the input of your OPA4277, or at the input of the ADC? Make sure you verify the voltage at the input pin of the ADC itself: something may be going on in the buffer circuit such that the output of the OPA4277, which is what the ADC actually measures, is not exactly equal to the 2 V input voltage.

    Also make sure your digital communication with the ADC is working properly, and that the output code readings correspond to what is actually seen on the I2C bus.
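    One detail worth double-checking while validating the readings: the ADS1219 returns a 24-bit two's-complement result, MSB first, so the three data bytes need to be sign-extended before being shifted down to 16 bits. A sketch of that assembly (an easy place for a bug that would corrupt codes near 0 V):

```c
#include <stdint.h>

/* Assemble the three result bytes (MSB first) into a sign-extended
 * value, then keep the upper 16 bits as in the post above.
 * Assumes >> on a negative int32_t is an arithmetic shift, which
 * holds on common compilers. */
static int32_t ads1219_code(uint8_t msb, uint8_t mid, uint8_t lsb)
{
    int32_t raw = ((int32_t)msb << 16) | ((int32_t)mid << 8) | lsb;
    if (raw & 0x800000)      /* 24-bit sign bit set? */
        raw -= 1 << 24;      /* sign-extend to 32 bits */
    return raw >> 8;         /* upper 16 bits */
}
```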

    Best Regards,

    Angel