
ADS1222 Self calibration random offset

Other Parts Discussed in Thread: ADS1222, ADS1220

I've given up on this chip, but the issue is worth reporting for others and for TI to consider.

The ADS1222 is a 24-bit device, but it has relatively high noise and high offset (as confirmed by the specification sheet).

With software averaging the noise can be averaged out, and the offset can be calibrated out by the microcontroller. About 30 samples were needed for the averaging.
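The approach above can be sketched as follows. This is a simulation, not driver code: `read_adc` is a hypothetical stand-in for the real ADS1222 read routine, and the offset and noise figures are illustrative, not datasheet values.

```python
import random

random.seed(0)
OFFSET_LSB = 40.0  # hypothetical internal offset, in LSB
NOISE_LSB = 20.0   # hypothetical RMS noise, in LSB

def read_adc(signal=0.0):
    # Simulated ADS1222 sample: signal plus offset plus Gaussian noise
    return signal + OFFSET_LSB + random.gauss(0.0, NOISE_LSB)

def averaged(signal=0.0, n=30):
    # ~30 samples were enough in practice to average the noise down
    return sum(read_adc(signal) for _ in range(n)) / n

# One-time offset calibration: average many readings with a known zero
# input (e.g. inputs shorted), then store the result
offset = averaged(signal=0.0, n=1000)

# Later measurements subtract the stored offset
corrected = averaged(signal=500.0, n=30) - offset
```

The key point is that the offset estimate itself is built from many averaged samples, so it is not dominated by the device's noise.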

With this approach I found the ADS1222 to be very stable over time and temperature. I had a nice result and was very happy with the circuit.

But ... every time I turned on the device, the offset changed by a random amount. The random shift was bounded by a range of about +/-50 LSB, which is too high for my application. Once the new offset was established, the device was again very stable.

After long investigation (always assuming it was my fault) I finally found that the device's self calibration function was assigning a new, random offset at power on. This could be confirmed by requesting a self-calibration outside of power on, which caused a random offset shift similar to that found at power on.

It appears from the timing diagrams that the device only takes one or at most two measurements for self-calibration. These 1 or 2 samples are not enough to overcome the high internal noise, creating a random offset. 
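The argument above can be made quantitative: an offset estimate built by averaging n samples has a standard error of sigma/sqrt(n), so a 1- or 2-sample calibration inherits nearly the full input noise. A quick check, using an assumed noise figure (illustrative, not from the datasheet):

```python
import math

sigma = 20.0  # assumed RMS input noise in LSB (illustrative value)

def offset_uncertainty(n):
    # Standard error of an offset estimate averaged over n samples
    return sigma / math.sqrt(n)

for n in (1, 2, 16, 30):
    print(n, round(offset_uncertainty(n), 2))
```

With only 1 or 2 samples the calibration error is essentially the raw noise; even 16 or 30 samples only divide it by 4-5x.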

While I understand this is a lower performance device compared to others in the ADS 12xx 24-bit series, there is nothing in the specification sheet to make you aware of this potential issue. Countless hours were wasted tracking this down, including multiple ICs, changes in PCB layouts, shielding, capacitors, with filtering, without filtering, increasingly higher quality components, changes in sampling rates and other timing, monitoring and even controlling the temperature of the circuit. With all of these changes, the problem remained the same - random shifts in a range of +/-50 LSB at power on, and stable after that.

Since the device automatically self-calibrates at power on, it is impossible to eliminate the random shift, making the device effectively a 19-bit ADC. To eliminate the problem, I would need to create my own external self-calibration, for example a high quality relay to short the inputs.

In the end I gave up and used a front end differential amplifier with 50x gain to overcome this limitation. 

My question to TI is - why is this issue not raised in the specification sheet?

And what is the point of 24 bit resolution if the self calibration can only handle 19 bits?

Note: I am aware there are higher spec devices in the range. That is not the issue. The issue is why offer a 24-bit device with only 19 useful bits, and worse, provide no warning in the specification sheet?

  • Peter,


    I'm sorry that this device didn't work out for you. We do try to point out the performance of the device in the datasheet, because it doesn't help anyone to misrepresent parts. People that become frustrated with parts or support don't tend to become repeat customers.

    That being said, the front page of the datasheet states that this device has a 20-bit effective resolution. We also give multiple curves to show the offset and noise of the device, along with listings for the min and max in the electrical characteristics table (as you've noted in your post).

    The self-calibration done on start-up (and by command) does help many. If your offset varies about ±50 codes, this translates to about ±15uV of offset. I believe that the device averages 16 readings to determine the offset calibration (and 1 reading for the gain error calibration). It would have been better if the device could have skipped the power-on calibration for your application.
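The codes-to-microvolts conversion above can be sanity-checked. This assumes a 2.5 V reference and a ±Vref input range (a 5 V span), which may differ from the actual configuration:

```python
# Convert a 50-code shift to microvolts for a 24-bit converter.
# Assumed: 2.5 V reference, ±Vref full-scale range (5 V span).
VREF = 2.5
FSR = 2 * VREF             # full-scale range in volts
LSB = FSR / 2**24          # volts per code, ~0.3 uV
shift_uV = 50 * LSB * 1e6  # 50-code shift expressed in microvolts
print(round(shift_uV, 1))  # ~14.9 uV, consistent with the ~15 uV quoted
```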

    In our product line, we've listed data converters by the number of bits of output data. This is a 24-bit converter, because it puts out 24 bits. In general, no one searches specifically for a 19-bit or 20-bit converter. I would also note that the output data bits below the noise floor are still useful information; averaging will get better performance, provided that the reference performance is comparable. There are very few 24-bit converters that will give you a 24-bit noise floor.

    Generally, we'd like people to come to us early, as they are beginning the design. In that case, we're able to better help people put together a system within their specifications that suits their needs. If you look through the forum, we've helped quite a few people with technical problems (I would also note that this is a good place to look for similar problems).

    Again, I'm sorry that this device didn't work out for you. Historically, the ADS1222 has been a successful device for us. If you're open to looking at another TI device, I'd recommend looking at the ADS1220. It has better noise performance and an integrated PGA. This device does not have a specific offset calibration command, but it does have the option of internally shorting the inputs so that you can take a reading, store it, and subtract it from subsequent readings.
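The store-and-subtract scheme described for the ADS1220 can be sketched as a simulation. `read_shorted` and `read_input` are hypothetical stand-ins for driver calls that read with the inputs internally shorted versus connected to the signal; the offset and noise values are illustrative only.

```python
import random

random.seed(1)
OFFSET = 12.0  # hypothetical device offset, in codes

def read_shorted():
    # Reading with the inputs internally shorted: offset plus noise
    return OFFSET + random.gauss(0.0, 2.0)

def read_input(signal):
    # Reading of the real signal: signal plus offset plus noise
    return signal + OFFSET + random.gauss(0.0, 2.0)

def calibrate(n=64):
    # Average several shorted-input readings for a low-noise offset estimate
    return sum(read_shorted() for _ in range(n)) / n

stored_offset = calibrate()
measurement = read_input(1000.0) - stored_offset
```

Because the user controls how many shorted-input readings go into the stored offset, the calibration noise can be made as small as needed.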

    If you have any further questions, please feel free to post back to the forum.


    Joseph Wu
  • Hi Joseph,

    Thanks for the detailed answer.

    Some points to note:

    As you pointed out, the offset change is equivalent to 15uV.

    The key misleading specification, in my opinion, is the stated offset drift of 0.2μV/°C. This implies that the offset should be relatively stable, and my tests confirmed this to be correct. I would suggest that the TI specification requires some comment or note that while the offset may be stable, it may shift at power on and with self-calibration by up to 15μV.

    In my post I said it was 19 bits, but in fact an offset shift of 50 LSB represents a loss of between 5 and 6 bits of resolution, so the effective performance is between 18 and 19 bits. I said 19 bits to be on the generous side :).
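The bits-lost figure follows directly from the size of the shift: a 50-code uncertainty swallows log2(50) bits of the converter's range.

```python
import math

shift = 50                    # worst-case offset shift in LSB
bits_lost = math.log2(shift)  # ~5.6 bits swallowed by the shift
effective = 24 - bits_lost    # ~18.4 effective bits remaining
print(round(bits_lost, 2), round(effective, 2))
```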

    No one expects 24 bits from a 24-bit ADC, but 18-bit resolution (18.5 bits?) from a 24-bit ADC is somewhat of a stretch.

    If the device could have reached the headline spec of 20 bits, it would have been borderline OK for my application.

    Thanks anyway. 

    The main point is to give a heads up to future designers to watch out for the offset shift with the self-calibration function. I wasted 80 hours of design time chasing this down.