Other Parts Discussed in Thread: ADS1222, ADS1220
I've given up on this chip, but it's worth reporting the issue for others and for TI to consider.
The ADS1222 is a 24-bit device, but it has relatively high noise and high offset (as confirmed by the specification sheet).
With software averaging the noise can be eliminated, and the offset can be calibrated out by the microcontroller; the noise required about 30 samples of averaging.
With this approach I found the ADS1222 to be very stable over time and temperature. I had a nice result and was very happy with the circuit.
But ... every time I turned the device on, the offset changed by a random amount. The random shift was bounded by a range of about +/-50 LSB, which is too high for my application. Once the new offset was established, the device was again very stable.
After a long investigation (always assuming it was my fault), I finally found that the device's self-calibration function was assigning a new, random offset at power-on. This could be confirmed by requesting a self-calibration outside of power-on, which caused a random offset shift similar to the one found at power-on.
It appears from the timing diagrams that the device takes only one, or at most two, measurements for self-calibration. One or two samples are not enough to overcome the high internal noise, so the calibration itself produces a random offset.
While I understand this is a lower-performance device compared to others in the ADS12xx 24-bit series, there is nothing in the specification sheet to make you aware of this potential issue. Countless hours were wasted tracking this down: multiple ICs, changes in PCB layout, shielding, capacitors, with and without filtering, increasingly higher-quality components, changes in sampling rates and other timing, and monitoring and even controlling the temperature of the circuit. Through all of these changes the problem remained the same - random shifts in a range of +/-50 LSB at power-on, and stability after that.
Since the device automatically self-calibrates at power-on, it is impossible to eliminate the random shift, making the device effectively a 19-bit ADC. To eliminate the problem I would need to implement my own external calibration, for example using a high-quality relay to short the inputs.
In the end I gave up and used a front-end differential amplifier with 50x gain to overcome this limitation.
My question to TI is: why is this issue not raised in the specification sheet?
And what is the point of 24-bit resolution if the self-calibration can only deliver 19 bits?
Note: I am aware there are higher-spec devices in the range. That is not the issue. The issue is why offer a 24-bit part with only 19 useful bits, and, worse, provide no warning in the specification sheet?