Other Parts Discussed in Thread: TPS22810
We’re testing the ADS1220 circuit with a 100 ohm test resistor, which simulates a platinum RTD probe. In our circuit, this corresponds to an indicated temperature of 0 °C.
With the test resistor outside the test chamber (to keep its value constant), we vary the ambient temperature of the ADS1220 circuit, with the following results:
When the ambient temperature is between 22 °C and 40 °C, the indicated temperature remains 0 °C, as expected.
However, at −30 °C ambient, the indication is −0.3 °C instead of 0 °C.
We’re using the ADS1220 in the 4-wire RTD circuit configuration described on page 56 of the datasheet (ADS1220, SBAS501C): Rref = 500 Ω, ±0.2 ppm/°C. In the test we took care to emulate the probe we use in the field (proper shielding, etc.). Our circuit also includes the TPS22810 load switch, to ensure the supply ramp rate at power-on meets the ADS1220 requirements.
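For context, here is a rough estimate of the drift our observation implies (a sketch, assuming the standard Pt100 sensitivity of about 0.385 Ω/°C near 0 °C from IEC 60751; the numbers are from our test, not from the datasheet):

```python
# Back-of-envelope: what apparent resistance shift explains the -0.3 degC
# indication at -30 degC ambient, and what temperature coefficient does
# that correspond to over the ambient span we tested?
PT100_SENS = 0.385   # ohm/degC, assumed Pt100 sensitivity near 0 degC
R_TEST = 100.0       # ohm, fixed test resistor (indicates 0 degC)
T_ERR = -0.3         # degC, observed indication error at -30 degC ambient

delta_r = T_ERR * PT100_SENS              # apparent resistance shift, ohm
ppm_of_reading = delta_r / R_TEST * 1e6   # shift relative to 100 ohm

ambient_span = 22 - (-30)                 # degC, change in circuit ambient
drift_ppm_per_degC = ppm_of_reading / ambient_span

print(f"apparent shift: {delta_r:.4f} ohm ({ppm_of_reading:.0f} ppm)")
print(f"equivalent drift: {drift_ppm_per_degC:.1f} ppm/degC")
```

So the error corresponds to roughly −0.12 Ω, i.e. on the order of 20 ppm/°C over the ambient span — about two orders of magnitude more than the ±0.2 ppm/°C reference resistor alone could explain, which is why we suspect a setting or another part of the signal chain.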
What settings should we examine to eliminate the error vs ambient temperature?
Thanks
Viktorija