I’m currently using the ADS122C04 to ratiometrically measure two 2-wire RTD inputs. My circuit is below:
Rref is a 0.1% resistor. I looked at the voltage across my “test” RTDs (resistors) with an oscilloscope and it does look a bit noisy (maybe I should lower the cutoff frequency of my filter?). I would have expected that at the highest sample rate the final ADC output would at least be closer to the “ideal” value, but it was exactly the same as at 20 SPS.

For example, with a 100 ohm test resistor, I measure 99.2 mV across it with a multimeter. My output code is 0x1D1213 (1905171 in decimal). Reverse-calculating the voltage drop across the resistor from that code gives 93.7 mV, which in temperature land is -16 deg C as opposed to the 0 deg C it is supposed to be --> BIG ERROR. I also tried increasing the gain to see if that would reduce the error, but no luck. I’m not sure whether my calculation is incorrect, some register setting is incorrect, or the noise is affecting it. The way I calculate it is:
VDIFF_IN = [(2 * Vref / Gain) / 2^24] * code (in decimal)
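To rule out a plain arithmetic mistake, here is that calculation written out as a small C sketch. The Rref value is only a placeholder (my actual reference resistor value isn't shown here), and the sign extension assumes the 24-bit two's-complement output format:

#include <stdint.h>
#include <stdio.h>

/* Sign-extend the 24-bit two's-complement conversion result to 32 bits. */
static int32_t sign_extend_24(uint32_t raw_code)
{
    int32_t code = (int32_t)(raw_code & 0x00FFFFFFu);
    if (code & 0x00800000) {
        code -= 0x01000000;   /* negative 24-bit value */
    }
    return code;
}

int main(void)
{
    /* Values from my setup; rref_ohms is a placeholder, not my actual resistor. */
    const double gain      = 4.0;
    const double idac_amps = 1.0e-3;   /* IDAC = 1000 uA   */
    const double rref_ohms = 1000.0;   /* placeholder Rref */
    const double vref      = idac_amps * rref_ohms;

    uint32_t raw  = 0x1D1213;          /* conversion result I read back */
    int32_t  code = sign_extend_24(raw);

    /* 1 LSB = (2 * Vref / Gain) / 2^24, so: */
    double vin = (double)code * (2.0 * vref / gain) / 16777216.0;

    /* Ratiometrically the IDAC value cancels, so the RTD resistance depends
     * only on the code, the gain, and Rref:
     *   R_rtd = code * 2 * Rref / (Gain * 2^24)                             */
    double r_rtd = (double)code * 2.0 * rref_ohms / (gain * 16777216.0);

    printf("Vin = %.4f mV, R_rtd = %.3f ohm\n", vin * 1e3, r_rtd);
    return 0;
}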
I have set the registers as follows:
Config Reg 0: 0x04 (AINP = AIN0, AINN = AIN1, Gain = 4, PGA Enabled)
Config Reg 1: 0x02 (20SPS, Normal Mode, Single-shot, External Ref, Temp mode disabled)
Config Reg 2: 0x06 (Conversion counter disabled, Data Integrity (CRC) disabled, Burn-out current sources off, IDAC = 1000 uA)
Config Reg 3: 0xA0 (IDAC1 → REFP, IDAC2 disabled)
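In case it helps to see the exact bytes, here is how I'm writing those values, condensed into a small C sketch. i2c_write() is just a stand-in for my MCU's I2C driver, the device address is only an example (it depends on the A0/A1 strapping), and the WREG command byte is my reading of the datasheet (0100 rrxx); the register values themselves are exactly the ones listed above.

#include <stdint.h>

/* Configuration values exactly as listed above, with the bit fields spelled out. */
#define ADS122C04_REG0   0x04u   /* MUX: AINP=AIN0, AINN=AIN1; Gain=4; PGA enabled        */
#define ADS122C04_REG1   0x02u   /* 20 SPS, normal mode, single-shot, external reference  */
#define ADS122C04_REG2   0x06u   /* DCNT off, CRC off, burn-out sources off, IDAC=1000 uA */
#define ADS122C04_REG3   0xA0u   /* IDAC1 -> REFP, IDAC2 disabled                         */

#define ADS122C04_CMD_WREG  0x40u   /* WREG command byte: 0100 rrxx */

/* Placeholder for the platform I2C driver: write 'len' bytes to 'dev_addr'.
 * Replace with the actual bus write call; returns 0 on success here.        */
static int i2c_write(uint8_t dev_addr, const uint8_t *buf, uint8_t len)
{
    (void)dev_addr; (void)buf; (void)len;
    return 0;
}

static int ads122c04_write_reg(uint8_t dev_addr, uint8_t reg, uint8_t value)
{
    uint8_t buf[2] = { (uint8_t)(ADS122C04_CMD_WREG | (uint8_t)(reg << 2)), value };
    return i2c_write(dev_addr, buf, 2);
}

/* Write all four configuration registers in order (reg 0x00 .. 0x03). */
static int ads122c04_configure(uint8_t dev_addr)
{
    const uint8_t cfg[4] = { ADS122C04_REG0, ADS122C04_REG1,
                             ADS122C04_REG2, ADS122C04_REG3 };
    for (uint8_t reg = 0; reg < 4; reg++) {
        if (ads122c04_write_reg(dev_addr, reg, cfg[reg]) != 0) {
            return -1;   /* propagate I2C error */
        }
    }
    return 0;
}

int main(void)
{
    return ads122c04_configure(0x40);   /* example 7-bit address; depends on A0/A1 pins */
}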
I have read back the registers and the values returned are correct, so I’m sure the I2C interface is working correctly. Any help you could provide would be appreciated, and if I can answer any questions to make my setup clearer, please let me know.