I am using the ADS124S08 for a 4-wire RTD temperature measurement. The value I read back after each conversion is consistently high by ~1% of the resistance under test: 50 ohms reads as ~50.5 ohms and 150 ohms reads as ~151.5 ohms. Because the error scales with the resistance, it looks like a gain error rather than an offset. The test setup uses calibrated RTD simulators that we have independently verified to be accurate.
The setup drives the excitation current out of AIN0 and measures the RTD across AIN1 and AIN2. A 180 ohm, 0.1% reference resistor sits in series with the measured resistance and is sensed through the REFP0/REFN0 reference inputs, making the measurement ratiometric. This can be seen in the schematics below.
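For clarity, here is the ratiometric relationship I am relying on, assuming PGA gain G = 1 (I never write the PGA register, so it should be at its default of gain 1 with the PGA bypassed):

$$
\text{code} \approx 2^{23}\,\frac{G\,V_{\text{RTD}}}{V_{\text{REF}}} = 2^{23}\,\frac{G\,I_{\text{IDAC}}\,R_{\text{RTD}}}{I_{\text{IDAC}}\,R_{\text{REF}}} \quad\Longrightarrow\quad R_{\text{RTD}} = \frac{\text{code}}{2^{23}}\cdot\frac{R_{\text{REF}}}{G}
$$

With G = 1 and R_REF = 180 ohms this is how I convert codes to ohms, and it is why I would expect any inaccuracy in the 750 uA IDAC itself to cancel: the current appears in both the numerator and the denominator.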
My initialization of this device consists of the following commands and register writes (a C sketch of the same sequence follows the list):
- Command RESET
- Check for Device Ready Flag in the STATUS reg
- IDACMAG-> 06h (sets IDAC magnitude to 750uA)
- REF-> 22h (reference monitor disabled, positive reference buffer bypassed, negative reference buffer enabled, reference input selection REFP0/REFN0, internal reference always on)
- IDACMUX-> 0fh (IDAC2 output - AIN0, IDAC1 output - disconnected)
- INPMUX-> 12h (positive ADC input - AIN1, negative ADC input - AIN2)
- Command START
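For completeness, a minimal C sketch of that sequence; spi_xfer() and delay_ms() are hypothetical stand-ins for my platform's SPI driver and timing routines, while the opcodes and register addresses come from the ADS124S08 datasheet:

```c
#include <stddef.h>
#include <stdint.h>

/* ADS124S08 SPI opcodes and register addresses (per the datasheet) */
#define CMD_RESET    0x06
#define CMD_START    0x08
#define CMD_RREG     0x20   /* OR with register address */
#define CMD_WREG     0x40   /* OR with register address */

#define REG_STATUS   0x01
#define REG_INPMUX   0x02
#define REG_REF      0x05
#define REG_IDACMAG  0x06
#define REG_IDACMUX  0x07

#define STATUS_NRDY  0x40   /* STATUS bit 6: 0 = device ready */

/* Hypothetical platform glue: full-duplex SPI transfer (rx may be NULL)
 * and a millisecond delay. Substitute your own driver calls. */
extern void spi_xfer(const uint8_t *tx, uint8_t *rx, size_t len);
extern void delay_ms(unsigned ms);

static void ads_write_reg(uint8_t addr, uint8_t value)
{
    /* WREG frame: opcode, (number of registers - 1), data */
    uint8_t tx[3] = { (uint8_t)(CMD_WREG | addr), 0x00, value };
    spi_xfer(tx, NULL, sizeof tx);
}

static uint8_t ads_read_reg(uint8_t addr)
{
    /* RREG frame: opcode, (number of registers - 1), dummy byte */
    uint8_t tx[3] = { (uint8_t)(CMD_RREG | addr), 0x00, 0x00 };
    uint8_t rx[3] = { 0 };
    spi_xfer(tx, rx, sizeof tx);
    return rx[2];               /* register value arrives on the third byte */
}

void ads124s08_init(void)
{
    uint8_t cmd = CMD_RESET;
    spi_xfer(&cmd, NULL, 1);
    delay_ms(5);                        /* allow the reset to complete */

    /* Wait for the device-ready flag (nRDY = 0) in STATUS */
    while (ads_read_reg(REG_STATUS) & STATUS_NRDY) { }

    ads_write_reg(REG_IDACMAG, 0x06);   /* IDAC magnitude = 750 uA */
    ads_write_reg(REG_REF,     0x22);   /* REFP0/REFN0, internal ref always on */
    ads_write_reg(REG_IDACMUX, 0x0F);   /* IDAC2 -> AIN0, IDAC1 disconnected */
    ads_write_reg(REG_INPMUX,  0x12);   /* AINP = AIN1, AINN = AIN2 */

    cmd = CMD_START;
    spi_xfer(&cmd, NULL, 1);            /* start conversions */
}
```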
What I need help with is eliminating this ~1% error from my readings.
Thank you for any feedback,
Eddie
Idealized schematic:
Simplified actual schematic:
(Please excuse this crudely Paint-edited schematic!)
Register configurations: