Hi,
I have an application using the ADS1217 that measures +/- 1.0 V, using a single read command per conversion at 5 samples/sec.
The input circuit provides a 1.25 V offset, so the voltage swing at the ADC is 0.25 V to 2.25 V. PGA gain is 1, the reference is 1.25 V, and the decimation register is 640 decimal.
It runs well, except that it occasionally produces a +/- 65,000-count (bit 16) error. The error seems to occur only at specific input values.
With 0 V input (around 1.25 V at the ADC), the ADC output is 0x7C0xx (the bottom bits are noisy). When the error hits, I get samples of 0x7B0xx or 0x7D0xx.
If I raise the input by 1 mV the error goes away. If I raise the input by 8 mV (which should give 0x7D0xx) the error comes back. The 65K-count error is equivalent to about 8 mV at the input.
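Until I find the root cause, I'm masking the bad samples in software. This is just a rough sketch of what I mean (the function name, window size, and threshold here are my own placeholders, not anything from TI's code): since a bad sample jumps by about 0x10000 counts while normal noise stays in the bottom bits, comparing each sample to the median of its neighbors catches the glitches.

```python
from statistics import median

# Half of the observed ~0x10000 count jump; well above the LSB noise.
GLITCH_THRESHOLD = 0x8000

def filter_glitches(samples, window=5, threshold=GLITCH_THRESHOLD):
    """Flag and replace ADC codes that jump far from the local median.

    `samples` is a list of raw ADC output codes (integers). Returns the
    cleaned list and the indices that were flagged as glitches.
    """
    cleaned = list(samples)
    flagged = []
    for i, s in enumerate(samples):
        lo = max(0, i - window // 2)
        hi = min(len(samples), i + window // 2 + 1)
        # Median of the neighboring samples, excluding the sample itself.
        m = median(samples[lo:i] + samples[i + 1:hi])
        if abs(s - m) > threshold:
            flagged.append(i)
            cleaned[i] = int(m)  # substitute the local median
    return cleaned, flagged
```

At 5 samples/sec a 5-sample window adds about half a second of effective lag, which is acceptable for my application, but it obviously hides the symptom rather than fixing it.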
Thanks
Jim