Dear TI Analog Experts,
For our new project we plan to use the ADS1262 ADC to digitize several analog channels, ranging from thermocouple inputs to small-signal vibration signals.
To evaluate the ADS1262, we designed a breakout test board.
During testing, the ADC was configured as follows (see the register sketch after the list):
- PGA = 1
- 20 SPS
- FIR filter
- Continuous mode
- Internal Vref
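For reference, our init sequence writes roughly the register values below over SPI. This is only a sketch: spi_write_reg() and ads1262_configure() are our own helpers, and the addresses/values reflect our reading of the ADS1262 datasheet, so please point out anything that looks wrong.

    // Sketch of our ADS1262 configuration. spi_write_reg(addr, value) is our
    // own helper that issues the WREG command for a single register.
    void ads1262_configure(void)
    {
        spi_write_reg(0x01, 0x01);  // POWER:   INTREF = 1 (internal 2.5 V reference on)
        spi_write_reg(0x03, 0x00);  // MODE0:   continuous conversion, no chop, no delay
        spi_write_reg(0x04, 0x80);  // MODE1:   digital filter = FIR
        spi_write_reg(0x05, 0x04);  // MODE2:   PGA gain = 1, data rate = 20 SPS
        spi_write_reg(0x06, 0x01);  // INPMUX:  MUXP = AIN0, MUXN = AIN1
        spi_write_reg(0x0F, 0x00);  // REFMUX:  internal 2.5 V reference for +/- inputs
        spi_write_reg(0x12, 0x03);  // GPIOCON: AIN3 (GPIO0) and AIN4 (GPIO1) used as GPIOs
    }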
During evaluation, we applied a differential analog signal with a 2.5 VDC common-mode offset to AIN0 and AIN1; AIN3 and AIN4 are configured as GPIOs.
In each trial we varied the differential input voltage from 100 mV to 1 V, and in every case the ADC readout was lower than the expected value by a constant ~70 mV.
This error is much larger than can be explained by the internal error sources of the ADC. We therefore carried out several additional measurements and eventually checked the internal reference voltage. The value measured at the REFOUT pin was ~2.595 V. After power-cycling the ADS1262, REFOUT measured 2.497 V. Interestingly, while we read 2.497 V at every first power-up, applying a hard reset (via the RESET#/PWDN# pin) to the ADS1262 makes the measured value jump to 2.595 V.
It is important to point out that we only ever measure these two values, either 2.595 V or 2.497 V, and never anything in between.
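For completeness, the code-to-voltage conversion we apply in firmware assumes the nominal 2.5 V internal reference; a sketch is below (read-out of the raw code is outside the sketch, and the function name is our own).

    #include <stdint.h>

    #define VREF_NOMINAL  2.5   /* assumed internal reference, volts */

    /* Convert a signed 32-bit ADC1 result to volts (PGA gain = 1):
       full scale of +/-VREF/gain maps to the signed 32-bit code range. */
    double ads1262_code_to_volts(int32_t code)
    {
        return ((double)code / 2147483648.0) * VREF_NOMINAL;  /* 2^31 */
    }

With this conversion, a reference that actually sits at ~2.595 V instead of 2.5 V should show up as a gain error of roughly 2.595/2.500 - 1 ≈ 3.8 % of the reading, i.e. an error that scales with the input, whereas what we observe is a roughly constant ~70 mV deficit across the 100 mV to 1 V range. This is part of why the behavior puzzles us.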
Could you please have a look at our test board schematic and share your feedback?
Best regards
ilhantaygurt