Hello,
We are using the ADS1262 in high-precision measurement equipment, and we need exceptional DC precision. A constant ADC offset voltage is not a big problem, but we have to keep offset drift very low over hours and days.
Our current design includes the following input filter:
The main purpose of the 1k resistors is current limiting in case, for whatever reason, the input signals exceed the supply voltage. My question is how these resistors affect measurement performance. Both the absolute and the differential input current (PGA disabled) are specified as typically 150 nA, resulting in a 150 µV voltage drop across each resistor. I assume the input current is temperature-dependent and contributes to the overall temperature drift of the circuit. Would eliminating those resistors bring a significant benefit in DC precision? For example, I could bypass them with relays that are fast enough to open under all realistic overvoltage scenarios in our application.
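For reference, here is the back-of-envelope error budget behind the numbers above. The 150 nA figure is the datasheet typical input current mentioned in the post; the input-current temperature coefficient is a made-up placeholder (the datasheet does not give one), so the drift number is only illustrative:

```python
# Back-of-envelope error budget for the 1 kOhm protection resistors.
# I_IN_TYP is the ADS1262 typical input current (PGA disabled) from the
# datasheet; I_IN_TEMPCO is a HYPOTHETICAL placeholder, not a datasheet value.

R_PROT = 1e3          # protection resistor, ohms
I_IN_TYP = 150e-9     # typical absolute input current, A (datasheet typ.)
I_IN_TEMPCO = 1e-9    # assumed input-current drift, A/degC (placeholder)

offset_uV = I_IN_TYP * R_PROT * 1e6        # static drop per input, uV
drift_uV_per_C = I_IN_TEMPCO * R_PROT * 1e6  # drift per input, uV/degC

print(f"static offset per input: {offset_uV:.0f} uV")
print(f"drift per input: {drift_uV_per_C:.2f} uV/degC (assumed tempco)")
```

The static 150 µV term could be calibrated out; the concern is the drift term, which scales directly with the resistor value.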
I know that I could use the PGA to reduce the input currents, but one of the input signals is always close to GND, so I cannot meet the PGA's absolute input voltage requirements without greatly complicating the whole design.
Is there any other suitable way to improve DC performance? Would chop mode also eliminate the voltage drop caused by the input currents? Does chop mode alter the input impedance or the input currents in any way?