Hi,
We are using ADS1248 for current input sensing.
Please find the attached circuit for reference.
The input is applied across the CH0_RTD-/TC-/I+ and CH0_V-/I-/RTD- terminals (the CH0_V+/I+ terminal is shorted to CH0_RTD-/TC-/I+).
When we apply a 20 mA input, it flows through the 124 Ω resistor, giving a drop of 2.48 V.
This voltage is applied to a potential-divider network (909 kΩ and 100 kΩ), whose output of around 248 mV appears at the input of the CD4052B mux and is fed from there to the ADC.
We set PGA = 8 to bring this close to the internal reference (the internal reference is used in this case).
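As a cross-check of the numbers above, here is a short calculation of the signal chain. The names are ours, and the values are the ones quoted in this post; note that the 909k/100k divider ratio is 100/1009, so the divider output works out to about 246 mV rather than exactly one tenth of 2.48 V:

```c
#include <assert.h>
#include <math.h>

/* Values taken from the circuit description in this post. */
#define R_SHUNT_OHMS   124.0    /* current-sense resistor */
#define R_DIV_TOP_OHMS 909e3    /* divider top resistor */
#define R_DIV_BOT_OHMS 100e3    /* divider bottom resistor */
#define PGA_GAIN       8.0
#define VREF_VOLTS     2.048    /* ADS1248 internal reference */

/* Voltage across the shunt for a given loop current (in amps). */
double shunt_drop(double i_amps) { return i_amps * R_SHUNT_OHMS; }

/* Divider output seen at the mux/ADC input. */
double divider_out(double v_in)
{
    return v_in * R_DIV_BOT_OHMS / (R_DIV_TOP_OHMS + R_DIV_BOT_OHMS);
}

/* Differential voltage after the internal PGA. */
double pga_out(double v_adc_in) { return v_adc_in * PGA_GAIN; }
```

At 20 mA this gives a 2.48 V drop, about 246 mV at the ADC input, and about 1.97 V after the PGA, which sits just under the 2.048 V internal reference, consistent with the PGA = 8 choice.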
The ADC then gives a count proportional to the applied 0-20 mA input.
After reading the count we calibrate it.
Calibration: scaling the raw ADC count to 0 (at 0 mA) through 64000 (at 20 mA).
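The calibration step can be sketched as a two-point linear scaling. The function and variable names below are ours; raw_zero and raw_span would come from readings taken at 0 mA and 20 mA during a calibration step:

```c
#include <assert.h>
#include <stdint.h>

/* Map a raw ADC count onto 0..64000, using the raw counts measured
 * at 0 mA (raw_zero) and at 20 mA (raw_span) as the two reference
 * points. */
int32_t calibrate(int32_t raw, int32_t raw_zero, int32_t raw_span)
{
    /* 64-bit intermediate avoids overflow with 24-bit raw counts. */
    return (int32_t)(((int64_t)(raw - raw_zero) * 64000) /
                     (raw_span - raw_zero));
}
```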
Problem:
- When we apply the input from 0 to 20 mA in steps of 4 mA (0 mA, 4 mA, 8 mA, … 20 mA), we get a linear count.
- When we switch the input from 0 to 20 mA, the voltage at the input goes above 248 mV and takes some time (10 to 15 s) to stabilize, so the count fluctuates.
- The maximum RC delay in the circuit is 1 s (10 MΩ and 0.1 µF), so charging or discharging (with settling) should take about 5 s. Why, then, is it taking 10 to 15 s to settle? What could be the contributing factor?
- When we switch from 20 mA to 0 mA, the input at the ADC goes negative and takes the same 10 to 15 s to settle back to zero.
- Through the same channel we also apply voltage inputs (0 to 10 V and -10 to 10 V), and there we have not observed any problem.
So could you please suggest what the contributing factor for the delay could be? (Is it the ADC, or the RC network through which the voltage is not charging/discharging properly?)
Also, please tell us how to write the driver for this so that we can check the driver at our end.
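For reference, a minimal sketch of the pieces we would expect such a driver to need, using the ADS1248 SPI opcode and register layout as we read them from the datasheet (WREG = 0x40 | reg with a byte-count-minus-one second byte, SYS0 at address 0x03 with PGA in bits 6:4); please confirm these against the datasheet. The actual SPI transfer is left out, as it is platform-specific:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* ADS1248 opcodes and register addresses (per our reading of the
 * datasheet; please verify). */
#define ADS1248_CMD_WREG  0x40   /* 0100 rrrr: write starting at reg r */
#define ADS1248_CMD_RDATA 0x12   /* read one conversion result */
#define ADS1248_REG_SYS0  0x03   /* PGA in bits 6:4, DR in bits 3:0 */

/* SYS0 value for PGA = 8 (PGA bits = 011) and 5 SPS (DR bits = 0000). */
#define SYS0_PGA8_5SPS    0x30

/* Build the 3-byte WREG frame that writes one register.
 * Layout: opcode|addr, (count - 1), data. Returns the frame length. */
size_t ads1248_wreg_frame(uint8_t reg, uint8_t value, uint8_t out[3])
{
    out[0] = (uint8_t)(ADS1248_CMD_WREG | (reg & 0x0F));
    out[1] = 0x00;               /* writing (0 + 1) = 1 byte */
    out[2] = value;
    return 3;
}

/* Sign-extend the 24-bit two's-complement result read after RDATA. */
int32_t ads1248_extend24(uint8_t b2, uint8_t b1, uint8_t b0)
{
    int32_t v = ((int32_t)b2 << 16) | ((int32_t)b1 << 8) | b0;
    if (v & 0x800000)
        v -= 0x1000000;
    return v;
}
```

The driver would send such a WREG frame over SPI to set the PGA and data rate, then issue RDATA and sign-extend the three result bytes before the calibration scaling described above.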