Hello,
small question, I am sure someone knows the answer:
I am using the ADS1220 to read out an analog voltage. I have a roughly rectangular signal at about 60 Hz, so I want to measure the high phase (about 9 ms) and after that the low phase (also about 9 ms).
I am trying to understand how the sigma-delta converter works. I am used to SAR devices, which have a defined sample-and-hold time to capture the signal with little noise.
What is the equivalent on sigma-delta devices?
I did read about the actual modulator frequency, which in my case should be 256 kHz (normal mode). So my guess is that the modulator samples the input at 256 kHz, i.e. once every 3.91 µs.
Next, I can choose the output data rate from 20 samples per second up to 1000 samples per second.
My assumption is that the lower the output data rate, the more averaging occurs, since the sigma-delta modulator delivers more values to average over. Is that right?
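To make that assumption concrete, here is a rough back-of-the-envelope sketch (Python). The 256 kHz figure and the data-rate list are how I read the datasheet, and the exact digital filter behaviour is ignored:

```python
# Rough sketch of my assumption: at a fixed modulator rate, a lower output
# data rate means more modulator samples go into each result.
# The exact digital filter behaviour is ignored here.

F_MOD = 256_000                                  # modulator frequency in normal mode, Hz
DATA_RATES = [20, 45, 90, 175, 330, 600, 1000]   # selectable output data rates, SPS

print(f"modulator period: {1e6 / F_MOD:.2f} us")
for dr in DATA_RATES:
    samples_per_result = F_MOD / dr              # approximate oversampling ratio
    window_ms = 1000.0 / dr                      # rough length of the averaging window
    print(f"{dr:5d} SPS -> ~{samples_per_result:6.0f} modulator samples, ~{window_ms:5.1f} ms window")
```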
Now comes the real question, or rather my real issue with my design:
Say I want to measure the high phase: the voltage has a capacitive-looking rise, so it needs roughly 3 ms to reach a stable high level of, let's say, 1 V. It then stays on this plateau for the remaining time (9 ms minus the 3 ms rise), so about 6 ms.
I start a single-shot conversion on the ADS1220 right after I assume my signal has settled and is stable (with SAR ADCs I did basically the same), and now I want to know what happens next.
I have configured an output data rate of 175 samples per second; according to the datasheet this corresponds to a conversion time of 23762 cycles of the internal 4 MHz clock, which is around 6 ms.
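This is the quick check I did (the 4.096 MHz value for the internal oscillator is my reading of the datasheet, please correct me if that is off):

```python
# Quick arithmetic check of the conversion time at 175 SPS, assuming the
# internal oscillator runs at nominally 4.096 MHz.
t_conv_s = 23762 / 4.096e6
print(f"conversion time at 175 SPS: {t_conv_s * 1e3:.2f} ms")   # ~5.80 ms
```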
What does that conversion time mean, exactly?
As I understand it, in a SAR this would be the time during which the internally sampled voltage is converted into a digital value, so during this time my analog input could do whatever it wants and it would not change the digital result, since changes at the input no longer affect the already sampled voltage.
Now I assume this is quite different in a delta-sigma ADC, but this small yet very important detail is not spelled out anywhere. I assume that in my 175-samples-per-second case, the conversion time of about 6 ms means that during this window the modulator keeps sampling the input at 256 kHz (normal mode), the FIR filter is applied at the end, and I get an output averaged over exactly that window.
So if the analog input signal changes within those 6 ms, my output will change as well, since the ADC is still taking samples from the analog input during that time. Is that correct or wrong?
Because with my real signal, if I start the conversion after 3 ms and it takes 6 ms to sample, it could well happen that I sample into the time where the low level is already present at the analog input. I do not want to include samples of the low input voltage, so this would introduce a measurement error, correct?
So the solution would be to choose a higher data rate in order to shorten the conversion time, right?
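To show what I mean with the timing, here is the rough budget I am looking at (all figures are my own estimates from above, and I approximate the conversion window as 1/data-rate):

```python
# Timing-budget sketch for my signal: how much of the 9 ms high phase is left
# after the 3 ms rise plus a conversion window of roughly 1/data-rate?

HIGH_PHASE_MS = 9.0      # estimated duration of the high level
RISE_MS       = 3.0      # estimated capacitive-looking rise time

for dr in [20, 45, 90, 175, 330, 600, 1000]:
    conv_ms = 1000.0 / dr                        # rough conversion window length
    margin_ms = HIGH_PHASE_MS - (RISE_MS + conv_ms)
    print(f"{dr:5d} SPS: conversion ~{conv_ms:6.2f} ms, margin ~{margin_ms:+6.2f} ms")
```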
Thanks for reading.
cheers,
seb