
ADS1220 24-bit delta-sigma ADC: functionality question about conversion time


Hello,

A small question; I am sure someone knows the answer:

I am using the ADS1220 to read an analog voltage. I have a roughly rectangular signal at about 60 Hz, so I want to measure the high phase (about 9 ms) and after that the low phase (also about 9 ms).

I am trying to understand how the delta-sigma converter works. I am used to SAR devices, which have a defined sample-and-hold time to capture the signal accurately with little noise.

What is the equivalent in delta-sigma devices?

I read about the actual modulator frequency; in my case this should be 256 kHz (normal mode). So my guess is that the actual sampling happens at 256 kHz, i.e. every 3.91 µs.

Next, I can choose the output data rate from 20 samples per second to 1000 samples per second.

My assumption is that the fewer samples I put out, the more averaging occurs, since the delta-sigma modulator has more values to average over. Is that right?
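
To make this concrete for myself, here is a small C sketch of that assumption (the 256 kHz modulator clock and the normal-mode data rates are taken from my reading of the datasheet, and the real filter is a sinc filter rather than a plain average, so this is only an approximation):

```c
#include <stdio.h>

/* Rough sanity check of the assumption: oversampling ratio = modulator rate / data rate.
 * 256 kHz (normal mode) is my reading of the datasheet; the ADS1220's actual digital
 * filter is sinc-type, not a plain average, so this is only an approximation. */
int main(void)
{
    const double f_mod = 256000.0;  /* modulator sampling rate, Hz */
    const double data_rates[] = { 20, 45, 90, 175, 330, 600, 1000 };  /* normal mode, SPS */

    printf("modulator sample period: %.2f us\n", 1e6 / f_mod);

    for (unsigned i = 0; i < sizeof data_rates / sizeof data_rates[0]; i++) {
        double osr = f_mod / data_rates[i];  /* modulator samples per output sample */
        printf("%6.0f SPS -> ~%5.0f modulator samples per output value\n",
               data_rates[i], osr);
    }
    return 0;
}
```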

Now comes the real question, or rather my real issue with my design:

In case I want to measure the high phase: the voltage has a capacitive-looking rise, so it needs about 3 ms to reach a stable high voltage of, let's say, 1 V. It then stays on this high plateau for about 6 ms (9 ms minus the 3 ms rise time).

I start a single-shot conversion on the ADS1220 right after I assume my signal is settled and stable (with SAR ADCs I did basically the same), and now I want to know what happens next.

I use an output data rate of 175 samples per second. According to the datasheet, this yields a conversion time of 23762 cycles of the internal 4 MHz clock, which is around 6 ms.
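
Just to check my math on that number, a small sketch (the 23762-cycle figure at 175 SPS and the 4 MHz internal clock are the values I quoted above; please verify them against your datasheet revision):

```c
#include <stdio.h>

/* Conversion time from the cycle count in the datasheet table.
 * 23762 cycles at 175 SPS (normal mode) and the 4 MHz internal clock are the
 * values quoted in the question above -- verify against the datasheet. */
int main(void)
{
    const double f_clk  = 4.0e6;   /* internal oscillator, Hz */
    const double cycles = 23762.0; /* conversion time in clock cycles at 175 SPS */

    printf("conversion time: %.2f ms\n", 1e3 * cycles / f_clk);  /* ~5.94 ms */
    return 0;
}
```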

What does that conversion time exactly mean?

From my understanding, in a SAR this would be the time during which the internally sampled voltage is converted into a digital value, so during this time my analog input can do whatever it wants without changing my digital result, because input changes no longer affect the voltage that has already been sampled and held.


Now I assume that in the delta-sigma ADC this is quite different, but this small yet very important detail is not mentioned anywhere. I assume that in my 175 samples-per-second case, the conversion time of about 6 ms means that during this time the modulator keeps sampling at 256 kHz (normal mode), and at the end the digital filter is applied and I get an output averaged over exactly that time.

So if the analog input signal changes within that exact 6 ms, my output will differ as well, since the converter is still taking samples from the analog input. Is that correct or wrong?

Because with my real-world signal, if I start the conversion after 3 ms and it takes 6 ms to sample, it could well be that I am still taking samples when the low level is already present at the analog input. Since I do not want to include samples of the low input voltage, this implies a measurement error, correct?


So the solution would be to use a higher data rate in order to shorten the conversion time, right?
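
If it helps, here is the timing-budget check I have in mind, as a rough C sketch (I approximate the single-shot conversion time as one output data period, i.e. 1 / data rate; the exact cycle counts in the datasheet are slightly longer, so this is an optimistic estimate):

```c
#include <stdio.h>

/* Timing-budget check: the conversion must finish before the high plateau ends.
 * The single-shot conversion time is approximated as one output period (1/data rate);
 * the datasheet cycle counts are somewhat longer, so treat this as optimistic. */
int main(void)
{
    const double rise_ms = 3.0;  /* capacitive-looking rise time of the signal */
    const double half_ms = 9.0;  /* duration of the high phase */
    const double rates[] = { 90, 175, 330, 600, 1000 };  /* normal-mode data rates, SPS */

    for (unsigned i = 0; i < sizeof rates / sizeof rates[0]; i++) {
        double t_conv_ms = 1e3 / rates[i];  /* roughly one output period */
        int fits = (rise_ms + t_conv_ms) <= half_ms;
        printf("%6.0f SPS: ~%5.2f ms conversion -> %s\n",
               rates[i], t_conv_ms,
               fits ? "fits within the high plateau" : "runs into the low phase");
    }
    return 0;
}
```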

Thanks for reading.

cheers,

seb

  • Hi Seb,

    You cannot think of a delta-sigma converter the same way as you do a SAR converter. A delta-sigma is an oversampling converter that samples the input at the modulator rate. The word modulator is key: the output is a bit stream (for many delta-sigma converters, a single-bit ones-density bit stream) relative to a reference. Long story short, the benefit is noise shaping, where the noise is pushed into the higher frequencies and then removed by the digital filter employed by the ADC. This could be a simple averaging filter, a sinc filter, a very complicated FIR filter that provides a flat passband, or a combination of filters. The data rate is the output rate of the completed process. A SAR converter takes a snapshot in time and provides a result for that specific instant, whereas the delta-sigma converter takes many samples of data and processes the result over a vastly larger period of time.
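
    If it helps to picture the oversample-then-filter idea, here is a toy C illustration (this is only a plain average of a hard-coded 1-bit stream, not the ADS1220's actual sinc filter):

    ```c
    #include <stdio.h>

    /* Toy illustration only: averaging a 1-bit modulator stream recovers a
     * higher-resolution value.  The ADS1220's real filter is sinc-type, not this
     * plain boxcar average, but the oversample-then-filter idea is the same. */
    int main(void)
    {
        /* Pretend the input sits at 0.30 of full scale: the modulator then outputs
         * ones with a density of about 30%.  The stream is hard-coded for the demo. */
        const int bits[] = { 0,1,0,0,1, 0,0,1,0,0, 1,0,0,1,0, 0,0,1,0,0 };
        const int n = sizeof bits / sizeof bits[0];

        int ones = 0;
        for (int i = 0; i < n; i++)
            ones += bits[i];

        printf("ones density = %.2f of full scale (from %d oversampled bits)\n",
               (double)ones / n, n);
        return 0;
    }
    ```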

    For the ADS1220, the filter response is a fully settled digital result within one output cycle. However, any analog settling during that same time period will be reflected in the result, which brings about the measurement error you mentioned. To avoid this issue of analog settling (or an analog voltage transition), you will need to increase the data rate so that the time for the result to become available is shorter. You can fine-tune this somewhat by applying an external clock.
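
    As a rough illustration of that fine tuning (the 4 MHz nominal clock and the 23762-cycle figure come from the question above; the exact numbers, and the allowed external clock range, are in the datasheet timing tables):

    ```c
    #include <stdio.h>

    /* The conversion-time tables are specified in clock cycles, so the time in
     * seconds scales with the clock you supply.  Cycle count and clock values here
     * are taken from the discussion above -- check them against the datasheet. */
    static double conversion_time_ms(double cycles, double f_clk_hz)
    {
        return 1e3 * cycles / f_clk_hz;
    }

    int main(void)
    {
        const double cycles = 23762.0;  /* 175 SPS, normal mode (quoted above) */

        printf("internal ~4.000 MHz clock: %.2f ms\n", conversion_time_ms(cycles, 4.000e6));
        printf("external  4.096 MHz clock: %.2f ms\n", conversion_time_ms(cycles, 4.096e6));
        return 0;
    }
    ```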

    I suggest you take a look at the blog posts that discuss delta-sigma converters:

    http://www.ti.com/hpa-pa-dsig-dsigpsseries-thehub-20150625-blog-dsbasics-en

    Best regards,

    Bob B

  • OK, that kind of tells me I am right. So in my case I should go to a different samples-per-second rate in order to measure only the "stable" voltage and not run into voltage changes.

    I have some other issues with my system right now, mainly in the analog domain, but tomorrow I am going to test some ADC results.
    Thanks for your thoughts.