Why is the ADS131E0x start settling time for 16-bit modes based on 24-bit sample times?

Hello!

Looking at Table 7 of the IC datasheet, I'm trying to figure out the reason behind the settling times for the 64 kSPS and 32 kSPS modes being, respectively, 152 and 296 fCLK cycles.

I do understand there is, apparently, a 4 tCLK delay to start conversions and a 4 tCLK delay to output the conversion result into the shift register, so we're left with 144 and 288 fCLK cycles.
fMOD = fCLK / 2, so that leaves 72 and 144 fMOD cycles.
There is also the 3-sample settling time of the sinc LPF, and there is the decimation ratio (1 for 64 kSPS, 2 for 32 kSPS). So, dividing by 3 and 6, I end up with 24 fMOD cycles per single sample, which I find weird, because with decimation ratios of 1 and 2 the ADC works in 16-bit mode.
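A minimal sketch of that accounting (the Table 7 numbers are from the datasheet; the 4 + 4 tCLK delays are my assumption from above):

```python
# Back out the per-sample cost from the Table 7 settling times.
START_AND_SHIFT_DELAY = 4 + 4  # assumed: 4 tCLK to start + 4 tCLK to shift out

for name, settling_tclk, decimation in [("64 kSPS", 152, 1), ("32 kSPS", 296, 2)]:
    fmod_cycles = (settling_tclk - START_AND_SHIFT_DELAY) // 2  # fMOD = fCLK / 2
    per_sample = fmod_cycles // (3 * decimation)                # 3-sample sinc settling
    print(f"{name}: {per_sample} fMOD cycles per sample")       # prints 24 for both
```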

What's more, after the initial start delay, a single sample is evidently taken every 16 fMOD cycles; otherwise the device wouldn't meet the 64 kSPS and 32 kSPS data rates (a quick check of this follows below).
The question arises: why is the start settling time based on 24-bit sampling times in the 16-bit sampling modes?
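Here is the numeric check of that one-sample-per-16-fMOD-cycles rate (a minimal sketch; the nominal fCLK = 2.048 MHz is my assumption, giving the fMOD = 1.024 MHz used later in this thread):

```python
# Steady-state data-rate check: one output sample per 16 fMOD cycles,
# scaled by the decimation ratio. fCLK = 2.048 MHz is an assumed nominal value.
f_clk = 2.048e6
f_mod = f_clk / 2                          # fMOD = fCLK / 2 -> 1.024 MHz

for decimation in (1, 2):                  # 64 kSPS and 32 kSPS modes
    data_rate = f_mod / (16 * decimation)  # 16 fMOD cycles per sample
    print(f"decimation {decimation}: {data_rate / 1e3:.0f} kSPS")  # 64, then 32
```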

The given calculations make perfect sense to me for the slower, 24-bit modes, where I also get 24 fMOD cycles per sample.

  • Hello Dominik -
    You might want to recheck Table 4 in the datasheet. OSR @ 64 kSPS = 16, not 1.
    Oversampling converters rely on the OSR for the benefit of this architecture, so OSR = 1 would provide no oversampling advantage.

    Based on OSR = 16:
    settling time = 4 x tDR = 4 x OSR / fMOD = 4 x OSR x 2 x tCLK => 128 tCLK
    The difference between 128 and 152 can be accounted for by digital delays within the device.
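    As a numeric sketch of that estimate (OSR = 16 per Table 4; fMOD = fCLK / 2 per the datasheet):

    ```python
    # Settling estimated as 4 conversion periods at the output data rate,
    # expressed in tCLK: 4 x tDR = 4 x OSR x tMOD = 4 x OSR x 2 x tCLK.
    osr = 16                      # Table 4, 64 kSPS mode
    settling_tclk = 4 * osr * 2   # -> 128 tCLK
    print(settling_tclk)          # 128; the remaining 152 - 128 = 24 tCLK
                                  # is attributed to internal digital delays
    ```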
  • Hey Greg,

    Thank you for your reply.

    I assumed the OSR should be equal to the number of fMOD cycles per sample I was getting in my first post (I end up with 24 in every case, not 1), but I see my initial guess about an equal digital delay in each of the sampling modes might be wrong.

    So I'm trying to understand the reason behind the variable time difference between sampling speeds. I do understand the device needs to react to the START signal and start operating, but shouldn't that time be constant across the various sampling speeds? I find that strange, especially considering that, when working in continuous mode, there is no additional delay between taking consecutive samples in the 16-bit modes; for example, with 64 kSPS:
    fMOD / OSR = 1.024 MHz / 16 = 64 kHz

    For 64 kSPS there's a 24 tCLK difference (152 - 128), but for 32 kSPS there's already a 40 tCLK difference (296 - 256; I multiplied your settling time result by two). What's causing the additional 16 tCLK of delay?
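    A small sketch of that arithmetic (OSR = 32 at 32 kSPS is my assumption from doubling your result):

    ```python
    # Gap between the Table 7 settling times and 4 conversion periods
    # estimated as 4 x OSR x 2 tCLK.
    for name, table7_tclk, osr in [("64 kSPS", 152, 16), ("32 kSPS", 296, 32)]:
        estimate_tclk = 4 * osr * 2  # Greg's formula, in tCLK
        print(f"{name}: {table7_tclk - estimate_tclk} tCLK unaccounted for")  # 24, 40
    ```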

    I also cannot correctly relate the equation you gave to any of the slower conversions, using OSR = 24 and higher decimation ratios. Could you review it?