ADS1256: Oversampling interval

Part Number: ADS1256

Hello,

This is not regarding the ADS1256 specifically, but a general ADC question:

When oversampling for increased resolution, how important is it to have an exactly identical interval between the samples?

Best regards

Niclas

  • Hi Niclas,

    A delta-sigma ADC like the ADS1256 automatically oversamples the input per the datasheet specs, so you do not need to do this manually.

    If you wanted to perform additional oversampling in your MCU, then that is also possible. The time interval between samples does not matter much, as long as you are meeting the timing specs in the ADC's datasheet. With oversampling you are usually trying to take many samples very quickly, so it is the sampling rate that matters most, especially for AC signals that can vary rapidly with time. For slow-moving DC signals, e.g. temperature, this is less important.
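
    For reference, here is a minimal sketch of what that MCU-side averaging might look like. The read_adc_sample() function is just a placeholder for however you read one conversion result from the ADC (e.g. RDATA over SPI); the point is the accumulate-and-average structure, not ADS1256-specific code.

        #include <stdint.h>

        /* Placeholder: returns one signed conversion result from the ADC.
         * Replace with your actual read routine. */
        extern int32_t read_adc_sample(void);

        /* Average n consecutive samples (assumes n >= 1). Accumulating
         * in a 64-bit integer avoids overflow for any practical n. */
        int32_t oversample_average(uint32_t n)
        {
            int64_t acc = 0;
            for (uint32_t i = 0; i < n; i++) {
                acc += read_adc_sample();
            }
            return (int32_t)(acc / (int64_t)n);
        }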

    -Bryan

  • Hi Bryan,

    Thanks for a good answer. I now understand that I should have asked a slightly different question:

    I am converting an old application based on the ADS1256 to a modern MCU, which means a lot of new and rewritten code. The new application is still based on the ADS1256. Part of this work is understanding what the previous programmer did and why.

    It seems that a sample rate of 1000 samples per second (SPS) is used consistently (DRATE = 0xA1). During calibration, 20 or 80 subsequent measurements are averaged. My initial question was actually an attempt to understand whether these subsequent measurements should be separated by an identical interval. Judging by your reply, the answer to that is "not really".
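
    For context, the legacy code sets the data rate roughly like the sketch below. The spi_write() function is a stand-in for our actual SPI routine; the command and register values are from the ADS1256 datasheet (WREG is 0x50 plus the register address, and DRATE is register 0x03).

        #include <stdint.h>

        /* Stand-in for our actual SPI transmit routine. */
        extern void spi_write(const uint8_t *data, uint32_t len);

        #define ADS1256_CMD_WREG    0x50  /* WREG command base */
        #define ADS1256_REG_DRATE   0x03  /* DRATE register address */
        #define ADS1256_DRATE_1000  0xA1  /* 1000 SPS per the DRATE table */

        /* Write a single register: command byte, count byte (n-1 = 0), data. */
        static void ads1256_set_drate(uint8_t drate)
        {
            uint8_t frame[3] = { ADS1256_CMD_WREG | ADS1256_REG_DRATE, 0x00, drate };
            spi_write(frame, sizeof frame);
        }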

    Now I wonder how running the ADS1256 once at 50 SPS compares to running it 20 times at 1000 SPS and averaging. Should these two methods give approximately the same resolution? (If I understand correctly, this is also a question of frequency response, buffer input impedance, and the time it takes the ADS1256 to produce a measurement result, which might be important for the rest of the application.)

    Best regards

    Niclas

  • Hi Niclas,

    In general, I would recommend consistent spacing between commands or groups of commands; my point was just that this is not absolutely necessary. I have seen some engineers use what appears to be random spacing, which makes reading data and debugging more challenging. But it will still work, as long as the timing requirements are met.

    A delta-sigma ADC uses a technique called noise shaping to improve the noise performance of the device, shifting the noise to higher frequencies where it is then removed by the digital filter. Averaging in the MCU reduces noise by the square root of the number of averages, but does not benefit from noise shaping. If you look at the noise in Table 1, it is 629 nVRMS at G = 1, ODR = 50 SPS, and 2,931 nVRMS at G = 1, ODR = 1000 SPS. With 20 averages you can expect a noise reduction of √20 ≈ 4.47, so your noise could theoretically drop to 2,931 / 4.47 ≈ 656 nVRMS, which is still slightly higher than the ADC's own noise at 50 SPS.
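
    For reference, the estimate works out as in the small sketch below (the figures are the Table 1 values quoted above; the √N scaling assumes the sample noise is uncorrelated):

        #include <math.h>
        #include <stdio.h>

        int main(void)
        {
            const double noise_1000sps_nv = 2931.0; /* Table 1: G = 1, 1000 SPS */
            const double noise_50sps_nv   = 629.0;  /* Table 1: G = 1, 50 SPS   */
            const int    n_avg            = 20;

            /* Averaging N uncorrelated samples reduces RMS noise by sqrt(N). */
            double averaged_nv = noise_1000sps_nv / sqrt((double)n_avg);

            printf("20x averaged, 1000 SPS: %.0f nVRMS\n", averaged_nv);   /* ~655 */
            printf("Native 50 SPS:          %.0f nVRMS\n", noise_50sps_nv);
            return 0;
        }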

    If your system can afford to output data at a lower rate, it might make more sense to let the ADC perform this function. If some operations are applied to the data before averaging, e.g. removing the highest and lowest values, then those should still be done in the MCU.
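
    If you do keep the averaging in the MCU, the trimmed average mentioned above might look something like this sketch (assumes at least three samples):

        #include <stdint.h>

        /* Average the samples after discarding the single highest and
         * lowest values. Assumes n >= 3. */
        int32_t trimmed_average(const int32_t *samples, uint32_t n)
        {
            int64_t acc = 0;
            int32_t lo = samples[0], hi = samples[0];

            for (uint32_t i = 0; i < n; i++) {
                if (samples[i] < lo) lo = samples[i];
                if (samples[i] > hi) hi = samples[i];
                acc += samples[i];
            }
            return (int32_t)((acc - lo - hi) / (int64_t)(n - 2));
        }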

    -Bryan

  • Thanks Bryan,

    Another high-quality reply. Much appreciated.

    Best regards

    Niclas