How to calculate the accuracy achievable with the ADS1247

Other Parts Discussed in Thread: ADS1247

I am using the ADS1247 to measure the voltages listed in the attached file.

Minimum voltage to measure = 59067uV

Maximum voltage to measure = 979671uV

AVDD = DVDD = 3.3V

IDAC = 50uA

PGA = 2

Internal reference = 2.048V

1. Will I be able to measure the voltage range I need?

2. How do I theoretically calculate the accuracy the ADS1247 can achieve with the above parameters?

8182.Book1.xlsx

  • Hi Aamir,

    There are a couple of things to consider when determining accuracy.  Oftentimes, to get a high level of accuracy you will need to calibrate your system.  Delta-sigma data converters are precision converters, and precision is often mistaken for accuracy.  Precision refers to repeatability, which means that I can measure the same voltage over and over again and get the same or nearly the same result (within the level of noise).

    Accuracy, on the other hand, refers to how closely the returned result matches what I expect based on the input to the ADC.  You might ask, 'What is the difference?'  Accuracy relates to how close to the target response I am getting.  This could be a very noisy distribution whose mean is centered on the expected result.  Precision relates to the distribution itself: a very tight distribution is precise.  A precise measurement could have a mean very different from the expected value; this is often offset, but may also include gain and other error factors that make the measurement less accurate.  The best situation is to have both precision and accuracy, and this usually requires calibration.

    To determine accuracy you need to consider not only the accuracy of the ADS1247, but also the other components within your system.  When looking at the ADC portion, the best thing to do first is check the precision to see whether the ADC is even capable of resolving the required measurement.  From the spreadsheet it appears you need to resolve increments of around 90uV.  Although your specific case is not listed in the datasheet noise tables, we can get a general idea.  At a gain of 2, at 20sps, with the internal reference, the best case is 18 bits of effective resolution.  To get 0.05 deg C resolution, which is about 90uV steps, you will need about 13.5 bits.  This appears to be sufficient for your use case.

    When discussing accuracy and precision, we need to limit the noise and drift sources.  Instead of using the internal reference at a gain of 2, why not make the measurement ratiometric to limit the effects of noise and drift?  You could use the biasing resistor, which is already required to place your sensor output within the correct common-mode range, to also establish the reference.  If you create the bias voltage at 1V, resulting in a reference of 1V, you should still be able to achieve the same result with less impact from noise and drift.

    The unadjusted error of the ADC will be determined by the root-sum-square (RSS) of the initial accuracy of the IDAC source, the offset and gain error of the ADC, the accuracy/drift of the bias/reference resistor, the ADC INL, and the ADC noise.  Total unadjusted error would also include the accuracy of the sensor and the accuracy/drift of the linearization resistor.  A calibrated system will greatly reduce these errors.

    Best regards,
    Bob B