[FAQ] ADC Noise: What is the best noise parameter to use for system noise analysis?

What is the best noise parameter to use for system noise analysis when using Delta-Sigma ADCs?

  • For ADC noise analysis, we recommend using input-referred noise. We emphasize this because it's not common practice to use input-referred noise to define ADC performance. In fact, most engineers speak exclusively in terms of relative parameters such as effective resolution and noise-free resolution, and are deeply concerned when they cannot maximize those values. After all, if you need a 24-bit ADC to achieve 16 bits of effective resolution, it feels like you're paying for ADC performance you won't actually use.

    However, an effective resolution of 16 bits doesn't necessarily tell you anything about how much of the full-scale range (FSR) your ADC actually uses. You may only need 16 bits of effective resolution, but if your minimum input signal is 50 nV, a true 16-bit ADC will never resolve it: even with a modest 5-V FSR, one 16-bit code step is roughly 76 µV, more than three orders of magnitude larger than the signal. The true benefit of a high-resolution delta-sigma ADC is therefore the low input-referred noise it offers. This does not mean that effective resolution is unimportant – just that it is not the best way to parameterize a system.
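
    To make the relationship concrete, here is a minimal Python sketch (not from the article; the function names and the 5-V and 0.1-V example ranges are our own). It assumes the standard definitions ER = log2(FSR / V_N,RMS) and NFR = log2(FSR / V_N,PP), with peak-to-peak noise approximated as 6.6 × RMS for Gaussian noise:

    ```python
    import math

    CREST_FACTOR = 6.6  # common RMS-to-peak-to-peak multiplier for Gaussian noise

    def input_referred_noise_rms(fsr_volts: float, er_bits: float) -> float:
        """Invert ER = log2(FSR / Vn_rms) to recover RMS input-referred noise."""
        return fsr_volts / (2.0 ** er_bits)

    def effective_resolution(fsr_volts: float, vn_rms: float) -> float:
        """Effective resolution: ER = log2(FSR / Vn_rms)."""
        return math.log2(fsr_volts / vn_rms)

    def noise_free_resolution(fsr_volts: float, vn_rms: float) -> float:
        """Noise-free resolution uses peak-to-peak noise (~6.6 x RMS)."""
        return math.log2(fsr_volts / (CREST_FACTOR * vn_rms))

    # Same 16-bit effective resolution, two different full-scale ranges:
    for fsr in (5.0, 0.1):
        vn = input_referred_noise_rms(fsr, 16)
        print(f"FSR = {fsr:g} V -> input-referred noise = {vn * 1e6:.2f} uV RMS")
    ```

    Both cases leave the input-referred noise well above 50 nV, which is why the absolute parameter, not the relative one, determines what your system can actually resolve.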

    Part 3 takes these claims one step further with a design example that uses both noise-free resolution and input-referred noise to define a system noise parameter. Which one enables the quickest, most adaptable solution? Read the article to discover the answer.