Hi guys
The standard method for measuring the noise-free resolution of an ADC is to ground the input and then plot a histogram of the output codes to find the standard deviation.
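For reference, this is roughly how I'm computing it (a minimal Python sketch, with a simulated capture standing in for the real ADC data; the 12-bit width and the 6.6*sigma peak-to-peak factor are my assumptions):

```python
import numpy as np

N_BITS = 12  # assumed converter width

def noise_free_bits(codes, n_bits=N_BITS):
    """Noise-free resolution from a block of ADC output codes."""
    sigma = np.std(codes)  # code standard deviation, in LSBs
    # Peak-to-peak noise is commonly taken as 6.6 * sigma for Gaussian noise.
    return n_bits - np.log2(6.6 * sigma), sigma

# Simulated grounded-input capture (stand-in for real samples);
# the small offset code and 1.5 LSB sigma are arbitrary.
rng = np.random.default_rng(0)
codes = np.round(rng.normal(loc=8, scale=1.5, size=100_000)).astype(int)

# Histogram of the output codes, one bin per code.
hist, edges = np.histogram(codes, bins=np.arange(codes.min(), codes.max() + 2))

nfb, sigma = noise_free_bits(codes)
print(f"sigma = {sigma:.2f} LSB, noise-free resolution ~ {nfb:.1f} bits")
```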
What I have found is that if I apply a DC input voltage other than ground, say VREFH/2, the measured standard deviation increases roughly linearly with the applied DC voltage.
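To be concrete, the sweep I'm doing looks like this (again just a sketch; in the real test the codes come from the ADC, and the linear-in-level noise term below is only a placeholder that reproduces what I measure, not an explanation):

```python
import numpy as np

VREFH = 3.3   # assumed reference voltage
N_BITS = 12   # assumed converter width
rng = np.random.default_rng(0)

for frac in (0.1, 0.25, 0.5, 0.75, 0.9):
    # Placeholder noise model: sigma grows linearly with the DC level,
    # which is the behaviour I'm seeing but cannot explain.
    sigma_model = 0.5 + 4.0 * frac
    mean_code = frac * (2**N_BITS - 1)
    codes = np.round(rng.normal(mean_code, sigma_model, 100_000))
    print(f"Vin = {frac * VREFH:.2f} V -> sigma = {np.std(codes):.2f} LSB")
```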
Can anyone help explain why the variance of the output codes increases as the DC input is increased?
I have observed this effect not only on the SAR ADC of an MSP430, but also on a sigma-delta converter.
Regards
Bob