Hello,
I am looking for a more detailed explanation of what occurs during the calibration process of the ADC08D1020.
From the datasheet, it appears that the analog input differential termination resistor is adjusted to minimize full-scale error, offset error, DNL, and INL, which in turn improves SNR, THD, SINAD (SNDR), and ENOB.
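(For context on why those last two track together: by the standard definition, ENOB = (SINAD - 1.76 dB) / 6.02, so any calibration step that improves SINAD improves ENOB by the same measure.)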
Is this resistor adjustment the extent of the calibration for this device? Are there any other operational parameters that are adjusted, or key parameters that are not adjusted, during this process?
I appreciate your feedback.
Ryan May