For DC input, I see that the offset (with Vin+ and Vin− shorted) is specified at a typical -0.1% FS (-60 dBFS), but it can be as much as -0.9% FS (-41 dBFS, ~6.5 bits ENOB).
Is this correct (effectively 6.5 ENOB worst case at DC)?
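For reference, here is the arithmetic behind those numbers as a quick sketch: converting a %FS error to dBFS, then treating that level as if it were a SINAD figure in the standard ENOB formula, ENOB = (SINAD - 1.76) / 6.02. (Strictly speaking, a static offset is not the same thing as SINAD, which is why the equivalence is only "effective".)

```python
import math

def pct_fs_to_dbfs(pct):
    """Convert a full-scale percentage error to dBFS."""
    return 20 * math.log10(abs(pct) / 100)

def dbfs_to_enob(dbfs):
    """Treat an error level in dBFS as a SINAD and convert it to
    effective bits: ENOB = (SINAD - 1.76) / 6.02."""
    return (abs(dbfs) - 1.76) / 6.02

print(round(pct_fs_to_dbfs(0.1), 1))                  # typical: -60.0 dBFS
print(round(pct_fs_to_dbfs(0.9), 1))                  # worst case: -40.9 dBFS
print(round(dbfs_to_enob(pct_fs_to_dbfs(0.9)), 1))    # ~6.5 effective bits
```

So the -60 dBFS / -41 dBFS / ~6.5-bit figures in the question are internally consistent under that reading.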
Do we have any way to calibrate this out other than having the processor that monitors the ADC apply a software correction?
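If the processor-side route ends up being the answer, the usual scheme is simple: short the inputs, average N samples to estimate the offset, then subtract that estimate from later readings. A minimal sketch, where `read_adc()` is a hypothetical placeholder for the actual driver call:

```python
def measure_offset(read_adc, n=1024):
    """Estimate the DC offset by averaging n samples.
    The ADC inputs are assumed to be externally shorted while this runs."""
    return sum(read_adc() for _ in range(n)) / n

def corrected_sample(read_adc, offset):
    """Return one reading with the stored offset subtracted."""
    return read_adc() - offset

# Demo with a fake ADC that always reports a constant offset of 100 codes.
fake_adc = lambda: 100
offset = measure_offset(fake_adc)
print(offset)                              # 100.0
print(corrected_sample(fake_adc, offset))  # 0.0
```

Averaging many samples also suppresses noise in the offset estimate; the residual error is then limited by offset drift (temperature, aging) between calibration runs, which is why some parts offer an on-chip auto-zero or system-calibration mode instead.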