Does 1 LSB correspond to the range 0 - 820µV, or just to 820µV itself? If it is only 820µV, what happens to the voltages that fall between 2 LSB and 3 LSB? It would seem the sample switches/caps can capture any analog voltage on the pin, and the converter has the entire code range (0-4095) available to turn that voltage into a digital value, even when it lands somewhere between two 820µV steps.
So 2^12 = 4096 codes, full scale is 4095 * 820µV ≈ 3.358V, and code 0 (the first LSB) should correspond to 0V? But does the +/-3 LSB precision error then add noise into the digital conversion results, i.e. an uncertainty of up to 2460µV (0V - 2460µV at the bottom of the range)?
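For reference, here is the arithmetic I'm assuming, as a short C sketch (the 820µV/LSB step and the +/-3 LSB error band are just the numbers from above; the function and variable names are only for illustration, not from any vendor driver):

```c
#include <stdio.h>
#include <stdint.h>

#define LSB_UV    820U    /* assumed step size: 820 uV per LSB        */
#define ADC_CODES 4096U   /* 2^12 codes, valid codes 0..4095          */
#define ERR_LSB   3U      /* assumed +/-3 LSB precision error         */

/* Ideal transfer function: every voltage inside one 820 uV bin maps to
 * the same code, e.g. 1640 uV .. 2459 uV all return code 2.           */
static uint16_t ideal_code(uint32_t vin_uv)
{
    uint32_t code = vin_uv / LSB_UV;               /* integer divide = floor */
    return (code > ADC_CODES - 1U) ? (uint16_t)(ADC_CODES - 1U)
                                   : (uint16_t)code;
}

int main(void)
{
    uint32_t full_scale_uv = (ADC_CODES - 1U) * LSB_UV;  /* 4095 * 820 uV  */
    uint32_t err_band_uv   = ERR_LSB * LSB_UV;           /* 3 * 820 uV     */

    printf("full scale    : %lu uV (~3.358 V)\n", (unsigned long)full_scale_uv);
    printf("error band    : +/-%lu uV\n",         (unsigned long)err_band_uv);
    printf("code @ 0 uV   : %u\n", ideal_code(0U));      /* -> 0           */
    printf("code @ 819 uV : %u\n", ideal_code(819U));    /* still -> 0     */
    printf("code @ 2050 uV: %u\n", ideal_code(2050U));   /* between 2 and 3 LSB -> 2 */
    return 0;
}
```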
Oddly, it does not seem like the zero point relative to VDDA is actually 0V; it carries a much higher offset, even when tied to the DGND plane. Conversion results below 50mV are not possible, nor can they maintain <1/4 LSB precision, if any channel noise drives an odd offset state relative to VDDA.
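In case it helps show what I'm trying to do, here is roughly how I would characterize and remove that offset - a minimal sketch only, assuming a hypothetical read_adc_raw() that returns the raw 12-bit code for a channel tied to the ground plane (none of these names come from a real HAL):

```c
#include <stdint.h>

#define LSB_UV 820U   /* assumed step size: 820 uV per LSB */

/* Hypothetical driver call: returns the raw 12-bit result for a channel. */
extern uint16_t read_adc_raw(uint8_t channel);

/* Average N conversions of a channel tied to the ground plane to estimate
 * the zero-code offset I'm seeing (readings never coming in below ~50 mV). */
uint32_t measure_offset_uv(uint8_t gnd_channel, uint16_t samples)
{
    uint32_t sum = 0;
    for (uint16_t i = 0; i < samples; i++) {
        sum += read_adc_raw(gnd_channel);
    }
    return (sum / samples) * LSB_UV;   /* average code converted to uV */
}

/* Subtract the measured offset from later readings (floored at zero). */
uint32_t corrected_uv(uint16_t raw_code, uint32_t offset_uv)
{
    uint32_t uv = (uint32_t)raw_code * LSB_UV;
    return (uv > offset_uv) ? (uv - offset_uv) : 0U;
}
```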