How can the TM4C1294's ADC channels be offset-calibrated to 0 V (referenced to VDDA)? The datasheet has some discussion of detecting shorted inputs, but no mention of the VDDA offset calibration that other TI MCU classes can perform. The idea would be to trim all channels to a 0 V offset, rather than each channel sitting at a slightly higher or lower DC level prior to sampling. It seems odd that such a wide silicon design variance exists between 12-bit SAR ADCs from the same company.
Seemingly there should be some way to set the channel offsets equally to 0 V for the sequencer steps, perhaps via offsets being pre-loaded into the FIFO? For instance, two channel steps (the same two sensor part numbers) require very different filter math before their ambient temperature readings calibrate equally. This appears directly related to the analog channels' FIFO values being offset from 0 V by many LSB counts.
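Lacking a documented per-channel offset register on this part, the only approach I can see is a software offset correction: hold the inputs at a known 0 V (or a common reference) for a calibration pass, average the residual codes per sequencer step, and subtract them from later FIFO reads. Below is a minimal TivaWare-style sketch of that idea; the choice of ADC0, sample sequencer 0, AIN0/AIN1 (PE3/PE2), and the function names acquireOffsets()/adcReadCorrected() are my assumptions for illustration, not anything from the datasheet.

```c
/* Software per-step offset correction sketch for TM4C1294 + TivaWare.
 * Assumptions (mine, not the datasheet's): ADC0, sequencer 0, two steps
 * on AIN0 (PE3) and AIN1 (PE2), and that the inputs can be held at 0 V
 * while acquireOffsets() runs. Correction happens in software after
 * ADCSequenceDataGet(), since no hardware offset register is documented. */

#include <stdint.h>
#include <stdbool.h>
#include "inc/hw_memmap.h"
#include "driverlib/sysctl.h"
#include "driverlib/gpio.h"
#include "driverlib/adc.h"

#define NUM_STEPS   2           /* sequencer steps in use              */
#define CAL_SAMPLES 64          /* averaging depth for the offset pass */

static int32_t g_offsetCounts[NUM_STEPS];   /* per-step residual codes */

static void adcSetup(void)
{
    SysCtlPeripheralEnable(SYSCTL_PERIPH_ADC0);
    SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOE);
    while (!SysCtlPeripheralReady(SYSCTL_PERIPH_ADC0)) { }

    /* AIN0 = PE3, AIN1 = PE2 on the TM4C1294 */
    GPIOPinTypeADC(GPIO_PORTE_BASE, GPIO_PIN_3 | GPIO_PIN_2);

    ADCSequenceConfigure(ADC0_BASE, 0, ADC_TRIGGER_PROCESSOR, 0);
    ADCSequenceStepConfigure(ADC0_BASE, 0, 0, ADC_CTL_CH0);
    ADCSequenceStepConfigure(ADC0_BASE, 0, 1,
                             ADC_CTL_CH1 | ADC_CTL_IE | ADC_CTL_END);
    ADCSequenceEnable(ADC0_BASE, 0);
    ADCIntClear(ADC0_BASE, 0);
}

/* Run one triggered conversion and pull the FIFO into raw[NUM_STEPS]. */
static void adcReadRaw(uint32_t *raw)
{
    ADCProcessorTrigger(ADC0_BASE, 0);
    while (!ADCIntStatus(ADC0_BASE, 0, false)) { }
    ADCIntClear(ADC0_BASE, 0);
    ADCSequenceDataGet(ADC0_BASE, 0, raw);
}

/* Average CAL_SAMPLES conversions with the inputs at 0 V and store the
 * residual code of each step as that step's offset. */
void acquireOffsets(void)
{
    uint32_t raw[NUM_STEPS];
    int32_t  acc[NUM_STEPS] = { 0 };

    for (uint32_t n = 0; n < CAL_SAMPLES; n++) {
        adcReadRaw(raw);
        for (uint32_t s = 0; s < NUM_STEPS; s++) {
            acc[s] += (int32_t)raw[s];
        }
    }
    for (uint32_t s = 0; s < NUM_STEPS; s++) {
        g_offsetCounts[s] = acc[s] / (int32_t)CAL_SAMPLES;
    }
}

/* Normal read path: subtract each step's stored offset so both sensor
 * channels start from the same 0-count baseline. */
void adcReadCorrected(int32_t *corrected)
{
    uint32_t raw[NUM_STEPS];

    adcReadRaw(raw);
    for (uint32_t s = 0; s < NUM_STEPS; s++) {
        int32_t v = (int32_t)raw[s] - g_offsetCounts[s];
        corrected[s] = (v < 0) ? 0 : v;    /* clamp below zero */
    }
}
```

Usage would be adcSetup() once, acquireOffsets() once while the inputs are grounded (or with both sensors at the same known reference), then adcReadCorrected() for normal sampling, so that both steps feed the downstream filter math from an equal baseline. Is there a cleaner, hardware-assisted way to do this on the TM4C129x class?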