Customer is developing driver software to read data from the ADS130E08. They are using the internal test signal as a quick check of the software readback path, and we want to verify our understanding of what the results should look like.
With Vref = 2.4 V and PGA gain = 1, the LSB size is 2.4 V / (2^15 - 1) = 73.24 uV. The test signal is configured for 1 mV amplitude, DC output. From the test signal register description, the test signal is actually -1 mV. Is this correct?
With an LSB size of 73.24 uV, 1 mV corresponds to 13.65 counts. So if the test signal is -1 mV, the ADC result should be -13 or -14 decimal, which is FFF3h or FFF2h in 16-bit two's complement. Is this correct?
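As a sanity check, the arithmetic above can be sketched in a few lines of Python. The values here (Vref = 2.4 V, gain = 1, -1 mV DC test signal) are just the assumptions stated in this post, not datasheet-verified register settings:

```python
VREF = 2.4                         # reference voltage, volts (assumed)
GAIN = 1                           # PGA gain (assumed)
LSB = VREF / (2**15 - 1) / GAIN    # ~73.24 uV per code

test_signal = -1e-3                # assumed -1 mV DC test signal
counts = test_signal / LSB         # ~ -13.65 counts

# Depending on rounding/noise, the output code lands on -13 or -14;
# masking to 16 bits gives the two's-complement hex readback.
for code in (-13, -14):
    print(f"{code} -> 0x{code & 0xFFFF:04X}")
# -13 -> 0xFFF3
# -14 -> 0xFFF2
```

This confirms the expected readback values of FFF3h/FFF2h under the stated assumptions.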
I assume the actual result read back could have a fairly large error, since the test signal is small compared to the ADC range; noise, offset, and gain error would all have a significant impact at this level. Before moving to an external input with a larger known amplitude, we just want to confirm that the math and expectations for the test signal are correct.