

ADS130E08: Internal Test Signal

Part Number: ADS130E08

Customer is developing driver SW to read data from the ADS130E08. They are trying to use the internal test signal as a quick check of the SW readback path and want to verify their understanding of what the results should look like.

With Vref = 2.4 V, one ADC count (LSB) = 2.4 V / (2^15 - 1) = 73.24 uV. PGA gain = 1. The test signal is set to 1 mV, DC output. From the test signal register description, the test signal is actually -1 mV. Is this correct?
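The LSB arithmetic above can be sanity-checked with a short sketch (assuming the ideal 16-bit bipolar transfer function with full-scale = Vref and PGA gain = 1; variable names are illustrative):

```python
# LSB size for a 16-bit bipolar ADC with Vref = 2.4 V, gain = 1.
# Positive full-scale corresponds to code 2^15 - 1 = 32767.
VREF = 2.4                      # reference voltage in volts (assumed)
LSB = VREF / (2**15 - 1)        # volts per count

print(f"LSB = {LSB * 1e6:.2f} uV")   # ~73.24 uV, matching the hand calculation
```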

With an LSB size of 73.24 uV, 1 mV = 13.65 counts. So if the test signal is -1 mV, the ADC result should be -13 or -14 decimal (depending on rounding), which is FFF3h or FFF2h in 16-bit two's complement. Is this correct?
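The expected-code calculation above can be sketched as follows (a minimal illustration assuming the ideal transfer function at gain 1; the helper names are hypothetical, not part of the TI driver):

```python
VREF = 2.4
LSB = VREF / (2**15 - 1)   # ~73.24 uV per count

def expected_code(vin_volts, rounding=round):
    """Expected 16-bit two's-complement output code for an input voltage."""
    counts = rounding(vin_volts / LSB)
    return counts & 0xFFFF  # wrap a negative count into 16-bit two's complement

def to_voltage(code16):
    """Decode a raw 16-bit code back to volts (sign-extend, then scale by LSB)."""
    signed = code16 - 0x10000 if code16 & 0x8000 else code16
    return signed * LSB

# -1 mV DC test signal: -1 mV / 73.24 uV = -13.65 counts
print(f"{expected_code(-1e-3, int):04X}")    # truncating toward zero: FFF3 (-13)
print(f"{expected_code(-1e-3, round):04X}")  # rounding to nearest:    FFF2 (-14)
print(f"{to_voltage(0xFFF3) * 1e6:.1f} uV")  # readback check: -13 counts
```

Decoding the readback with `to_voltage` gives a quick way to confirm the driver's sign extension is correct before moving on to larger external signals.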

I assume the actual result read back could have noticeable error, since the test signal is small compared to the ADC full-scale range; noise, offset, and gain error would all have a significant impact in this test. Before moving to an external input signal with a larger known amplitude, we just want to make sure the math and expectations for the test signal are correct.