Other Parts Discussed in Thread: ADS127L18
I hope whoever is reading this is doing well!
I am using an ADS127L18EVM-PDK Evaluation Module, set up following the steps in the user's guide (external DC source, 6 V at 0.5 A). The input is a single-ended signal, not differential, so the unused input pin (CH0_N) is grounded with a soldered wire. The measured THD was well off the datasheet value even after calibrating with offset and gain corrections.
Specifically, for the offset calibration I first calculated the LSB size as LSB = 2 × VREF / 2^24 (bipolar input range), then the offset in LSBs as offset error (V) / LSB. I rounded to a whole number, converted to binary, and wrote it to CH0_OFFSET0_REG. This brought the residual offset from microvolts down to nanovolts. Numerically: LSB = 2 × 2.5 V / 2^24 ≈ 0.298 µV; offset code = 8.75 µV / 0.298 µV ≈ 29, which in binary is 11101, written into the least-significant offset register.
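For reference, the offset arithmetic above as a short Python sketch (the 2 × VREF / 2^24 LSB assumes the bipolar input range; the register name is from the datasheet):

```python
# Offset-calibration arithmetic from the steps above.
# Assumption: bipolar input range, so 1 LSB = 2 * VREF / 2^24.
VREF = 2.5                       # reference voltage, V
lsb = 2 * VREF / 2**24           # ~0.298 uV per code
offset_error = 8.75e-6           # measured offset error, V
code = round(offset_error / lsb)
print(code, format(code, "b"))   # 29 11101 -> write to CH0_OFFSET0_REG
```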
For the gain calibration, only a 2.5 V reference was available, so to avoid clipping a 1.01 V signal from a signal generator was applied to the ungrounded pin. The expected output code = (1.01 V / 2.5 V) × 16777215 = 6777995, and the actual output code = (1.023068 V / 2.5 V) × 16777215 = 6865693. The correction value = (expected / actual) × 400000h = (6777995 / 6865693) × 400000h = 001111110010111010111001b, which was written across the three gain-correction registers for channel 0.
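The same gain arithmetic as a sketch, with 0x400000 as the unity-gain code:

```python
# Gain-correction arithmetic from the steps above; 0x400000 = gain of 1.
VREF = 2.5
FS_CODE = 2**24 - 1                        # 16777215
expected = round(1.01 / VREF * FS_CODE)    # 6777995
actual = round(1.023068 / VREF * FS_CODE)  # 6865693
gain = round(expected / actual * 0x400000)
print(hex(gain), format(gain, "024b"))
# -> 0x3f2eb9 001111110010111010111001
```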
After all these changes the harmonic distortion is still well above the datasheet specification, and harmonics of the fundamental frequency are very pronounced; they show up only on this ADC, not on our oscilloscope. The input signal was a 1 kHz sine wave, 2.44 Vpp, with a 0.625 V offset. If you could give any insight into whether the calibration could be introducing harmonics, or whether something else is going on, it would be much appreciated!
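To rule out the analysis software, I have been sanity-checking THD on captured sample blocks with a rough script like the one below (the function name, windowing choice, and peak-search tolerance are my own assumptions, not the EVM GUI's method):

```python
import numpy as np

def thd_percent(samples, fs, f0, n_harmonics=5):
    """Ratio of harmonic amplitude (2*f0 .. (n+1)*f0) to the fundamental, in percent."""
    n = len(samples)
    spec = np.abs(np.fft.rfft(samples * np.hanning(n)))  # windowed magnitude spectrum
    freqs = np.fft.rfftfreq(n, 1 / fs)

    def peak(f):
        i = int(np.argmin(np.abs(freqs - f)))
        return spec[max(i - 2, 0):i + 3].max()           # tolerate spectral leakage

    fund = peak(f0)
    harm = np.sqrt(sum(peak(k * f0) ** 2 for k in range(2, n_harmonics + 2)))
    return 100 * harm / fund

# A pure 1 kHz tone should give a THD near zero.
fs = 512_000
t = np.arange(8192) / fs
tone = np.sin(2 * np.pi * 1000 * t)
print(thd_percent(tone, fs, 1000))
```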