Other Parts Discussed in Thread: LMH3401, ADC12DJ5200RF
Hello everyone,
We're using the ADC12DJ2700 in a DC-coupled application in single-channel interleaved mode.
We're using foreground calibration mode with ADC-A and ADC-B cores. ADC-C is not used. The sampled input is INA.
In general, after performing a foreground calibration followed by an offset calibration, the device works as expected.
As part of our application, the signal source connected to INA may impose a DC offset close to one of the full-scale voltages of the ADC (but still within the valid input range).
With the input voltage at this point, we observe a difference of ~1-3 LSB between the resulting 'baselines' of ADC-A and ADC-B. This leads to a larger standard deviation of the interleaved data stream compared to the operating point at which the offset calibration was performed.
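
For reference, this is roughly how we evaluate the captured data (a minimal sketch; the even/odd sample-to-core mapping is an assumption and depends on the capture path, so it may not match the actual ordering):

```python
import numpy as np

def compare_core_baselines(samples):
    """Split an interleaved capture at a fixed DC input into the two core
    streams and compare their baselines. Assumes even-index samples come
    from one core and odd-index samples from the other (placeholder
    assumption; the real core-to-sample mapping depends on the setup)."""
    samples = np.asarray(samples, dtype=np.float64)
    core_a = samples[0::2]   # assumed ADC-A samples
    core_b = samples[1::2]   # assumed ADC-B samples

    baseline_a = core_a.mean()
    baseline_b = core_b.mean()

    print(f"ADC-A baseline: {baseline_a:.2f} LSB, std: {core_a.std(ddof=1):.2f} LSB")
    print(f"ADC-B baseline: {baseline_b:.2f} LSB, std: {core_b.std(ddof=1):.2f} LSB")
    print(f"Baseline difference: {baseline_a - baseline_b:.2f} LSB")
    print(f"Interleaved stream std: {samples.std(ddof=1):.2f} LSB")
    return baseline_a - baseline_b
```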
I suspect this might be caused by slightly different gain transfer functions of ADC-A and ADC-B.
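
To put rough numbers on that hypothesis (a back-of-the-envelope sketch; only the 12-bit resolution is from the datasheet, the mismatch values and the assumption that the offset calibration was done near mid-scale are illustrative): a relative gain mismatch between the cores shows up as a code difference of roughly mismatch × distance-from-calibration-point, so near full-scale a mismatch of 0.05 % to 0.15 % would already explain ~1-3 LSB.

```python
def baseline_diff_from_gain_mismatch(gain_mismatch, codes_from_cal_point=2048):
    """Rough estimate of the per-core baseline difference (in LSB) caused by a
    relative gain mismatch, assuming the cores were offset-matched at the
    calibration operating point. 2048 LSB corresponds to the distance from
    mid-scale to full-scale for a 12-bit core (illustrative assumption)."""
    return gain_mismatch * codes_from_cal_point

# Illustrative mismatch values only (not measured on the device):
for mismatch in (0.0005, 0.0010, 0.0015):   # 0.05 %, 0.10 %, 0.15 %
    print(f"gain mismatch {mismatch * 100:.2f} % -> "
          f"~{baseline_diff_from_gain_mismatch(mismatch):.1f} LSB near full-scale")
```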
The actual question is:
In the notes of Table 44 (section 'INA and INB gain'), the datasheet SLVSEH9A mentions core-specific gain adjustment options (named GAIN_B0 and GAIN_B1 as well as GAIN_B4 and GAIN_B5), but there are no further explanations or register bit details for these GAIN_Bx fields.
Do these fields exist? Is core-specific gain adjustment possible at all?
Best regards,
Thorsten

