Other Parts Discussed in Thread: CONTROLSUITE, LM4132, OPA320, OPA350, OPA365, LMC6482
Hi,
I am using the F28377D, silicon revision C, on my own custom PCB. It is configured to sample 10 analog inputs distributed across the three ADCs (4+3+3). Each analog input is fed to the controller pin through an RC filter (R = 47 Ω, C = 33 pF). For testing, I shorted all the inputs together and applied a variable voltage from a battery and a resistive divider network. The ADC clock is configured for 25 MHz and the acquisition window is set to 400 ns, in 12-bit single-ended mode. The following are my observations:
1. When all 10 inputs are sampled continuously (using the example software in controlSUITE), the average of the results droops as the input voltage is raised towards the 3.3 V full scale, and noise of around 150 counts about the average is observed. When a single input, or only the group of inputs assigned to one ADC, is sampled, the noise is reduced to about 20 counts.
2. When the groups of inputs assigned to two ADCs are sampled, the noise increases to 150 counts.
3. When the SOCs for all three ADCs are instead triggered by Timer1 at 16 kHz, the noise is reduced to about 20 counts, but, as in the first case, the average of the results droops as the input voltage is raised towards the 3.3 V full scale. When a single input is sampled, the average value is close to the actual input.
This behaviour does not improve when I change the ADC clock and/or the acquisition window.
Please help me understand what is happening, and suggest whether any further ADC configuration is needed.
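For reference, here is a stripped-down sketch of how one of my ADCs is set up, based on the ADC SOC example in controlSUITE. Register and function names come from the F2837xD device support headers; the 200 MHz SYSCLK, channel numbers, and trigger choice are assumptions matching my description above, not an exact copy of my project.

```c
// Setup for ADC-A (the other two ADCs are configured the same way).
// Assumes SYSCLK = 200 MHz, so PRESCALE = /8 gives ADCCLK = 25 MHz and
// ACQPS = 79 gives an acquisition window of (79 + 1) * 5 ns = 400 ns.
#include "F28x_Project.h"   // F2837xD device support header (controlSUITE)

void ConfigureAdcA(void)
{
    EALLOW;
    AdcaRegs.ADCCTL2.bit.PRESCALE = 14;   // ADCCLK = SYSCLK / 8 = 25 MHz
    AdcSetMode(ADC_ADCA, ADC_RESOLUTION_12BIT, ADC_SIGNALMODE_SINGLE);
    AdcaRegs.ADCCTL1.bit.INTPULSEPOS = 1; // interrupt at end of conversion
    AdcaRegs.ADCCTL1.bit.ADCPWDNZ = 1;    // power up the ADC core
    DELAY_US(1000);                       // 1 ms power-up delay

    // One SOC per input; SOC1..SOC3 follow the same pattern for ADCINA1..A3.
    AdcaRegs.ADCSOC0CTL.bit.CHSEL   = 0;  // convert ADCINA0
    AdcaRegs.ADCSOC0CTL.bit.ACQPS   = 79; // 400 ns acquisition window
    AdcaRegs.ADCSOC0CTL.bit.TRIGSEL = 2;  // CPU1 Timer1 (case 3); 0 = SW trigger
    EDIS;
}
```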
