I am using a TMS320F2811 to sample two analog signals, on ADCINA0 and ADCINB0. The ADC is configured as follows:
int ConfigADC()
{
    // Reset the ADC module to its default state
    AdcRegs.ADCTRL1.bit.RESET = 1;
    //SysCtrlRegs.PCLKCR.bit.ADCENCLK = 1;
    DELAY_US(10);                          // Short delay after reset

    // Configure ADC
    AdcRegs.ADCTRL1.all = 0;               // Clear control register: dual-sequencer mode (SEQ_CASC = 0)
    AdcRegs.ADCTRL3.bit.SMODE_SEL = 0x1;   // Simultaneous sampling mode
    AdcRegs.ADCMAXCONV.all = 0x0000;       // One conversion (pair) per sequence; will be increased later
    AdcRegs.ADCCHSELSEQ1.bit.CONV00 = 0;   // CONV00 samples ADCINA0 and ADCINB0 simultaneously
    AdcRegs.ADCTRL1.bit.CONT_RUN = 1;      // Continuous run
    AdcRegs.ADCTRL1.bit.CPS = 0;           // Core clock prescaler: divide by 1
    AdcRegs.ADCTRL3.bit.ADCCLKPS = 1;      // ADC clock prescaler; I change this to vary the sample rate
    AdcRegs.ADCTRL1.bit.ACQ_PS = 0;        // Minimum acquisition window (SOC pulse width = 1 ADC clock)

    // Power up the ADC
    AdcRegs.ADCTRL3.bit.ADCBGRFDN = 0x3;   // Power up bandgap/reference circuitry
    DELAY_US(5000);                        // Wait at least 5 ms before powering up the rest of the ADC
    AdcRegs.ADCTRL3.bit.ADCPWDN = 1;       // Power up the rest of the ADC
    DELAY_US(10000);                       // Delay before the first conversion
    return OK;
}
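For reference, this is roughly how I trigger the sequencer and read the two results (a simplified sketch; my actual buffering code is omitted). On the F281x the result registers are left-justified, so I shift right by 4:

```c
// Software-trigger SEQ1 once; with CONT_RUN = 1 it then free-runs.
AdcRegs.ADCTRL2.bit.SOC_SEQ1 = 1;

// Wait for SEQ1 to finish a conversion pair.
while (AdcRegs.ADCST.bit.INT_SEQ1 == 0)
{
}
AdcRegs.ADCST.bit.INT_SEQ1_CLR = 1;   // Clear the SEQ1 interrupt flag

// In simultaneous mode with CONV00 = 0, RESULT0 holds ADCINA0
// and RESULT1 holds ADCINB0; both are left-justified 12-bit values.
Uint16 sampleA = AdcRegs.ADCRESULT0 >> 4;
Uint16 sampleB = AdcRegs.ADCRESULT1 >> 4;
```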
When I change the ADC clock prescaler, AdcRegs.ADCTRL3.bit.ADCCLKPS, to different values, I observe that the average of the sampled signal for each channel changes as well.
The average is calculated as the sum of all samples divided by the number of samples for each channel. When the sampling rate is higher, the average is larger; when the sampling rate is lower, the average is smaller.
I have some knowledge of ADC sampling, but I don't understand why the average changes with the sampling rate.
If you need more information, please let me know.
Thanks,