Hello everyone,
I am working on RF signal processing using a Delfino TMS320F28335 EVM. Input signals from a VI sensor are fed to an ADC input pin. For signals below 0.3 V the ADC output matches the input, but once the signal goes above 0.3 V the output diverges significantly from the input. I have tried different sampling rates, but the problem remains the same. I have also tried oversampling and averaging.
My configuration is as follows:
void InitAdcs(void) // called from the main function
{
    AdcRegs.ADCTRL1.bit.RESET = 1;          // reset the ADC module
    asm(" RPT #22 || NOP");                 // wait the required cycles after reset
    InitAdc();                              // calibration routine from the TI support files
    AdcRegs.ADCTRL3.all = 0x00E3;           // band-gap/reference and ADC powered up, ADCCLKPS = 1
    DELAY_US(ADC_usDELAY);                  // let the ADC stabilize after power-up
    AdcRegs.ADCMAXCONV.all = 0x0003;        // 4 conversions per sequence
    AdcRegs.ADCTRL3.bit.SMODE_SEL = 1;      // simultaneous sampling mode
    AdcRegs.ADCCHSELSEQ1.bit.CONV00 = 0x4;  // ADCINA4 and ADCINB4
    AdcRegs.ADCCHSELSEQ1.bit.CONV01 = 0x5;  // ADCINA5 and ADCINB5
    AdcRegs.ADCCHSELSEQ1.bit.CONV02 = 0x6;  // ADCINA6 and ADCINB6
    AdcRegs.ADCCHSELSEQ1.bit.CONV03 = 0x7;  // ADCINA7 and ADCINB7
    AdcRegs.ADCTRL1.all = 0x0F50;           // ACQ_PS = 15, continuous run, cascaded sequencer
    AdcRegs.ADCTRL2.all = 0x2000;           // software start of SEQ1 (SOC_SEQ1 = 1)
}
void Adc_Init(void) // called from the main function
{
    AdcRegs.ADCTRL2.bit.SOC_SEQ1 = 1;       // software start of conversion for SEQ1
    AdcRegs.ADCTRL2.bit.INT_ENA_SEQ1 = 1;   // enable SEQ1 interrupt (every EOS)
    Adc_Start();
}
void Adc_Start(void)
{
    adc_EndConv = 0;
    adc_count = 0;
    adc_avg = ADCAVG;                       // ADCAVG = 64
    Temp_Voltage1 = 0;                      // clear the accumulator that the ISR sums into
    AdcRegs.ADCST.bit.INT_SEQ1_CLR = 1;     // clear any pending SEQ1 interrupt flag
}
void GetADCData(void) // called from the main function
{
    Voltage1 = (Temp_Voltage1 >> 6);        // divide the 64-sample sum by 64
}
interrupt void ADCint_ISR(void)
{
    if (adc_count < adc_avg) {              // accumulate exactly 64 samples
        Temp_Voltage1 += (AdcRegs.ADCRESULT0 >> 4); // 12-bit result is left-justified in the 16-bit register
        adc_count++;
    }
    AdcRegs.ADCST.bit.INT_SEQ1_CLR = 1;     // clear the SEQ1 interrupt flag
    PieCtrlRegs.PIEACK.all = PIEACK_GROUP1; // acknowledge PIE group 1 so the next SEQ1 interrupt is taken
}
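For completeness, the intended call flow, based on the "called from the main function" comments above, is roughly this (a minimal sketch, not my full main):

void main(void)
{
    InitAdcs();                    // configure the ADC as shown above
    Adc_Init();                    // start SEQ1 and enable its interrupt

    for (;;) {
        if (adc_count >= ADCAVG) { // the ISR has accumulated all 64 samples
            GetADCData();          // Voltage1 now holds the averaged reading
            Adc_Start();           // reset and begin the next averaging cycle
        }
    }
}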
The reading of Voltage1 for an input of 0.412 V is 324 (~0.23 V); for an input of 0.566 V it is 409 (~0.3 V). As the input voltage increases, the error grows proportionally.
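For reference, assuming the standard 0 to 3.0 V input range of the on-chip 12-bit ADC, the expected counts would be:

expected count = Vin x 4096 / 3.0
0.412 V -> ~563 counts expected, but I read 324 (324 x 3.0 / 4096 = ~0.237 V)
0.566 V -> ~773 counts expected, but I read 409 (409 x 3.0 / 4096 = ~0.300 V)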
Any help would be kindly appreciated.