Hello,
I want a custom system timer ticking before I run any initializations in my code, so that I can time out of wait loops instead of stalling the processor, for safety purposes. Because the ADC routine waits for several flags to settle during initialization, I want this timer running before the ADC itself is initialized.
I've configured CPU Timer 1 as a tick timer with a 1 us period, and its interrupt service routine fires on every tick. After enabling the timer interrupt I attempt to initialize the ADC, but the AdcRegs.ADCINTFLG.bit.ADCINT1 flag never sets to 1, so my ADC initialization routine gets stuck in its wait loop (i.e. while (AdcRegs.ADCINTFLG.bit.ADCINT1 == 0);).
My observations:
The exact place where the ADC gets stuck is the offset calibration. This is definitely caused by the CPU Timer interrupt, because when I disable the timer the AdcRegs.ADCINTFLG.bit.ADCINT1 flag sets to 1.
However, when I add an exit condition to that loop, the ADC appears to initialize without a problem. After finishing the ADC setup I start conversions, with my timer still interrupting in the background, and I can read the voltage value without getting stuck in the loop while (AdcRegs.ADCINTFLG.bit.ADCINT1 == 0);.
What's happening here? Why do I get stuck in that same loop during offset calibration but not during normal conversions? The timer runs in the background either way.
Additionally, I've noticed that when the timer is off and I hit a breakpoint inside the adc_calibration() function, the same behavior occurs. This makes me think that the CPU Timer interrupt routine adds enough delay during calibration to disrupt the initialization. Is there a workaround for this?
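A minimal sketch of the kind of bounded wait described above, assuming the timer ISR maintains a free-running 1 us tick counter. The names g_tick_1us and wait_adcint1_with_timeout are illustrative, not from the project; the counter itself is sketched together with main() below.

extern volatile Uint32 g_tick_1us;   // free-running 1 us tick, incremented by the timer ISR (sketched below)

//
// Returns 1 if ADCINT1 set within timeout_us, 0 on timeout.
// Unsigned subtraction keeps the comparison correct across counter wraparound.
//
static Uint16 wait_adcint1_with_timeout(Uint32 timeout_us)
{
    Uint32 start = g_tick_1us;

    while (AdcRegs.ADCINTFLG.bit.ADCINT1 == 0)
    {
        if ((Uint32)(g_tick_1us - start) >= timeout_us)
        {
            return 0;   // timed out; the caller decides how to fail safely
        }
    }
    return 1;
}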
My main function:
int main(void)
{
    memcpy(&RamfuncsRunStart, &RamfuncsLoadStart, (Uint16)&RamfuncsLoadSize);
    InitFlash();
    InitSysCtrl();

    DINT;
    InitPieCtrl();
    IER = 0x0000;
    IFR = 0x0000;
    InitPieVectTable();

    EALLOW;
    PieVectTable.TINT1 = &cpu_timer1_isr;
    EDIS;

    timer_service_init();

    IER |= M_INT13;     // Enable CPU interrupt 13 (TINT1, CPU Timer 1)
    EINT;               // Enable global interrupt INTM
    ERTM;               // Enable global realtime interrupt DBGM

    adc_init();

    for(;;)
    {
    }
}
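timer_service_init() and cpu_timer1_isr() are not shown in the post. A minimal sketch of what they might look like, assuming the standard C2000 device-support CPU timer helpers and a 90 MHz SYSCLKOUT (both assumptions):

volatile Uint32 g_tick_1us = 0;         // free-running tick counter (hypothetical name)

void timer_service_init(void)
{
    InitCpuTimers();                    // device-support helper: stops and clears the CPU timers
    ConfigCpuTimer(&CpuTimer1, 90, 1);  // 90 MHz SYSCLKOUT (assumed), 1 us period
    CpuTimer1Regs.TCR.bit.TIE = 1;      // enable the timer interrupt
    CpuTimer1Regs.TCR.bit.TSS = 0;      // start the timer
}

__interrupt void cpu_timer1_isr(void)
{
    g_tick_1us++;   // TINT1 is wired directly to CPU INT13, so no PIE acknowledge is needed
}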
The ADC gets stuck in the offset calibration routine:
static Uint16 adc_conversion(void)
{
    Uint16 index, SampleSize, Mean, ACQPS_Value;
    Uint16 i;
    Uint32 Sum;

    index = 0;          // initialize index to 0
    SampleSize = 256;   // (**NOTE: Sample size must be a multiple of 2^x, where x is an integer >= 4)
    Sum = 0;            // set sum to 0
    Mean = 999;         // initialize mean to known value

    //
    // Set the ADC sample window to the desired value
    // (Sample window = ACQPS + 1)
    //
    ACQPS_Value = 6;
    AdcRegs.ADCSOC0CTL.bit.ACQPS  = ACQPS_Value;
    AdcRegs.ADCSOC1CTL.bit.ACQPS  = ACQPS_Value;
    AdcRegs.ADCSOC2CTL.bit.ACQPS  = ACQPS_Value;
    AdcRegs.ADCSOC3CTL.bit.ACQPS  = ACQPS_Value;
    AdcRegs.ADCSOC4CTL.bit.ACQPS  = ACQPS_Value;
    AdcRegs.ADCSOC5CTL.bit.ACQPS  = ACQPS_Value;
    AdcRegs.ADCSOC6CTL.bit.ACQPS  = ACQPS_Value;
    AdcRegs.ADCSOC7CTL.bit.ACQPS  = ACQPS_Value;
    AdcRegs.ADCSOC8CTL.bit.ACQPS  = ACQPS_Value;
    AdcRegs.ADCSOC9CTL.bit.ACQPS  = ACQPS_Value;
    AdcRegs.ADCSOC10CTL.bit.ACQPS = ACQPS_Value;
    AdcRegs.ADCSOC11CTL.bit.ACQPS = ACQPS_Value;
    AdcRegs.ADCSOC12CTL.bit.ACQPS = ACQPS_Value;
    AdcRegs.ADCSOC13CTL.bit.ACQPS = ACQPS_Value;
    AdcRegs.ADCSOC14CTL.bit.ACQPS = ACQPS_Value;
    AdcRegs.ADCSOC15CTL.bit.ACQPS = ACQPS_Value;

    //
    // Enable ping-pong sampling
    //
    AdcRegs.INTSEL1N2.bit.INT1E = 1;     // Enable ADCINT1
    AdcRegs.INTSEL1N2.bit.INT2E = 1;     // Enable ADCINT2

    // Disable continuous sampling for ADCINT1 & ADCINT2
    AdcRegs.INTSEL1N2.bit.INT1CONT = 0;
    AdcRegs.INTSEL1N2.bit.INT2CONT = 0;

    // ADCINTs trigger at end of conversion
    AdcRegs.ADCCTL1.bit.INTPULSEPOS = 1;

    // Setup ADCINT1 and ADCINT2 trigger sources
    AdcRegs.INTSEL1N2.bit.INT1SEL = 6;   // EOC6 triggers ADCINT1
    AdcRegs.INTSEL1N2.bit.INT2SEL = 14;  // EOC14 triggers ADCINT2

    // Setup each SOC's ADCINT trigger source
    AdcRegs.ADCINTSOCSEL1.bit.SOC0  = 2; // ADCINT2 starts SOC0-7
    AdcRegs.ADCINTSOCSEL1.bit.SOC1  = 2;
    AdcRegs.ADCINTSOCSEL1.bit.SOC2  = 2;
    AdcRegs.ADCINTSOCSEL1.bit.SOC3  = 2;
    AdcRegs.ADCINTSOCSEL1.bit.SOC4  = 2;
    AdcRegs.ADCINTSOCSEL1.bit.SOC5  = 2;
    AdcRegs.ADCINTSOCSEL1.bit.SOC6  = 2;
    AdcRegs.ADCINTSOCSEL1.bit.SOC7  = 2;
    AdcRegs.ADCINTSOCSEL2.bit.SOC8  = 1; // ADCINT1 starts SOC8-15
    AdcRegs.ADCINTSOCSEL2.bit.SOC9  = 1;
    AdcRegs.ADCINTSOCSEL2.bit.SOC10 = 1;
    AdcRegs.ADCINTSOCSEL2.bit.SOC11 = 1;
    AdcRegs.ADCINTSOCSEL2.bit.SOC12 = 1;
    AdcRegs.ADCINTSOCSEL2.bit.SOC13 = 1;
    AdcRegs.ADCINTSOCSEL2.bit.SOC14 = 1;
    AdcRegs.ADCINTSOCSEL2.bit.SOC15 = 1;

    for (i = 0; i < 5000; i++)           // delay
    {
    }

    // Force-start SOC0-7 to begin ping-pong sampling
    AdcRegs.ADCSOCFRC1.all = 0x00FF;

    for (index = 0; index < SampleSize; index += 16)
    {
        // Wait for ADCINT1 to trigger, then add ADCRESULT0-7 to the sum
        while (AdcRegs.ADCINTFLG.bit.ADCINT1 == 0)
        {
        }
        AdcRegs.ADCINTFLGCLR.bit.ADCINT1 = 1;   // Must clear ADCINT1 flag since INT1CONT = 0

        Sum += AdcResult.ADCRESULT0;
        Sum += AdcResult.ADCRESULT1;
        Sum += AdcResult.ADCRESULT2;
        Sum += AdcResult.ADCRESULT3;
        Sum += AdcResult.ADCRESULT4;
        Sum += AdcResult.ADCRESULT5;
        Sum += AdcResult.ADCRESULT6;

        //
        // Wait for SOC9 conversion to start, which gives time for the SOC7
        // conversion result to latch
        //
        while (AdcRegs.ADCSOCFLG1.bit.SOC9 == 1)
        {
        }
        Sum += AdcResult.ADCRESULT7;

        //
        // Wait for ADCINT2 to trigger, then add ADCRESULT8-15 to the sum
        //
        while (AdcRegs.ADCINTFLG.bit.ADCINT2 == 0)
        {
        }
        AdcRegs.ADCINTFLGCLR.bit.ADCINT2 = 1;   // Must clear ADCINT2 flag since INT2CONT = 0

        Sum += AdcResult.ADCRESULT8;
        Sum += AdcResult.ADCRESULT9;
        Sum += AdcResult.ADCRESULT10;
        Sum += AdcResult.ADCRESULT11;
        Sum += AdcResult.ADCRESULT12;
        Sum += AdcResult.ADCRESULT13;
        Sum += AdcResult.ADCRESULT14;

        //
        // Wait for SOC1 conversion to start, which gives time for the SOC15
        // conversion result to latch
        //
        while (AdcRegs.ADCSOCFLG1.bit.SOC1 == 1)
        {
        }
        Sum += AdcResult.ADCRESULT15;
    }

    //
    // Disable ADCINT1 and ADCINT2 to stop the ping-pong sampling
    //
    AdcRegs.INTSEL1N2.bit.INT1E = 0;
    AdcRegs.INTSEL1N2.bit.INT2E = 0;

    // Wait for any pending SOCs to complete
    while (AdcRegs.ADCSOCFLG1.all != 0)
    {
    }

    // Clear any pending interrupts
    AdcRegs.ADCINTFLGCLR.bit.ADCINT1 = 1;
    AdcRegs.ADCINTFLGCLR.bit.ADCINT2 = 1;
    AdcRegs.ADCINTOVFCLR.bit.ADCINT1 = 1;
    AdcRegs.ADCINTOVFCLR.bit.ADCINT2 = 1;

    //
    // Reset the round-robin pointer to 32, so that the next SOC is SOC0
    //
    AdcRegs.SOCPRICTL.bit.SOCPRIORITY = 1;
    while (AdcRegs.SOCPRICTL.bit.SOCPRIORITY != 1)
    {
    }
    AdcRegs.SOCPRICTL.bit.SOCPRIORITY = 0;
    while (AdcRegs.SOCPRICTL.bit.SOCPRIORITY != 0)
    {
    }

    if (0 != SampleSize)
    {
        Mean = Sum / SampleSize;    // Calculate average ADC sample value
    }
    else
    {
        Mean = 0;
    }

    return Mean;                    // return the average
}
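For context, this routine closely resembles the AdcConversion() helper in TI's C2000 device-support files, where it is called from an offset self-calibration wrapper. The adc_calibration() function mentioned above was not posted; the sketch below is a hypothetical reconstruction modeled on TI's AdcOffsetSelfCal() for an F2806x-class device (channel routing omitted), showing where the blocking conversion loop sits:

// Hypothetical reconstruction; the original adc_calibration() was not posted.
static void adc_calibration(void)
{
    Uint16 mean;

    EALLOW;
    AdcRegs.ADCCTL1.bit.VREFLOCONV = 1;   // route VREFLO to the sampled input internally
    AdcRegs.ADCOFFTRIM.bit.OFFTRIM = 80;  // bias the trim so the averaged result cannot clip at 0
    EDIS;

    mean = adc_conversion();              // average many conversions of VREFLO
                                          // (this is the call that stalls in the post)
    EALLOW;
    AdcRegs.ADCOFFTRIM.bit.OFFTRIM = 80 - mean;   // subtract the measured offset
    AdcRegs.ADCCTL1.bit.VREFLOCONV = 0;   // restore normal input routing
    EDIS;
}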
ADC Conversion function:
float get_28_voltage(void)
{
    float voltage;
    Uint16 adc_div = 4096;

    AdcRegs.ADCSOCFRC1.all = 0x01;          // Force-start SOC0

    while (AdcRegs.ADCINTFLG.bit.ADCINT1 == 0)
    {
    }
    AdcRegs.ADCINTFLGCLR.bit.ADCINT1 = 1;   // Clear ADCINT1

    if (0 != adc_div)
    {
        voltage = 3.3 * (AdcResult.ADCRESULT0) / adc_div;
    }
    else
    {
        voltage = 0;
    }

    return voltage;
}
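If the same safety timeout is wanted during normal reads, the blocking wait can be bounded with the hypothetical wait_adcint1_with_timeout() helper sketched earlier. The function name and the 100 us budget below are placeholders, not from the original post:

float get_28_voltage_safe(void)
{
    Uint16 adc_div = 4096;

    AdcRegs.ADCSOCFRC1.all = 0x01;              // force-start SOC0

    if (wait_adcint1_with_timeout(100) == 0)    // 100 us budget (assumed)
    {
        return 0.0f;                            // timed out; report a safe default
    }
    AdcRegs.ADCINTFLGCLR.bit.ADCINT1 = 1;       // clear ADCINT1

    return 3.3 * AdcResult.ADCRESULT0 / adc_div;
}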
Hi Valeri,
I will look into this and get back to you with a response tomorrow.
Best Regards,
Delaney
Edit: Issue resolved. Setting the ADC sample window to a higher value appears to fix the issue. Interestingly, after this change I observed that the ADC initializes without a problem even with the timeout conditions removed from the while loops.
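In code terms, the fix amounts to raising ACQPS in each SOC control register. The value that resolved it was not stated; the 25 below is purely illustrative (the minimum legal window and a suitable margin depend on the device datasheet and the source impedance):

ACQPS_Value = 25;                             // 26-cycle sample window instead of the original 7
AdcRegs.ADCSOC0CTL.bit.ACQPS = ACQPS_Value;   // ...and likewise for SOC1 through SOC15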
Hi Valeri,
Glad to hear the issue was resolved. I will close this thread, but feel free to make another if you have any other questions or issues.
Best Regards,
Delaney