TMS320F28069F: Why does the CPU Timer interrupt cause the ADCINT flag to get stuck?

Part Number: TMS320F28069F

Hello,

I want a custom system timer ticking before I run any initializations in my code, so that I can time out of loops instead of halting the processor, as a safety measure. Because the ADC routine waits for some flags to settle during initialization, I want this timer running before the ADC itself is initialized.

I've configured CPU Timer 1 as a tick timer that interrupts every 1 us, with an ISR attached to it. After enabling the CPU Timer interrupt I attempt to initialize the ADC; however, the AdcRegs.ADCINTFLG.bit.ADCINT1 interrupt flag never sets to 1, so my ADC initialization routine gets stuck in "while (AdcRegs.ADCINTFLG.bit.ADCINT1 == 0);".

My observation:

The exact part of the code where the ADC gets stuck is the offset calibration. This is definitely caused by the CPU Timer interrupt, because when I disable the timer the AdcRegs.ADCINTFLG.bit.ADCINT1 flag does set to 1.

However, when I add an exit condition to that loop, the ADC appears to initialize without a problem. After finishing the ADC setup I start conversions, with my timer still interrupting in the background, and I'm able to read the voltage value without getting stuck in the loop "while (AdcRegs.ADCINTFLG.bit.ADCINT1 == 0);".

What's happening here? Why do I get stuck in that same loop during offset calibration but not during normal conversions? The timer still runs in the background either way.

Additionally, I've noticed that when the timer is off and I hit a breakpoint inside the adc_calibration() function, the same behavior occurs. This makes me think the CPU Timer ISR adds a significant amount of delay during calibration, which disrupts the initialization. Is there a workaround for this?

My main function:

int main(void)
{
    memcpy(&RamfuncsRunStart, &RamfuncsLoadStart, (Uint16)&RamfuncsLoadSize);
    InitFlash();
    InitSysCtrl();

    DINT;
    InitPieCtrl();
    IER = 0x0000;
    IFR = 0x0000;
    InitPieVectTable();

    EALLOW;
    PieVectTable.TINT1 = &cpu_timer1_isr;
    EDIS;

    timer_service_init();

    IER |= M_INT13;  // Enable CPU interrupt 13 (TINT1, CPU Timer 1)
    EINT;            // Enable global interrupts (clear INTM)
    ERTM;            // Enable real-time debug interrupts (clear DBGM)
    ...

The ADC gets stuck in the offset calibration routine:

static Uint16 adc_conversion(void)
{
    Uint16 index, SampleSize, Mean, ACQPS_Value;
    Uint32 Sum;

    index = 0;         // initialize index to 0
    SampleSize = 256;  // (**NOTE: sample size must be a multiple of 2^x, where x is an integer >= 4)
    Sum = 0;           // set sum to 0
    Mean = 999;        // initialize mean to a known value

    //
    // Set the ADC sample window to the desired value
    // (Sample window = ACQPS + 1)
    //
    ACQPS_Value = 6;
    AdcRegs.ADCSOC0CTL.bit.ACQPS = ACQPS_Value;
    AdcRegs.ADCSOC1CTL.bit.ACQPS = ACQPS_Value;
    AdcRegs.ADCSOC2CTL.bit.ACQPS = ACQPS_Value;
    AdcRegs.ADCSOC3CTL.bit.ACQPS = ACQPS_Value;
    AdcRegs.ADCSOC4CTL.bit.ACQPS = ACQPS_Value;
    AdcRegs.ADCSOC5CTL.bit.ACQPS = ACQPS_Value;
    ...

ADC Conversion function:

float get_28_voltage(void)
{
    float voltage;
    Uint16 adc_div = 4096;

    AdcRegs.ADCSOCFRC1.all = 0x01;          // force SOC0 by software
    while (AdcRegs.ADCINTFLG.bit.ADCINT1 == 0)
    {
    }
    AdcRegs.ADCINTFLGCLR.bit.ADCINT1 = 1;   // clear ADCINT1

    if (0 != adc_div)
    {
        voltage = 3.3 * (AdcResult.ADCRESULT0) / adc_div;
    }
    else
    {
        voltage = 0;
    }
    ...