
Issue with triggering 12-bit ADC of MSP430F5529

Other Parts Discussed in Thread: MSP430F5529

Hello All,

I want to trigger the 12-bit ADC (repeat-single-channel mode) using a timer of the MSP430F5529. The timer is programmed to generate an interrupt every 1.9 ms (this part works fine, verified by toggling the red LED and checking with an oscilloscope) and should start the ADC conversion in its ISR. Once the conversion is done, the ADC ISR should toggle the green LED. In debug mode I can see that execution never enters the ADC ISR, and I am not able to locate the error. The program works fine without the timer interrupt when the ADC is configured accordingly. It would be great if someone could pinpoint the mistake.

void setup()
{
    volatile unsigned int i;

    pinMode(P1_0, OUTPUT);                  // Red LED
    pinMode(P8_1, OUTPUT);                  // Green LED

    WDTCTL = WDTPW + WDTHOLD;               // Stop WDT

    // ADC initialization
    P6SEL |= 0x01;                          // Enable A/D channel A0 (P6.0)
    REFCTL0 &= ~REFMSTR;                    // Reset REFMSTR to hand over control to ADC12_A ref control registers
    ADC12CTL0 = ADC12ON + ADC12REFON + ADC12SHT0_4 + ADC12REF2_5V; // ADC on, 2.5 V ref, 64-cycle sample time

    ADC12CTL1 = ADC12SHP + ADC12CONSEQ_2 + ADC12SSEL_1; // Trigger from timer, repeat-single-channel, ACLK for ADC
    ADC12CTL1 = ADC12SHS_1;                 // Timer as sample-and-hold source

    ADC12IE = 0x01;                         // Enable ADC12MEM0 interrupt
    for (i = 0; i < 0x30; i++);             // Delay for reference generator start-up to settle

    // Timer interrupt every 1.9 ms
    TA0CCTL0 = CCIE;                        // CCR0 interrupt enabled
    TA0CCR0 = 63;                           // Interrupt every 1.9 ms
    TA0CTL = TASSEL_1 + MC_1 + TACLR;       // ACLK, up mode, clear TAR

    __enable_interrupt();                   // Enable interrupts
    __bis_SR_register(LPM0_bits + GIE);
}

void loop() { }


#pragma vector=TIMER0_A0_VECTOR
__interrupt void TIMER0_A0_ISR(void)
{
    ADC12CTL0 |= ADC12ENC;                  // Start ADC every 1.9 ms
    P1OUT ^= 0x01;                          // Toggle P1.0 (red LED) using exclusive-OR
}


volatile unsigned int results;

#pragma vector=ADC12_VECTOR
__interrupt void ADC12ISR(void)
{
    ADC12CTL0 &= ~ADC12ENC;
    results = ADC12MEM0 & 0x0FFF;           // Lower 12-bit result
    digitalWrite(P8_1, !digitalRead(P8_1)); // Toggle green LED
}

  • It’s not the interrupt that triggers the ADC; the interrupt is for the CPU only.
    You need to select an output mode other than OUTMOD_0 on the capture/compare unit, as when generating a PWM output. The edge of this output signal triggers the ADC (and if ADC12SHP is not set, its duty cycle will even control the sampling time).

  • Thanks for the reply. I have included the following lines and now it’s working fine:

        TA0CCTL0 = CCIE;                    // Compare mode, interrupt enabled
        TA0CCR0 = 63;                       // Sampling period
        TA0CCTL1 = OUTMOD_7;                // TACCR1 output mode
        TA0CCR1 = 1;                        // TACCR1 compare value (OUT1)
        TA0CTL = TASSEL_1 + MC_1 + TACLR;   // ACLK, up mode, clear TAR

    But I am still confused about the selection of OUTMOD and TA0CCR1. As far as I know, I selected mode 7, in which the output is reset when the count reaches TA0CCR1 and set otherwise. Now with TA0CCR1 = 1 I can see the LED toggling every 1.9 ms, and with TA0CCR1 = 62 the LED toggles every 3.9 ms. Does this mean the ADC got triggered every 1.9 ms and 3.9 ms respectively? Please confirm. I might sound insane, but it would help me understand. Another thing: if the TA0CCR1 count alone decides the ADC trigger time, does the count in TA0CCR0 also matter?

  • OUTMOD_7 is reset/set mode. That means the OUT signal is reset when TAR counts to TACCR1 and set when TAR counts to 0 (but not if you force TAR to 0 with TACLR or TAR = 0).
    The ADC is triggered when OUT gets set (so when TAR counts to 0), provided it was clear before (the trigger is edge-sensitive).
    When you keep TACCR0 constant and the timer in up mode, the LED toggle frequency shouldn’t change; only the moment relative to the moment TAR rolls over to 0 will change.
    However, when you reduce the value in TACCR1, you might miss a trigger (the old trigger point hasn’t been reached yet, but the new one has already been passed, so a complete cycle is added before the interrupt fires).
    It’s also possible that for certain values other ISRs will interfere (e.g. the ADC ISR), so you might see a timing different from what you expect.
    E.g. if the ADC is triggered on TAR = 0 due to OUTMOD_7, it might be done and have called the ADC ISR by the time TAR reaches TACCR1. The TACCR1 ISR call is then delayed until the ADC ISR is done, yet the OUT signal still changed at the right moment. So if you attach your LED to the TA0.1 port pin, you’ll see the ‘real’ signal and not what your software-toggled LED in the timer ISR shows.

    If you don’t set the ADC12SHP bit, the ADC will start sampling when the OUT bit gets set, but end sampling and start the conversion when the OUT bit is cleared. If the delay between clear and set is too small (so the conversion is not done yet), the ADC will ignore the trigger and skip a cycle. That’s probably what you observe with the LED toggle in the ADC ISR.

    Also, I just noticed that your code assigns a value to ADC12CTL1 twice. First you set ADC12SHP, ADC12CONSEQ_2 and ADC12SSEL_1, then you overwrite them by assigning ADC12SHS_1 (clearing ADC12SHP, selecting single-conversion mode and switching the ADC clock back to MODOSC). So this fits my explanation above.
    Note that the ADC12 on most MSPs requires a certain minimum and maximum clock speed, so you can’t run the ADC from ACLK when ACLK runs on the default 32 kHz REFO.
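Following the point about the double assignment, a minimal sketch of the corrected initialization (MSP430 register names from msp430.h; the comment on the clock source reflects the caveat above, not a tested configuration):

```c
/* Combine all ADC12CTL1 flags in ONE assignment so that ADC12SHP,
 * ADC12CONSEQ_2 and the clock selection are not wiped out afterwards. */
ADC12CTL1 = ADC12SHS_1      /* sample trigger: Timer_A0 CCR1 output        */
          + ADC12SHP        /* pulse-mode sampling (SAMPCON from sampling timer) */
          + ADC12CONSEQ_2   /* repeat-single-channel mode                  */
          + ADC12SSEL_1;    /* ADC12CLK = ACLK (mind the minimum ADC12CLK spec) */
```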
