
MSP430FR6989: Timer Using DriverLib Module

Part Number: MSP430FR6989



I have implemented a timer that counts down milliseconds using the Timer_A module, using the DriverLib examples as well as posts on the TI forums as reference. When I tested it with a stopwatch, with the MSP430FR6989 in debug mode, the times seemed accurate. However, I realized that I needed a more accurate method of testing the timer, so I tested it by counting CPU cycles in CCS during debug mode. This showed that the timer was off by a factor of 20, assuming the CPU frequency was 16 MHz.

The only way I can see my timer implementation being correct is if the CPU frequency is 800 kHz, which I am not sure about after reading the MSP430FR6989's datasheet.

Please let me know if I have overlooked or misunderstood something! Code snippet is below. Thank you!

#include "driverlib.h"   // MSP430 DriverLib

volatile unsigned long millis = 0;   // incremented once per Timer_A0 CCR0 interrupt

void timer_init()
{
    Timer_A_clearTimerInterrupt(TIMER_A0_BASE);

    // TimerA0 UpMode Configuration Parameter
    Timer_A_initUpModeParam param =
    {
        TIMER_A_CLOCKSOURCE_ACLK,               // ACLK clock source, ~32.768 kHz if LFXT is running
        TIMER_A_CLOCKSOURCE_DIVIDER_1,          // ACLK/1 = ~32.768 kHz
        32,                                     // CCR0 period: counts 0..32, i.e. 33 ACLK ticks ~ 1 ms
        TIMER_A_TAIE_INTERRUPT_DISABLE,         // Disable Timer interrupt
        TIMER_A_CCIE_CCR0_INTERRUPT_ENABLE ,    // Enable CCR0 interrupt
        TIMER_A_DO_CLEAR,                       // Clear value
        true                                    // Start Timer
    };

    Timer_A_initUpMode(TIMER_A0_BASE, &param);
}


void delayMS(unsigned long delayTime){
      millis = 0;
      // Busy-wait until the ISR has counted delayTime milliseconds.
      // millis must be declared volatile or this loop may be optimized away.
      while (millis < delayTime){}
}


#if defined(__TI_COMPILER_VERSION__) || defined(__IAR_SYSTEMS_ICC__)
#pragma vector=TIMER0_A0_VECTOR
__interrupt
#elif defined(__GNUC__)
__attribute__((interrupt(TIMER0_A0_VECTOR)))   // FR6989 headers define TIMER0_A0_VECTOR, not TIMERA0_VECTOR
#endif
void TIMER0_A0_ISR(void)
{
  millis++;
}

  • The default DCO frequency is 8 MHz (CSCTL1) which sources MCLK with a default divider of 8 (CSCTL3), therefore your default CPU frequency is 1 MHz. ACLK meanwhile is sourced from VLOCLK (if LFXT is unavailable, see CSCTL2 description in the User's Guide) which typically operates at 9.4 kHz (Table 5-7 of the Datasheet).

    Regards,
    Ryan
  • Thanks for that! So it's really an 11% error, which is better than what I thought it was earlier. Does the timer clock look correct for each call to be 1 ms?

  • VLOCLK is not a dependable timing source. You will also lose some cycles in the overhead of calling the delayMS function, so smaller values of delayTime will show a noticeable error.

    Regards,
    Ryan
  • I was revisiting this, since I'm still getting an 11-12% error in my experimental results compared to the theoretical delay. Crunching the numbers, the input clock to the timer has to be 32.768 kHz; if it were at 9.4 kHz, the percent error would be above 90%. I am mostly testing delays of 1-150 milliseconds and they're consistently off by ~11%, which tells me that either the timer calculations are off, the timer clock period is incorrect, or the CPU clock frequency is not 1 MHz. I haven't been able to pinpoint where this 11% error is coming from. Do you think it is because of using ACLK as the input clock for the timer?

  • Have you considered outputting your ACLK, MCLK and timer clocks to make sure the frequencies are as expected? You can try using SMCLK to determine if the error lies with ACLK or your measurement method.

    Regards,
    Ryan
