Hi,
I need timer-based delay functions and wrote the following.
(MSP430F5529; CCS version 5.3.0.00090; MSP-FET430UIF; XT1 32768 Hz; XT2 4 MHz)
int main(void)
{
    WDTCTL = WDTPW | WDTHOLD;             // Stop watchdog timer

    // Basic GPIO initialization
    Board_init();

    // Set Vcore to accommodate the maximum allowed system speed
    SetVCore(3);

    // Use the 32.768 kHz XTAL as FLL reference
    LFXT_Start(XT1DRIVE_0);

    // Set system clock to maximum (25 MHz)
    Init_FLL_Settle(25000, 762);
    SFRIFG1 = 0;
    SFRIE1 |= OFIE;

    Clear_Int16_Flag(StatusFlag_1, AllFlags);
    Set_Int16_Flag(StatusFlag_1, PowerMode);
    Set_Int16_Flag(StatusFlag_1, ShowRtcOnLCD);
    *ActiveModeTimeout = 10;
    TimeOut_Us = 0;
    TimeOut_Ms = 0;

    // Globally enable interrupts
    __enable_interrupt();

    Test_Delay();
    while(1)
    {
        ….
// Timer0 A0 interrupt service routine
#pragma vector=TIMER0_A0_VECTOR
__interrupt void TIMER0_A0_ISR(void)
{
    if (TimeOut_Us)
    {
        TimeOut_Us--;
    }
    else if (TimeOut_Ms)
    {
        TimeOut_Ms--;
    }
    else
    {
        Set_Int16_Flag(StatusFlag_1, TimerTimeOut);
        TA0CCTL0 &= ~CCIE;
    }
}
void Test_Delay(void)
{
    uint32_t i;

    for (i = 0; i < 10000; i++)
    {
        CLK_HIGH;
        Delay_10Us(1);
        CLK_LOW;
        Delay_10Us(1);
    }
    for (i = 0; i < 1000; i++)
    {
        CLK_HIGH;
        Delay_Ms(1);
        CLK_LOW;
        Delay_Ms(1);
    }
    for (i = 0; i < 6; i++)
    {
        CLK_HIGH;
        Delay_s(1);
        CLK_LOW;
        Delay_s(1);
    }
}
void Delay_10Us(uint32_t Time)
{
    TimeOut_Us = Time;
    TimeOut_Ms = 0;
    TA0CCTL0 = CCIE;                  // CCR0 interrupt enabled
    TA0CCR0 = 250;                    // 250 SMCLK cycles @ 25 MHz = ~10 us per tick
    TA0CTL = TASSEL_2 + MC_1 + TACLR; // SMCLK, up mode, clear TAR
    while (!Status_Int16_Flag(StatusFlag_1, TimerTimeOut));
    Clear_Int16_Flag(StatusFlag_1, TimerTimeOut);
}

void Delay_Ms(uint32_t Time)
{
    TimeOut_Us = 0;
    TimeOut_Ms = Time;
    TA0CCTL0 = CCIE;                  // CCR0 interrupt enabled
    TA0CCR0 = 25000;                  // 25000 SMCLK cycles @ 25 MHz = ~1 ms per tick
    TA0CTL = TASSEL_2 + MC_1 + TACLR; // SMCLK, up mode, clear TAR
    while (!Status_Int16_Flag(StatusFlag_1, TimerTimeOut));
    Clear_Int16_Flag(StatusFlag_1, TimerTimeOut);
}

void Delay_s(uint32_t Time)
{
    TimeOut_Us = 0;
    TimeOut_Ms = Time * 1000;         // seconds expressed in milliseconds
    TA0CCTL0 = CCIE;                  // CCR0 interrupt enabled
    TA0CCR0 = 25000;                  // 1 ms per tick, as in Delay_Ms
    TA0CTL = TASSEL_2 + MC_1 + TACLR; // SMCLK, up mode, clear TAR
    while (!Status_Int16_Flag(StatusFlag_1, TimerTimeOut));
    Clear_Int16_Flag(StatusFlag_1, TimerTimeOut);
}
And I got the following results.
The UCS register values that I copied at run time in debug mode are:
0x000160 --> 10F0 0060 117C 0000 0033 0000 C10D 0C10 0707 0000 0000
From the SLAU208M (June 2008) user's guide, chapter 5, page 164, I got the formula:
fDCOCLK = D × (N + 1) × (fFLLREFCLK ÷ n)
As seen from the registers:
D -> FLLD = 2
N + 1 -> FLLN + 1 = 381
fFLLREFCLK -> XT1 = 32768 Hz (as I mentioned at the top)
n -> FLLREFDIV = 1
fDCO = 2 × 381 × (32768 ÷ 1) = 24969216 ≈ 25 MHz, and the SMCLK source is DCOCLK (SELS = 011b).
What I can't understand is why the delays come out twice as long as expected in the Delay_10Us() and Delay_Ms() routines, but not in Delay_s().
Thanks and Kind regards.
Hi Aly,
The reason Delay_10Us and Delay_Ms take twice as long is that the timer interrupt tests TimeOut_Us and TimeOut_Ms before setting TimerTimeOut. Since they are loaded with 1, the first interrupt only decrements them to zero; TimerTimeOut is not set until the next interrupt, so the delay takes two timer periods instead of one. Delay_s suffers the same single extra tick, but because it loads TimeOut_Ms with Time * 1000, the error is only about 0.1% and goes unnoticed. Removing the extraneous logic in the interrupt should solve the problem.
Tony