I configured my USART module as follows:
- Baud rate = 1200 bit/s.
- 8 data bits per character.
- One stop bit.
- Odd parity.
The time slot for a single character, as I understand it, is (11/1200)*1000 = 9.17 ms (11 bits = 1 start bit + 8 data bits + 1 parity bit + 1 stop bit).
My goal is to set up reliable time-out logic for the possible gap between received bytes, and I have tried the following setup:
Inside the USART reception handler:

```c
#pragma vector = UART0RX_VECTOR
__interrupt void USART0_RX(void)
{
    // Some code to stop Timer A here.
    ....
    expiry_Timer_stop();

    rxBuffer[rxIndex++] = U0RXBUF;

    // Some code to start Timer A here.
    ....
    expiry_Timer_start(some_tick);
    // I also have some logic that prevents Timer A from starting again
    // after the last byte of the incoming frame has arrived.
    // The some_tick parameter specifies when the timer expires in up mode.
    // The four-dot string "...." stands for control logic that I think is
    // not relevant to my question.
}
```
I set some_tick to the value that makes Timer A count up to 9.17 ms (which, according to the protocol, is the maximum allowed gap between characters). Timer A runs in up mode, so when TAR reaches some_tick the MCU enters the interrupt routine:
```c
#pragma vector = TIMERA0_VECTOR
__interrupt void Timer_A(void)
{
    receptionTimeOut = TRUE;
    _NOP();
}
```
The boolean flag receptionTimeOut lets the main loop detect the time-out between bytes. The main() function basically has only two states (Transmission and Reception) that it jumps between.
The problem is that this approach does not always work. The way I test it is simply to send a normal, valid frame (one the MCU is programmed to accept) from a PC, yet the expiry timer times out (the MCU enters __interrupt void Timer_A()) every time a byte is received. I then made some_tick a bit longer than 9.17 ms, say 10 ms or 12 ms; with that, the MCU sometimes works as I wished and other times still times out.
My question is: can this setup be a working solution, or have I made it too complex to test reliably?
I also think there might be a flaw in my timer configuration, so I have posted the relevant code here:
```c
void expiry_Timer_init()
{
    TACTL |= (TASSEL1 | TACLR | MC0);  // Clock source SMCLK, up-mode counting.
    TACCTL0 |= CCIE;
}

void expiry_Timer_stop()
{
    TACTL = 0x00;
}

void expiry_Timer_start(unsigned int interval)
{
    if (interval > 0)
    {
        TACTL |= (TASSEL1 | TACLR | MC0);  // Clock source SMCLK, up-mode counting.
        TACCR0 = interval;
        TACCTL0 |= CCIE;
    }
}
```