I'm using an MSP430F5510. Basically, at a certain point in my program I have an interrupt function that executes at 1.5 MHz (the interrupt fires on the rising edge of an external clock at that frequency). The microcontroller runs at 24 MHz, so it has 16 clock cycles to execute the routine, right? The function is the following:
#pragma vector=PORT2_VECTOR
__interrupt void Port_2(void)
{
    P2IFG &= ~BIT0;        // clear the P2.0 interrupt flag
    buffer[cnt] = P1IN;    // sample port 1 into the buffer
    cnt--;
    if (!cnt)
        P2IE &= ~BIT0;     // last sample taken: disable the P2.0 interrupt
}
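For reference, the 16-cycle figure in the first paragraph is just the ratio of the two clocks; written out as constants (the names are only illustrative):

#define F_MCLK_HZ  24000000UL   // CPU clock
#define F_INT_HZ    1500000UL   // external clock on the P2.0 rising edge

// Cycles available between two consecutive interrupts:
// 24 MHz / 1.5 MHz = 16 MCLK cycles. The hardware interrupt entry
// and the RETI sequence have to fit into that same budget, not only
// the ISR body.
#define CYCLES_PER_INTERRUPT  (F_MCLK_HZ / F_INT_HZ)   // = 16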
'buffer' and 'cnt' are global variables placed in RAM. I read in other threads that I should watch CCTIMER1 while debugging to estimate the number of clock cycles. I do that while debugging with the FET debugger, but I don't understand how it counts. If I step through line by line, it counts 16 clock cycles. If I put a breakpoint at the beginning of the function and another at the end, and let it run, it counts 21 clock cycles. I know it is not perfectly accurate, but which of the two measurements is closer to reality?
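As a cross-check, I was also thinking of measuring from inside the code itself, by sampling a free-running timer at the start and end of the ISR. The sketch below is only an idea, not something I have validated: it assumes SMCLK is also 24 MHz, that Timer_A0 is free to use, and the variable names are mine.

#include <msp430.h>

extern volatile unsigned char buffer[];   // same globals as above (types assumed)
extern volatile unsigned int  cnt;

volatile unsigned int isr_cycles;          // last measured ISR body length, in SMCLK ticks

void start_cycle_timer(void)
{
    // Free-running Timer_A0 clocked from SMCLK (assumed equal to MCLK = 24 MHz),
    // continuous mode, so TA0R increments once per CPU clock.
    TA0CTL = TASSEL_2 | MC_2 | TACLR;
}

#pragma vector=PORT2_VECTOR
__interrupt void Port_2(void)
{
    unsigned int t0 = TA0R;                // timestamp after the hardware entry sequence

    P2IFG &= ~BIT0;
    buffer[cnt] = P1IN;
    cnt--;
    if (!cnt)
        P2IE &= ~BIT0;

    isr_cycles = TA0R - t0;                // body length only: interrupt entry, RETI and
                                           // the two TA0R reads are not included
}

With SMCLK equal to MCLK, the difference is directly in CPU cycles; the two extra TA0R reads add a few cycles of their own, so it still isn't exact, but at least it doesn't depend on how the debugger's counter behaves around breakpoints.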