Is it bad to spend more time in an interrupt than in a main while() loop?

Hi guys, I'm working on a signal processing project that uses the TI Concerto ADC to sample signal data and then process it. My question is: will the MCU become more and more unstable if I spend too much time in an ADC interrupt before returning to the main while loop? I get quite a lot of illegal ISR traps when I increase the amount of computation inside the ADC interrupt routine. However, if I reduce the burden of the ISR, the algorithm becomes more stable. Is there a clock-cycle limit for an ADC interrupt?

  • However, if I reduce the burden of the ISR, the algorithm becomes more stable.

    That can happen, but there's no hard limit as such. For example, I'm using a timer ISR that consumes 97% of the total CPU time, leaving only 3% for a fairly big main routine, and I've never run into any issues.

    Regards,

    Gautam

  • Hi Hac,

    ADC sampling is usually periodic, so if the ISR that handles the samples takes too long, the next ISR may arrive while the previous one is still executing, resulting in an error.

    This may not be deterministic, because the ISR may contain code whose execution time varies (branches, loops, etc.), or because the code in main() can affect the ISR latency (see the link below).

    http://processors.wiki.ti.com/index.php/Interrupt_FAQ_for_C2000

    If the ISR is being overrun by the next interrupt, you can try decreasing the ADC sampling rate or the calculation time. Depending on your application, you may also be able to do something like use the DMA to move ADC samples to a buffer, then do your calculations in main() once the buffer holds the desired number of samples.
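    The buffer-and-flag idea above can be sketched in plain C. This is a minimal illustration, not TI driverlib code: `sample_buf`, `buf_full`, and the `adc_isr()` signature are all hypothetical names, and the sample source is simulated so the sketch runs on a host (on real hardware the ISR would be triggered by the ADC/ePWM and declared with the compiler's interrupt keyword).

    ```c
    #include <stdint.h>
    #include <stdio.h>

    #define BUF_LEN 8

    /* Hypothetical buffer filled by the ADC ISR (or by DMA). */
    static volatile uint16_t sample_buf[BUF_LEN];
    static volatile uint16_t sample_count = 0;
    static volatile int buf_full = 0;

    /* Keep the ISR short: store the sample and flag main() when full.
       On a real C2000 this would be an __interrupt handler. */
    void adc_isr(uint16_t sample)
    {
        sample_buf[sample_count++] = sample;
        if (sample_count == BUF_LEN) {
            sample_count = 0;
            buf_full = 1;        /* the heavy math is deferred to main() */
        }
    }

    int main(void)
    {
        /* Simulate BUF_LEN ADC triggers; on hardware the ePWM SOC fires these. */
        for (uint16_t i = 0; i < BUF_LEN; i++)
            adc_isr(i);

        if (buf_full) {          /* background loop does the long computation */
            uint32_t sum = 0;
            for (int i = 0; i < BUF_LEN; i++)
                sum += sample_buf[i];
            buf_full = 0;
            printf("average = %lu\n", (unsigned long)(sum / BUF_LEN));
        }
        return 0;
    }
    ```

    The key design point is that the ISR's execution time stays short and bounded regardless of how expensive the per-buffer computation is, so the per-sample deadline is easier to meet.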

  • Hi,

    The ePWM timer that triggers my ADC has a variable TimerBasePeriod, so the ADC sampling rate is not fixed, and I suspect this is my problem. Anyway, the ITRAP is fixed now, but I'm occasionally seeing the ADC interrupt overflow flag set. As Devin suggested, I moved some of the computation tasks to the main loop and the overflows stopped, but the algorithm's accuracy suffered. I'll have to deal with that trade-off somehow.
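    With a variable trigger period, one way to check whether overruns are even possible is to measure the ISR's execution time against the shortest period the ePWM can produce. A minimal sketch, assuming a hypothetical free-running counter `read_timer()` and an assumed worst-case minimum period; the ISR body's cost is simulated so the sketch runs on a host:

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical stand-in for a free-running hardware counter
       (e.g. a CPU cycle timer); simulated here for illustration. */
    static uint32_t fake_time = 0;
    static uint32_t read_timer(void) { return fake_time; }

    static const uint32_t min_period_cycles = 1000; /* shortest ePWM period, assumed */
    static uint32_t overrun_risk = 0;

    void adc_isr(void)
    {
        uint32_t start = read_timer();
        /* ... sample handling; simulated as costing 1500 cycles ... */
        fake_time += 1500;
        if (read_timer() - start > min_period_cycles)
            overrun_risk++;   /* the next trigger can land before we return */
    }

    int main(void)
    {
        adc_isr();
        printf("overrun risk events = %lu\n", (unsigned long)overrun_risk);
        return 0;
    }
    ```

    If the counter ever shows the ISR outlasting the minimum period, the overflow flag is not a fluke: either the worst-case period must be raised or the ISR work moved to main(), as discussed above.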