Difference between maximum and average execution time

Dear All,

I'm using DSP/BIOS with the OMAPL137. I'm trying to benchmark parts of my code using STS_set and STS_delta. Everything seems to be working fine, except that the difference between the maximum and average execution times is rather large (about a factor of 4). Can somebody please explain what causes this difference?
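
For reference, the pattern I'm using looks roughly like this (stsMyBench is just a placeholder name for an STS object created in the DSP/BIOS configuration; this is a sketch, not my exact code):

#include <std.h>
#include <sts.h>
#include <clk.h>

extern STS_Obj stsMyBench;                   /* STS object created in the .tcf configuration (placeholder name) */

void benchmarkedSection(void)
{
    STS_set(&stsMyBench, CLK_gethtime());    /* store the start timestamp                    */

    /* ... code section being measured ... */

    STS_delta(&stsMyBench, CLK_gethtime());  /* accumulate (current time - start) in the STS */
}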

Thanks,

Peter

  • Peter,

    It typically means just what you think it would: there is a small subset of outlier values relative to the rest of the recorded delta values.


  • David,


    Thanks for your answer; however, this is not what I meant. What I would like to know is why the times differ in the first place.

    Let's take an example. Below is the code I'm benchmarking; the maximum time is 4 times the average.

    real_T result;

    result=sin(0.0);


    My current understanding is that this is because sin(0.0) is calculated only once at the beginning and the result is then reused, since no change in the argument is detected. Is my understanding correct? Also, if I had code like this:

    real_T result;

    result=sin(externVar);

    should I expect the same to happen if externVar does not change?

    And finally, should the behaviour change if I write something like:

    volatile real_T result;

    result=sin(0.0);
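
    In other words, would something like the sketch below be what it takes to force the call to actually execute every time? (This is just what I have in mind, not tested; arg is a variable I would add, and real_T comes from the generated rtwtypes.h header.)

    #include <math.h>          /* sin()                                    */
    #include "rtwtypes.h"      /* real_T, as in the generated model code   */

    volatile real_T arg = 0.0; /* volatile: must be re-read on every use, so the  */
                               /* compiler cannot fold sin(arg) into a constant   */
    real_T result;

    void bench(void)
    {
        result = sin(arg);     /* forces a real run-time call to sin()            */
    }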


    I know it's a lot of questions and that they're rather fundamental...

    Thanks,

    Peter

  • You might be taking a timer interrupt or some other interrupt that is affecting the time.


    Try something like this to keep interrupts out of the mix.


    Uns key;

    key = HWI_disable();    /* disable interrupts around the measured code */
    /* ... code being benchmarked ... */
    HWI_restore(key);       /* restore the previous interrupt state        */

  • Hi Karl,

    I have the same problem, and I tried the method you suggested.

    Here is the code:

      key = HWI_disable();
      STS_set(&stsSys0_OutputUpdate, CLK_gethtime());
      Result = cos(5.0);
      STS_delta(&stsSys0_OutputUpdate, CLK_gethtime());
      HWI_restore(key);

    Using the Statistics View, I found that the maximum = 1375 inst and the average = 410.31 inst, so the maximum is still about 4 times the average.

    Or, taking the timestamps outside the interrupt-disabled region:


      STS_set(&stsSys0_OutputUpdate, CLK_gethtime());
      key = HWI_disable();
      Result = cos(5.0);
      HWI_restore(key);
      STS_delta(&stsSys0_OutputUpdate, CLK_gethtime());

    In this case the maximum = 1714 inst and the average = 452.09 inst.

    The strange thing is that when the benchmarked code is Result = 5.0 + 3.0, the maximum time is equal to the average time.
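
    One thing I still plan to try is executing the cos() call once before clearing the statistics, to see whether the large maximum comes only from the first (cold) pass through the routine. A rough sketch, assuming STS_reset simply clears the accumulated statistics for the object:

      Result = cos(5.0);                        /* warm-up pass, not measured (assumes Result    */
                                                /* is used elsewhere so the call is not removed) */

      STS_reset(&stsSys0_OutputUpdate);         /* discard the statistics gathered so far        */

      key = HWI_disable();
      STS_set(&stsSys0_OutputUpdate, CLK_gethtime());
      Result = cos(5.0);
      STS_delta(&stsSys0_OutputUpdate, CLK_gethtime());
      HWI_restore(key);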