Discrepancy between high resolution and low resolution clocks

Hi,

I'm trying to benchmark software running on a C6424 EVM board, and I'm seeing a discrepancy between the times reported by the high-resolution and low-resolution clocks. Here is an example:

    /* t holds a clock snapshot; CLK_getltime() and CLK_gethtime()
       both return LgUns (32-bit unsigned). */
    LgUns t;

    printf("Long Time Test...\n");
    t = CLK_getltime();                 /* low-res ticks, one per 100 us */
    TSK_sleep(1000);                    /* 1000 system ticks = 100 ms */
    t = CLK_getltime() - t;
    /* ticks * (counts/tick) / (counts/ms) = ms */
    printf("\tDone! Took: %f ms\n", ((float) t * CLK_getprd()) / CLK_countspms());

    printf("Short Time Test...\n");
    t = CLK_gethtime();                 /* high-res timer counts */
    TSK_sleep(1000);
    t = CLK_gethtime() - t;
    /* counts / (counts/ms) = ms */
    printf("\tDone! Took: %f ms\n", ((float) t) / (float) CLK_countspms());


The DSP clock is running at 594 MHz, and I have the low resolution clock configured for 100 us intervals.  When I execute the above code, I see that the elapsed time reported using CLK_getltime() is 100 ms, while the time reported by CLK_gethtime() is around 82 ms.  I've verified that this isn't a wrap-around problem, but I'm not sure what to check next.


Any thoughts on why this would occur?


Thanks