I have a simple algorithm running on the DSP of a DM3730 that I want to profile. I started profiling by using Server_getCpuLoad. The pseudocode on the host microcontroller looks like:
process (...);
Server_getCpuLoad(...);
I consistently get a value between 6 and 8%. I then decided to corroborate this value using a timer, so in my algorithm I have:
startTime = TSCL;
/* do the processing */
endTime = TSCL;
deltaTime = endTime - startTime;
I then compute:
load = deltaTime / DSP_FREQUENCY;
This number is consistently around 1%. How can the two estimates of the load differ so much?
Regards,
Sachin