I have an algorithm that at 56 MIPS should have an execution time of around 24 us.
But on a C5505 @ 120 MHz it takes 44-50 us; at >=120 MIPS I would have expected 12 us or so.
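For reference, here is the arithmetic behind the 12 us expectation (a quick sketch, assuming "MIPS" here means one instruction per cycle and the algorithm scales linearly with clock rate):

#include <stdio.h>

int main(void)
{
    /* 24 us at 56 MIPS corresponds to ~1344 instruction cycles */
    double cycles = 56.0e6 * 24.0e-6;
    /* same cycle count at 120 MIPS, converted back to microseconds */
    double t_us = cycles / 120.0e6 * 1.0e6;
    printf("cycles = %.0f, expected time at 120 MIPS = %.1f us\n",
           cycles, t_us);   /* prints: cycles = 1344, ... = 11.2 us */
    return 0;
}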
The code is running in SARAM; parameters/data are placed in DARAM.
Using CLK_OUT I verified that the DSP is in fact running at 120 MHz. In the test setup, all
interrupts, DMAs, and I2S are disabled, and an I/O port is toggled for the actual measurement. No DSP/BIOS is used; it's
just a simple loop calling the algorithm (sketched below). Code is loaded using CCS and an XDS100 interface.
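The test loop looks roughly like this (a minimal sketch; GPIO_DOUT, its address, and the pin mask are placeholders, and since the C5505 GPIO registers sit in I/O space, the real code uses the TI compiler's ioport access rather than a plain data-space pointer):

#include <stdint.h>

/* Placeholder for the GPIO data-out register; substitute the actual
   C5505 register address from the datasheet, accessed via ioport. */
#define GPIO_DOUT  (*(volatile uint16_t *)0x1C0Au)
#define TEST_PIN   (1u << 0)   /* placeholder pin used for timing */

extern void algorithm(void);   /* the routine being measured */

int main(void)
{
    for (;;) {
        GPIO_DOUT |=  TEST_PIN;   /* pin high: scope marks start */
        algorithm();
        GPIO_DOUT &= ~TEST_PIN;   /* pin low: scope marks end    */
    }
}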
Any ideas why I don't see the expected performance, other than the specified execution time for the algorithm being wrong?