
TI SCD optimization in DSP



Hello friends,
We are using DVRRDK_03.00_ on the DM8168, and we have implemented our own motion detection algorithm in the SCD link.
In our tests there is a 600–800 µs difference in per-frame processing time between TI's algorithm and ours.
We tried to follow the optimization steps in "Introduction to TMS320C6000 DSP Optimization.pdf"
to improve our performance, but we saw no obvious change in the timing.
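For reference, this is the kind of loop-level hint we applied following that document (restrict-qualified pointers, alignment assertions, and trip-count pragmas). The kernel below is a minimal illustrative sketch, not our actual SCD code; the function name and parameters are hypothetical.

```c
/* Sketch of C6000 loop optimization hints from the optimization guide,
 * applied to a hypothetical per-pixel absolute-difference kernel.
 * Not the actual SCD algorithm code. */
void frame_diff(const unsigned char * restrict cur,   /* restrict: no aliasing */
                const unsigned char * restrict prev,
                unsigned char       * restrict out,
                int                   numPixels)
{
    int i;

    _nassert((int)cur  % 8 == 0);   /* promise 8-byte alignment so the   */
    _nassert((int)prev % 8 == 0);   /* compiler can use wide double-word */
    _nassert((int)out  % 8 == 0);   /* loads and stores                  */

    #pragma MUST_ITERATE(64, , 8)   /* trip count >= 64 and a multiple of 8 */
    #pragma UNROLL(8)               /* allow unrolling for software pipelining */
    for (i = 0; i < numPixels; i++) {
        int d  = cur[i] - prev[i];
        out[i] = (unsigned char)(d < 0 ? -d : d);  /* |cur - prev| */
    }
}
```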

Do you have any ideas on what else we could try?