Hello E2E Community,
We have an application modeled on the VPIF (facedetect) example that captures D1 frames at 30 frames per second (fps). We added an EDMA3 task, modeled on the EDMA (edmatest) example, to subsample the captured D1 frames down to 360 x 240 for our image processing algorithm. The EDMA3 transfer has been tested and works correctly, taking only 2 milliseconds to run. The example code is used as-is with minimal changes.
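For clarity, here is a CPU-side sketch of what the EDMA3 transfer is doing. This is not our actual EDMA3 setup code; it is a hypothetical equivalent, assuming 8-bit luma data and simple 2:1 horizontal and vertical decimation from D1 (720 x 480) to 360 x 240 (the EDMA3 achieves the same effect with its index/count PaRAM fields):

```c
#include <stdint.h>

/* Hypothetical dimensions: D1 source, 2:1 decimated destination. */
#define SRC_W 720
#define SRC_H 480
#define DST_W 360
#define DST_H 240

/* CPU-equivalent of the EDMA3 transfer: copy every other pixel of
 * every other line (2:1 decimation in both directions). */
void subsample_2to1(const uint8_t *src, uint8_t *dst)
{
    int x, y;
    for (y = 0; y < DST_H; y++)
        for (x = 0; x < DST_W; x++)
            dst[y * DST_W + x] = src[(2 * y) * SRC_W + (2 * x)];
}
```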
When we run the application without the EDMA3 transfer enabled, it captures the D1 frames at 30 fps without any degradation. We have tested this overnight without any problems.
When we enable the EDMA3 transfer (subsampling), the application runs for some time and then the frame rate drops from 30 fps to about 3 to 5 fps. Eventually it recovers back to 30 fps. This cycle then repeats: it runs at 30 fps for some time and then drops again to 3 to 5 fps.
I have been at my wits' end all weekend trying to figure out what might be causing this. Can anyone shed some light on this problem?
Thanks,
JumpStart