Hi all,
I am benchmarking a computationally intensive application.
I use an RTI counter to get 1 µs resolution timestamps.
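For reference, the timestamps are taken with a pattern like the sketch below. It is simplified: the register address is only a placeholder, and the counter is assumed to be clocked so that one count equals 1 µs.

#include <stdint.h>

/* Placeholder address for the RTI free-running counter register.
   The real address depends on the device's memory map. */
#define RTI_FRC0 (*(volatile uint32_t *)0xFFFFFC10U)

/* Assumes the counter is clocked at 1 MHz, so one count = 1 us. */
static inline uint32_t timestamp_us(void)
{
    return RTI_FRC0;
}

/* Measure one run of the computation under test. */
uint32_t benchmark_run(void (*work)(void))
{
    uint32_t start = timestamp_us();
    work();                            /* the code being benchmarked */
    uint32_t end = timestamp_us();
    return end - start;                /* elapsed microseconds */
}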
When I run the app compiled as 32-bit code from flash or SDRAM, I see large jitter of about 500 µs on a total run time of 200 ms.
When I run the same app compiled as 16-bit code from SDRAM, I see only very small jitter of 1 or 2 µs.
Can anyone give me a starting point for explaining this behaviour?
Thanks for your answers,
Marc