
EDMA Delay or Jitter Documentation

Other Parts Discussed in Thread: SYSBIOS

I'm trying to find any documentation about how long it takes EDMA to react. We are seeing around 200 ns of jitter on a SPI line that is triggered by EDMA. We suspect an impedance issue, but it would be nice to know how fast EDMA is able to copy data from one location to another.

I'm working with SYS/BIOS on the OMAP-L137, on the DSP side, and the only interrupts are for the EDMA CC and EDMA errors. EDMA is fired every 1024 SPI samples, which should be around every 1.7 ms.

Jaime

  • Do you see this jitter just at the beginning of an EDMA transfer? Are you DMA'ing to DRAM? I've seen jitter randomly occurring during an EDMA transfer to DRAM. Switching to L2 or L3 SRAM eliminated the jitter for me. I am guessing the DRAM row refresh occasionally blocked the DMA transfer.
  • Thank you Norman,

    That was a great suggestion, as the buffers, declared as globals, were most likely landing in DRAM. I added a pragma directive to place them in IRAM. That change, along with some others we made, brought the jitter down to ~25 ns.

    Does that sound realistic? This jitter is measured on the SPI CS, so it may not be just the EDMA event jitter. (I guess that is hard for you to tell, since we are working on a custom setup where a GPIO, driven through a wire, fires the EDMA.) That is why it would be useful to know whether there is any documentation on the expected jitter for an EDMA copy.

    How much time (not jitter) should I expect EDMA to take to copy 1 byte?
  • It has been quite a while since I worked on my project. I don't remember measuring the latency between starting an EDMA transaction and when the EDMA actually started moving data. In my case, I was using SPI to transfer several 16-bit samples from an ADC; the number of samples varied from 20 to 200 per sequence. I was seeing random delays in the middle of the sequence, which is not good for digital filter math. I was much more concerned with the regular timing of consecutive samples than with how long the transfer took to start. Once I switched to L2 RAM, my samples were exactly regular, and I did not see any jitter in the middle. The time to transfer one word was exactly the SPI clock period times the number of bits, plus all the pre- and post-delays.

    I would guess that your improvement from 200 ns to ~25 ns might be due simply to moving from slower DRAM to faster internal SRAM, so the jitter problem may not be solved. If you are asking about GPIO latency, there are a few threads on this forum about the slowness of GPIOs; it can take quite a while to get signals in and out of the GPIO controller.

    You should unverify my post. That will hopefully bring some TI eyes onto your request.