Hi everyone!
Using different high-level libraries such as GStreamer or OpenVX, I have always measured at least 3 frames of delay between a light stimulus on my MIPI CSI-2 sensor and the resulting "stimulated frame" arriving in Linux user space. I am using the ISP to de-Bayer the images coming from my sensor.
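To try to narrow down where the delay accumulates, I also look at how late each buffer reaches user space compared to the timestamp the driver puts on it when the frame completes. Here is a minimal sketch of that check (it assumes the V4L2 capture device is already configured and streaming, and it only covers the kernel-to-user-space part of the path, not the sensor exposure/readout):

```c
/* latency_probe.c: print how long a frame sits between the driver's
 * "frame done" timestamp and the moment it is dequeued in user space.
 * Assumes format/buffer setup and VIDIOC_STREAMON are done elsewhere. */
#include <stdio.h>
#include <string.h>
#include <time.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

static void probe_dequeue_latency(int fd)
{
    struct v4l2_buffer buf;
    struct timespec now;

    memset(&buf, 0, sizeof(buf));
    buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buf.memory = V4L2_MEMORY_MMAP;

    if (ioctl(fd, VIDIOC_DQBUF, &buf) < 0) {
        perror("VIDIOC_DQBUF");
        return;
    }
    clock_gettime(CLOCK_MONOTONIC, &now);

    /* Only meaningful if the driver timestamps against CLOCK_MONOTONIC. */
    if (buf.flags & V4L2_BUF_FLAG_TIMESTAMP_MONOTONIC) {
        double t_frame = buf.timestamp.tv_sec + buf.timestamp.tv_usec / 1e6;
        double t_now   = now.tv_sec + now.tv_nsec / 1e9;
        printf("buffer %u reached user space %.3f ms after frame completion\n",
               buf.index, (t_now - t_frame) * 1e3);
    }

    ioctl(fd, VIDIOC_QBUF, &buf); /* hand the buffer back to the driver */
}
```

This only tells me how long the frame waits between DMA completion and my code, so at best it rules that part in or out.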
A delay of 1 frame could be attributed to my sensor, but the rest should be on the AM62A side. Looking at the J721E documentation, I found this interesting table:
| Instance | Configuration | Time taken to receive one frame | ISR latency |
|---|---|---|---|
| CSI2Rx Inst 0 | 1CH 1080P30 IMX390 Sensor Raw12 | 33.3 ms (MCU2_0) | 9 µs (MCU2_0) |
Does that mean that another frame of delay comes from how j721e-csi2rx handles the incoming MIPI stream? And would the last frame of delay then come from the V4L2 driver?
I need to develop a piece of software that processes some of the MIPI data "on the fly", with the smallest possible latency. I don't need the ISP to process the data. Where should I start? Am I missing something?
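To make the question more concrete, this is roughly the kind of capture loop I have in mind: grab the raw Bayer frames straight from the CSI-2 RX video node over V4L2 and touch the data as soon as VIDIOC_DQBUF returns. It is only a sketch; /dev/video0, the 1920x1080 resolution and the SRGGB12 pixel format are assumptions for my IMX390, and the media pipeline (sensor to CSI-2 RX) still has to be configured with media-ctl beforehand.

```c
/* raw_capture.c: minimal V4L2 capture of raw Bayer frames from the
 * CSI-2 RX video node, bypassing the ISP.
 * Build: gcc -O2 -o raw_capture raw_capture.c */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <linux/videodev2.h>

#define NUM_BUFFERS 3 /* a few buffers are enough to keep the driver streaming */

int main(void)
{
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) { perror("open /dev/video0"); return 1; }

    /* Ask for the raw Bayer format directly; no ISP processing involved. */
    struct v4l2_format fmt = {0};
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width       = 1920;
    fmt.fmt.pix.height      = 1080;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_SRGGB12; /* adjust to the sensor's CFA order */
    fmt.fmt.pix.field       = V4L2_FIELD_NONE;
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); return 1; }

    /* Request a small set of mmap'ed buffers and queue them. */
    struct v4l2_requestbuffers req = {0};
    req.count  = NUM_BUFFERS;
    req.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0) { perror("VIDIOC_REQBUFS"); return 1; }

    void *mem[NUM_BUFFERS];
    size_t len[NUM_BUFFERS];
    for (unsigned i = 0; i < req.count; i++) {
        struct v4l2_buffer buf = {0};
        buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index  = i;
        if (ioctl(fd, VIDIOC_QUERYBUF, &buf) < 0) { perror("VIDIOC_QUERYBUF"); return 1; }
        len[i] = buf.length;
        mem[i] = mmap(NULL, buf.length, PROT_READ | PROT_WRITE, MAP_SHARED, fd, buf.m.offset);
        if (mem[i] == MAP_FAILED) { perror("mmap"); return 1; }
        if (ioctl(fd, VIDIOC_QBUF, &buf) < 0) { perror("VIDIOC_QBUF"); return 1; }
    }

    enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (ioctl(fd, VIDIOC_STREAMON, &type) < 0) { perror("VIDIOC_STREAMON"); return 1; }

    /* Process each frame as soon as the driver hands it over, then requeue it. */
    for (int n = 0; n < 100; n++) {
        struct v4l2_buffer buf = {0};
        buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        if (ioctl(fd, VIDIOC_DQBUF, &buf) < 0) { perror("VIDIOC_DQBUF"); break; }

        /* "On the fly" processing would go here, straight on the raw Bayer data. */
        printf("frame %d: %u bytes in buffer %u\n", n, buf.bytesused, buf.index);

        if (ioctl(fd, VIDIOC_QBUF, &buf) < 0) { perror("VIDIOC_QBUF"); break; }
    }

    ioctl(fd, VIDIOC_STREAMOFF, &type);
    for (unsigned i = 0; i < req.count; i++)
        munmap(mem[i], len[i]);
    close(fd);
    return 0;
}
```

If the extra frames of delay are already introduced before the data reaches this point (in the CSI2RX handling or in the driver), then this approach obviously won't help, which is really what I am trying to find out.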
Any information would be greatly appreciated,
Thanks!