
Capturing one frame

I need to figure out a method to capture one frame from an image sensor on the DM355. My image sensor (MT9P031) is operating in snapshot mode and produces a single frame on demand (trigger).

Is there an interrupt I can use that would indicate that the whole frame has been captured? It appears that in the current implementation (DVSDK 3.01) the start of the next frame triggers an interrupt indicating that the previous frame can be processed. In my case, the next frame never comes, so I need some kind of notification that the whole frame has been captured and saved to SDRAM.

Thank you.

  • Hi Gennadiy,

    The simplest approach would be: in the application, queue 3 buffers, capture one frame, and discard the other two. Alternatively, in the driver, after the ISR completes, check whether the next buffer is available; if it is not, just stop streaming.
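    For reference, a minimal user-space sketch of the first option (assuming an already-opened and configured V4L2 capture fd with MMAP I/O; error handling trimmed, untested):

    ```c
    /* Sketch: request 3 buffers, start streaming, dequeue exactly one frame,
     * then stop streaming so the remaining frames are simply dropped.       */
    #include <string.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    int capture_one_frame(int fd)            /* fd: configured capture device */
    {
        struct v4l2_requestbuffers req;
        memset(&req, 0, sizeof(req));
        req.count  = 3;                      /* the streaming driver wants 3 */
        req.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        req.memory = V4L2_MEMORY_MMAP;
        if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0)
            return -1;

        for (int i = 0; i < 3; i++) {        /* queue all three buffers */
            struct v4l2_buffer buf;
            memset(&buf, 0, sizeof(buf));
            buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            buf.memory = V4L2_MEMORY_MMAP;
            buf.index  = i;
            if (ioctl(fd, VIDIOC_QBUF, &buf) < 0)
                return -1;
        }

        enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        if (ioctl(fd, VIDIOC_STREAMON, &type) < 0)
            return -1;

        struct v4l2_buffer done;             /* dequeue only the first frame */
        memset(&done, 0, sizeof(done));
        done.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        done.memory = V4L2_MEMORY_MMAP;
        if (ioctl(fd, VIDIOC_DQBUF, &done) < 0)
            return -1;

        /* ... process the mmap()ed buffer at index done.index here ... */

        return ioctl(fd, VIDIOC_STREAMOFF, &type);  /* drop the other frames */
    }
    ```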

    Best Regards,

    Prabhakar Lad

  • Yes. There are two ways to make this work for you.

    • In the application: queue 3 buffers (the streaming driver requires them), but dequeue only a single buffer. That gives you your single frame.
    • In the driver: if you consider the above suggestion a workaround, you can change the capture driver to capture a single buffer. You need to change one of the buffer-management functions and, in the ISR, turn off streaming as soon as you get the first completed frame. You also need to change the interrupt registers (in the CCDC, if you are capturing from the CCDC) so the interrupt fires once all lines have been captured to SDRAM. A rough sketch of the ISR side follows below.
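    A rough, hypothetical sketch of what the ISR-side change could look like (the struct, field, and helper names here are made up for illustration; the real DVSDK 3.01 capture driver keeps the equivalent state and code paths under its own names):

    ```c
    /* Hypothetical sketch only: complete the first frame and stop capture
     * instead of waiting for a next buffer/frame.                          */
    #include <linux/interrupt.h>

    struct snap_buffer;                               /* placeholder buffer type */

    struct snap_dev {
        struct snap_buffer *cur_buf;   /* buffer the CCDC is writing into   */
        int single_shot;               /* set when the app wants one frame  */
    };

    void snap_buffer_done(struct snap_buffer *buf);           /* wakes DQBUF */
    struct snap_buffer *snap_next_queued(struct snap_dev *dev);
    void snap_stop_ccdc(struct snap_dev *dev);        /* disables CCDC capture */

    static irqreturn_t snap_frame_isr(int irq, void *dev_id)
    {
        struct snap_dev *dev = dev_id;

        if (dev->cur_buf) {
            snap_buffer_done(dev->cur_buf);   /* hand the finished frame out */
            dev->cur_buf = NULL;
        }

        /* In single-shot mode, or when no further buffer is queued, stop the
         * CCDC now rather than waiting for a next frame that never arrives. */
        if (dev->single_shot || !snap_next_queued(dev))
            snap_stop_ccdc(dev);

        return IRQ_HANDLED;
    }
    ```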
  • Here is how I was able to implement it (using two tricks).

    #1. I could not allocate 15 Mbytes of data for three buffers (my board has 64M total). So, I allocated one buffer from CMEM and queued it to the driver three times :) (clever, yeah?). It sounds like buffer fraud, but it worked, and no buffer police have knocked on my door yet. This solved my problem of multiple buffers.
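    For anyone trying the same trick, this is roughly what it looks like (a sketch assuming USERPTR I/O and the CMEM user API as used in the DVSDK demos; the buffer size is a placeholder and error handling is omitted, untested):

    ```c
    /* Sketch: one CMEM buffer queued three times as USERPTR buffers 0..2.
     * Assumes CMEM_init() was already called and that the capture driver
     * accepts USERPTR I/O.                                                 */
    #include <string.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>
    #include <cmem.h>                        /* TI linuxutils CMEM user API */

    #define FRAME_SIZE (5 * 1024 * 1024)     /* placeholder; depends on format */

    int queue_one_buffer_three_times(int fd)
    {
        CMEM_AllocParams params = CMEM_DEFAULTPARAMS;
        void *frame = CMEM_alloc(FRAME_SIZE, &params);   /* single buffer */

        struct v4l2_requestbuffers req;
        memset(&req, 0, sizeof(req));
        req.count  = 3;                      /* driver still insists on 3 */
        req.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        req.memory = V4L2_MEMORY_USERPTR;
        ioctl(fd, VIDIOC_REQBUFS, &req);

        for (int i = 0; i < 3; i++) {
            struct v4l2_buffer buf;
            memset(&buf, 0, sizeof(buf));
            buf.type      = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            buf.memory    = V4L2_MEMORY_USERPTR;
            buf.index     = i;
            /* Same memory every time; depending on the driver build this may
             * need to be the physical address, i.e. CMEM_getPhys(frame).    */
            buf.m.userptr = (unsigned long)frame;
            buf.length    = FRAME_SIZE;
            ioctl(fd, VIDIOC_QBUF, &buf);    /* the "buffer fraud" part :) */
        }
        return 0;
    }
    ```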

    #2. However, I still had to trigger the frame two times. To solve that problem, I had to change INT0 in the driver to trigger on the last line instead of the first line. The driver uses INT0 to release the buffer; originally (when set to the first line of a frame) it would release the buffer at the beginning of the next frame. After changing it to the last line, the buffer is released at the beginning of the last line of the current frame. This allowed the driver to release the buffer without waiting for a second frame. I just need to be careful not to start processing the frame from the last line, as it may still be recording (which should not be a problem given that my pixels come in at 66 MHz).
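    In register terms, the change is just the line number programmed for the CCDC VD0 interrupt; conceptually something like this (the register offset and accessor names below are placeholders, not the actual DVSDK CCDC macros):

    ```c
    /* Hypothetical sketch: make the VD0 interrupt fire on the last line of
     * the frame instead of line 0.  The offset and accessor below are
     * placeholders; use the macros from the DVSDK CCDC header.            */
    #include <linux/types.h>

    #define CCDC_VDINT0  0x70                     /* placeholder offset */

    void ccdc_write(u32 val, u32 offset);         /* placeholder accessor */

    static void ccdc_vd0_on_last_line(unsigned int lines_per_frame)
    {
        /* Was effectively: ccdc_write(0, CCDC_VDINT0);
         *   -> interrupt at line 0, so the buffer was only released when
         *      the *next* frame started.                                  */
        ccdc_write(lines_per_frame - 1, CCDC_VDINT0);  /* fire on last line */
    }
    ```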

    I think the second change still allows me to use the driver to capture video. There is one advantage to my change, though: capture-to-process latency is a bit lower, since I no longer wait for the next frame and can start processing the current frame immediately. I just need to make sure the frame is not processed within 35 microseconds, as that is how long it takes to capture one line. It is clear that this is not possible (even if the whole frame were fed into the IPIPE).

  • Gennadiy Kiryukhin said:
    #1. I could not allocate 15 Mbytes of data for three buffers (my board has 64M total). So, I allocated one buffer from CMEM and queued it to the driver three times :) (clever, yeah?). It sounds like buffer fraud, but it worked, and no buffer police have knocked on my door yet. This solved my problem of multiple buffers.

    :) It is not buffer fraud. Buffer management doesn't put any restriction on the use case you mentioned.

    Gennadiy Kiryukhin said:
    #2. However, I still had to trigger the frame two times. To solve that problem, I had to change INT0 in the driver to trigger on the last line instead of the first line. The driver uses INT0 to release the buffer; originally (when set to the first line of a frame) it would release the buffer at the beginning of the next frame. After changing it to the last line, the buffer is released at the beginning of the last line of the current frame. This allowed the driver to release the buffer without waiting for a second frame. I just need to be careful not to start processing the frame from the last line, as it may still be recording (which should not be a problem given that my pixels come in at 66 MHz).


    Excellent, my suggestion was exactly the same. Even if you tell the HW (CCDC) to stop capturing, it might shut down only during the sync period; I am not sure about this. Glad that you could capture a single frame.

    Gennadiy Kiryukhin said:
    I think the second change still allows me to use the driver to capture video. There is one advantage to my change, though: capture-to-process latency is a bit lower, since I no longer wait for the next frame and can start processing the current frame immediately. I just need to make sure the frame is not processed within 35 microseconds, as that is how long it takes to capture one line. It is clear that this is not possible (even if the whole frame were fed into the IPIPE).

    Well, for this, what I suggest is a driver change to support the single-buffer case as well. The application should be able to request 1 buffer from the V4L2 device. If it is one buffer, let it be snapshot mode; otherwise, it will be 3 buffers and video can be captured.
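    Conceptually, something like this in the REQBUFS path (hypothetical names; single_shot is the same kind of flag the ISR sketch above would check to stop after the first frame):

    ```c
    /* Hypothetical sketch: select snapshot mode when the application
     * requests exactly one buffer via VIDIOC_REQBUFS.                     */
    #include <linux/videodev2.h>

    struct snap_dev {
        int single_shot;          /* checked in the ISR to stop after one frame */
    };

    static int snap_reqbufs(struct snap_dev *dev, struct v4l2_requestbuffers *req)
    {
        if (req->count == 1) {
            dev->single_shot = 1;         /* snapshot: one frame, then stop */
        } else {
            dev->single_shot = 0;         /* streaming: keep the 3-buffer rule */
            if (req->count < 3)
                req->count = 3;
        }
        /* ... fall through to the driver's existing buffer allocation ... */
        return 0;
    }
    ```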