
Pass along MV/SAD information outside IPNC?

We're using the IPNC to perform capture and encode.  We want to pass the MV/SAD information to our own process for further analysis.  I've looked at modifying the message-queue code to implement this, but there seems to be a limit on the size of message that can be passed, which makes handing off the MV/SAD data prohibitive.

Is there some way to easily modify the app to pass this information through CMEM?  Or some other method to easily share this information with another process?

Thanks!

  • Trying to modify IPNC to pass this info through CMEM; I'm struggling to understand the CMEM implementation used in the IPNC.

    What's confusing is that the audio and video encoders seem to use a different buffer mechanism, implemented in osa_buf.c and osa_queue.c.  osa_buf appears to create two osa_queues, one "empty" queue and one "free" queue.  osa_queue itself uses OSA_memAlloc(size) for all memory allocation, which is defined as:

    #define OSA_memAlloc(size)      (void*)malloc((size))

    ...so the question then becomes: how does the output buffer get into CMEM?  And what is CMEM being used for, if not for the actual buffers?
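    For reference, the empty/full two-queue pattern described above can be sketched roughly like this.  This is a minimal, illustrative model of the pattern, not the actual osa_buf/osa_queue API; all names and sizes here are made up:

    ```c
    #include <stdlib.h>
    #include <string.h>

    /* Minimal sketch of a two-queue buffer pool: a producer takes a
     * buffer from the "empty" queue, fills it, and pushes it onto the
     * "full" queue; the consumer does the reverse.  Illustrative only. */
    #define POOL_NUM_BUF 4

    typedef struct {
        void *bufs[POOL_NUM_BUF];
        int   count;
    } SimpleQueue;

    typedef struct {
        SimpleQueue emptyQue;
        SimpleQueue fullQue;
    } BufPool;

    static int queue_put(SimpleQueue *q, void *buf) {
        if (q->count >= POOL_NUM_BUF) return -1;
        q->bufs[q->count++] = buf;
        return 0;
    }

    static void *queue_get(SimpleQueue *q) {
        if (q->count == 0) return NULL;
        return q->bufs[--q->count];
    }

    /* All buffers start out on the empty queue, heap-allocated with
     * malloc(), matching the OSA_memAlloc() macro above. */
    static void pool_init(BufPool *p, size_t bufSize) {
        memset(p, 0, sizeof(*p));
        for (int i = 0; i < POOL_NUM_BUF; i++)
            queue_put(&p->emptyQue, malloc(bufSize));
    }
    ```

    A producer would then do `buf = queue_get(&p->emptyQue); /* fill */ queue_put(&p->fullQue, buf);`, and the consumer the mirror image, returning drained buffers to the empty queue.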

    After tracing through the code, I think I understand what's happening.  For a video frame, there's a heap-allocated buffer for the input image (via osa_buf/osa_heap) and a heap-allocated buffer for the output.  Once encoding is done, stream_write() gets called.  It finds a place in CMEM for the frame, and the frame gets memcpy'd into CMEM.  From there, we lock addresses in CMEM, and they're valid for a certain amount of time.  I'd assume we could break things by holding a lock for too long.

    So CMEM serves two purposes:

    1. Because CMEM addresses map to a physical address, it can be used as a form of inter-process communication.
    2. A history (of unclear duration) of audio and video data; CMEM acts as a local scratchpad for the audio and video data flowing through the system.

    Is this correct?

  • I made the changes, and I'm now getting DMA errors:

    DRV ERR :DMA_copyFillRun:250: Illegal parameter (chId = 53)


    It's not clear to me how DMA and CMEM are related, but I've definitely managed to break something.  Has anyone tried to do this sort of thing with the IPNC?  I really need to pass this information through to our application for further analysis...

  • Hi,

    Answered the question offline over email - In the current implementation, we are already using CMEM memory for the MV/SAD output buffers. The memory is allocated from the CMEM pool or heap and is used for as long as the encoder handle is open.

    Regards,

    Anshuman

    PS: Please mark this answer as verified, if you think it has answered your question. Thanks.

  • Anshuman,

    If you review the post, my question is _not_ about CMEM.  My question is how I can pass MV/SAD to a separate process, much like what is done with current audio and video.  Do you have any idea how to do this?

    Thank you.

  • Hi Jeremy,

    Sorry for the delayed reply. For transferring MV/SAD values across processes, you can use any of the inter-process communication mechanisms the kernel provides. We use a message queue and shared memory for the same purpose.
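    As one concrete illustration of the shared-memory route mentioned above, a bare-bones POSIX shared-memory hand-off could look like the sketch below.  The region name, size, and function name are invented for the example, and this is not the IPNC's actual implementation; it only shows the general mechanism (shm_open/mmap may also require linking against librt on older toolchains):

    ```c
    #include <fcntl.h>
    #include <string.h>
    #include <sys/mman.h>
    #include <unistd.h>

    /* Sketch: publish an MV/SAD buffer through POSIX shared memory so a
     * second process can mmap() the same region by name.  The name and
     * size below are illustrative, not taken from the IPNC sources. */
    #define SHM_NAME "/mvsad_demo"
    #define SHM_SIZE 4096

    static void *publish_mvsad(const void *data, size_t len) {
        int fd = shm_open(SHM_NAME, O_CREAT | O_RDWR, 0666);
        if (fd < 0) return NULL;
        if (ftruncate(fd, SHM_SIZE) < 0) { close(fd); return NULL; }
        void *map = mmap(NULL, SHM_SIZE, PROT_READ | PROT_WRITE,
                         MAP_SHARED, fd, 0);
        close(fd);               /* the mapping stays valid after close */
        if (map == MAP_FAILED) return NULL;
        memcpy(map, data, len);  /* consumer mmap()s the same SHM_NAME */
        return map;
    }
    ```

    The consuming process would call `shm_open(SHM_NAME, O_RDONLY, 0)` and `mmap()` with `PROT_READ` to see the same bytes; a message queue can then carry just a small "frame N is ready" notification instead of the MV/SAD payload itself.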

    Regards,

    anshuman

    PS: Please mark this post as verified, if you think it has answered your question. Thanks.

  • You can send the "motion meta data" along with the encoded stream to pass it through to the other process.  For this you might need to increase the Encoder 1 shared memory, and you might also need to modify videoEncodeThr.c.

    The example code is shown below:

       /* Append the MV/SAD meta data immediately after the encoded frame... */
       memcpy(encPrm.outAddr + pOutBufHeader->encFrameSize, gmd.Buffer + 2, MVDSize);

       /* ...and grow the reported frame size to cover the appended meta data. */
       pOutBufHeader->encFrameSize += MVDSize;

    Add a little bit of intelligence along with the motion meta data so the receiver can find the border between the meta data and the encoded stream.
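    One simple way to make that border findable, as suggested above, is to append a small fixed trailer (magic word plus metadata length) after the metadata, so the consumer can parse it from the end of the buffer.  This is just one possible scheme with invented names, not the IPNC's actual format:

    ```c
    #include <stdint.h>
    #include <string.h>

    #define MD_MAGIC 0x4D564441u  /* arbitrary marker value ("MVDA") */

    typedef struct {
        uint32_t magic;   /* identifies that a metadata trailer is present */
        uint32_t mdSize;  /* number of metadata bytes preceding the trailer */
    } MdTrailer;

    /* Append metadata plus a trailer after the encoded frame; returns the
     * new total size.  Resulting layout: [frame][metadata][MdTrailer]. */
    static size_t append_metadata(uint8_t *buf, size_t frameSize,
                                  const void *md, uint32_t mdSize) {
        memcpy(buf + frameSize, md, mdSize);
        MdTrailer t = { MD_MAGIC, mdSize };
        memcpy(buf + frameSize + mdSize, &t, sizeof t);
        return frameSize + mdSize + sizeof t;
    }

    /* Locate the metadata by reading the trailer from the end of the
     * buffer; returns the metadata offset, or -1 if no trailer found. */
    static long find_metadata(const uint8_t *buf, size_t totalSize,
                              uint32_t *mdSizeOut) {
        if (totalSize < sizeof(MdTrailer)) return -1;
        MdTrailer t;
        memcpy(&t, buf + totalSize - sizeof t, sizeof t);
        if (t.magic != MD_MAGIC) return -1;
        *mdSizeOut = t.mdSize;
        return (long)(totalSize - sizeof t - t.mdSize);
    }
    ```

    Reading the trailer from the end means the consumer never has to scan the encoded bitstream itself, which could contain byte sequences that look like a marker.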