DVSDK sample red herring?

I've been using the DVSDK 4 sample code as a basis for my application on the DM368.  One of the things that the code does is clear the buffer mask using Buffer_setUseMask.  I was trying to find the logic behind this, and it turns out that there appears to be none.

So I was wondering if anyone has a feel for how it's meant to be used.  Are there any DMAI routines that use the mask, or is it strictly intended to be used by the application?

If I use it for my own purposes, will I break something else?  Too bad it's only 16 bits, as I'd like to put a timestamp in my buffers and there appear to be no "user" parameters.  I guess I could define a new extended buffer type like _BufferGfx_Object.

John A

  • Hi John,

    The initial useMask is set when creating the buffer. As you mentioned, it has 16 bits, but typically I see only a few being used. The idea behind the useMask is that the different stages of processing a buffer goes through can be recorded as part of the buffer. A buffer is considered "in use" when it is retrieved by BufTab_getFreeBuf. As each stage of processing completes, a bit is cleared in the useMask, until the useMask reaches 0 and the buffer becomes "free" again. For instance, in the DVSDK decode demo, buffers containing the display frames are produced by the video thread and displayed by the display thread. Typically, the initial useMask would have 2 bits set. One would be cleared once the buffer is no longer needed by the decoder, and the other is cleared after the buffer has been displayed. This ensures a buffer is not reused until it is no longer of use to either the decoder or the display driver.
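
    In other words, a minimal sketch of that lifecycle, using the decode demo's bit names and an already-created buffer table hBufTab, looks like this:

        Buffer_Handle hDispBuf;

        /* Hand the buffer out; both ownership bits are set in its useMask */
        hDispBuf = BufTab_getFreeBuf(hBufTab);       /* mask: CODEC_FREE | DISPLAY_FREE */

        /* ... the decoder is done with the frame ... */
        Buffer_freeUseMask(hDispBuf, CODEC_FREE);    /* mask: DISPLAY_FREE */

        /* ... the display thread has shown the frame ... */
        Buffer_freeUseMask(hDispBuf, DISPLAY_FREE);  /* mask: 0, buffer is free again */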

    Now coming to your use case: I think it is definitely different from the intended usage of the useMask field. If you use the BufTab module in your code, it would likely break if you use the useMask to carry custom data. Extending the Buffer type would be a better approach from a software engineering standpoint.
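
    If all you need is a timestamp, a lighter-weight alternative to extending the Buffer type is a plain application-side wrapper that carries the timestamp next to the handle. The struct below is purely illustrative, not a DMAI type:

        /* Hypothetical application-side wrapper, not part of DMAI */
        typedef struct AppTimedBuffer {
            Buffer_Handle      hBuf;       /* the underlying DMAI buffer             */
            unsigned long long timestamp;  /* application-defined, e.g. microseconds */
        } AppTimedBuffer;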

    Best regards,

    Vincent

  • Hi Vincent,

    The reason I referred to it as a "red herring" is because the decode demo application appears to use the buffer mask, but at no time do the bits ever get set.  The sample app goes through the motions of clearing the bits, but if you examine the mask at any time it's always clear.  So the application really gives little insight as to how it was intended to be used.

    So I wondered if there was any type of reasoning to put forth as to how it should be used.  For example, in the decode app there is a function called "handleCodecBufs" that calls Venc2_getFreeBuf repeatedly, clearing the mask for each buffer returned.  However, nothing ever set the mask bit in the first place.  Should I set the mask bit before calling Venc2_process?

    Since nothing else ever uses the mask bit, it doesn't matter whether I set it or clear it.  Which begs the question... what's the point?  Since the buffers are sent back to the fifo without regard to the "freeing" indicator of Venc2_getFreeBuf, has the decode demo left out something crucial to proper operation?  Should I be holding a codec buffer elsewhere and not return it to the fifo until it really is freed?

    Since DMAI is supposed to be an easier way to use the codecs, I am cautious about it also being fraught with pitfalls.  That's why when I see something like this I try to get as much clarification as possible.

    John A

  • Hi John,

    The use mask is set during the initialization done in BufTab_create. Taking the decode demo as an example, if you look at decode/video.c, you will see code similar to this, depending on the version of the demos you are looking at:

        /* Both the codec and the display thread can own a buffer */
        gfxAttrs.bAttrs.useMask = CODEC_FREE | DISPLAY_FREE;

        /* Color space */
        gfxAttrs.colorSpace = colorSpace;

        /* Set the original dimensions of the Buffers to the max */
        gfxAttrs.dim.width = params->maxWidth;
        gfxAttrs.dim.height = params->maxHeight;
        gfxAttrs.dim.lineLength = BufferGfx_calcLineLength(gfxAttrs.dim.width,
                                                           colorSpace);

        /* Create a table of buffers for decoded data */
        hBufTab = BufTab_create(NUM_DISPLAY_BUFS, bufSize,
                                BufferGfx_getBufferAttrs(&gfxAttrs));

    This initializes the useMask by setting the CODEC_FREE and DISPLAY_FREE bits. BufTab_getFreeBuf returns a buffer, its bits are cleared over time, and then the buffer is "freed". When BufTab_getFreeBuf is called again, its useMask is reset to the initial value, and the buffer is reused.
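
    Conceptually, BufTab_getFreeBuf then behaves along these lines (this is only a sketch of the behavior described above, not the actual module source; the function name and the origUseMask parameter are just illustrative):

        /* Conceptual sketch of how a free buffer is picked and re-armed */
        Buffer_Handle getFreeBufSketch(BufTab_Handle hBufTab, UInt16 origUseMask)
        {
            Buffer_Handle hBuf;
            Int           i;

            for (i = 0; i < BufTab_getNumBufs(hBufTab); i++) {
                hBuf = BufTab_getBuf(hBufTab, i);

                if (Buffer_getUseMask(hBuf) == 0) {        /* every owner has released it */
                    Buffer_setUseMask(hBuf, origUseMask);  /* re-arm the ownership bits   */
                    return hBuf;                           /* hand the buffer back out    */
                }
            }

            return NULL;                                   /* nothing free at the moment  */
        }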

    As I mentioned, the point of this useMask is to ensure both decoder and display thread are done with a buffer before it is reused. This can be tricky without DMAI because depending on the codec a buffer may be displayed before it is released by the codec and vice-versa.

    Hope this makes sense!

    Best regards,

    Vincent

  • Vincent,

    I was wrong to say they never get set.  But they never get set during actual operation.  Setting them when the buffers are created doesn't seem to make sense, because they should be set when they are about to be "owned" by some process.  Setting them when they are created seems to be the opposite of their intended use.  And to add to this, the name xxxx_FREE seems to be the opposite of its actual use.

    Maybe I'm a bit confused because the "FREE" bit is cleared by the code after the buffer is freed.  It seems like if it's really FREE, as the bit name says, then it should be set when freed.  So I'm guessing in this case FREE means not free.  Regardless, once the buffer has gone through one round of decoding/displaying, the FREE bits are no longer used.

    Since the useMask keeps track of the buffer's "in use" status, the bits should toggle continuously during the operation of the program.

    John A

    One reason I'm pushing this point is that I *think* there are times when a buffer is sent back to the decoder after display but may still be in use by the decoder.  There were points in time where the decoded picture had garbage in it, and I think this was the issue.  IIRC it went away when I increased the number of display buffers in my app.  I speculated that having more display buffers prevented a display buffer from being recycled back to the decoder before it was freed.

    John A

  • Hi John,

    To be more precise, the useMask is only initialized in BufTab_getFreeBuf, when a buffer is handed to the application. If you look at the source of the BufTab module, you will see that BufTab_create only records the original useMask. That being said, the application can also set extra bits using Buffer_setUseMask if it only wants the bits set at specific points in time. In fact, in the next revision of the decode demo, we'll have a use case that only sets the DISPLAY_FREE bit at the point where the buffer is given to the display thread.

    The key thing to remember here is that a buffer is considered free only when its mask is 0. So in your application, make sure that you clear the CODEC_FREE bit on a buffer only after you get it back from Vdec2_getFreeBuf. If you do that, it will prevent the application (i.e. the BufTab module) from reusing a buffer before the codec relinquishes it.
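
    In code, that boils down to something like the following sketch, where hVd2 is the video decoder handle and CODEC_FREE is the bit used when the BufTab was created:

        Buffer_Handle hFreeBuf;

        /* Release the codec's claim only on buffers the codec has handed back */
        hFreeBuf = Vdec2_getFreeBuf(hVd2);
        while (hFreeBuf) {
            Buffer_freeUseMask(hFreeBuf, CODEC_FREE);
            hFreeBuf = Vdec2_getFreeBuf(hVd2);
        }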

    Best regards,

    Vincent

  • Vincent,

    Since you mentioned BufTab_getFreeBuf, I went and took a look at that function.  It appears to set the mask when a free buffer is found.  I think this will help me figure out how this is supposed to work.  I will examine what's going on tomorrow, see if I can get my ducks in a row, and then report back.  This is probably the clue I needed.

    Thanks,

    John A

  • Vincent,

    I now understand what is going on in my app.  I was confused by the logic in the decode application, and I had left out a section that didn't seem to make sense.

    In the decode app a buffer was retrieved from the display out fifo and then was "freed".  What confused me was: why not reuse the buffer that the display just freed?  Instead the decode app called Buffer_getBuf and got a different buffer.  Since I didn't understand what was going on I just reused the buffer from the display fifo, and it seemed to work.

    Now I understand that when you free a buffer it's marked again as available only if all the owners free it.  This allows Buffer_getBuf to retrieve it.  And even more importantly the display out fifo isn't for reusing the buffer, but for sending it to be "freed".
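
    So the round trip, as I read it now, is roughly this (hDispBuf being the handle that came back from the display thread through the fifo):

        /* Drop the display's claim; the buffer only becomes free once the
           codec has also released it */
        Buffer_freeUseMask(hDispBuf, DISPLAY_FREE);

        /* Get whichever buffer is free now for the next decode; it is not
           necessarily hDispBuf */
        hDstBuf = BufTab_getFreeBuf(hBufTab);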

    At this point I have to wonder what is the logic of sending the used display buffer back through the fifo to free it.  My guess is that it's solely to allow the thread to keep track of the number of display buffers currently in use so that a shutdown will wait until the last display buffer has been displayed.

    Thanks for all your help.

    John A

  • Vincent,

    This isn't really important and I can certainly see what's happening by looking at the code.  But I wanted to see if you have any thoughts on a section of code in the decoder app.  The following is in the code:

        hDstBuf = BufTab_getBuf(hBufTab, Buffer_getId(hDispBuf));

        Buffer_freeUseMask(hDstBuf, DISPLAY_FREE);

    The code doesn't seem to do anything different from the simple one-liner "Buffer_freeUseMask(hDispBuf, DISPLAY_FREE);" except for making sure that the buffer is actually in hBufTab.

    John A

  • Hi John,

    Yes, your interpretation is correct. The line Buffer_freeUseMask(hDispBuf, DISPLAY_FREE); would have been sufficient.

    Best regards,

    Vincent