IVIDDEC2 buffer management

hi all,

In the IVIDDEC2 interface, the output buffers and the decoded picture buffer (DPB) are merged, and buffer management is handed over to the application.

Does the application need to allocate "max_number_of_reference_frames" buffers (the value the codec returns based on the level), or should it be "max_number_of_reference_frames + max_display_delay"?

In IVIDDEC2 the codec can output multiple frames per process call. Is it because of this that the number of buffers the application needs to allocate comes down to "max_number_of_reference_frames"?

And then how does the codec know how many buffers it can output during that process call?

Thanks

Aravind

  • Aravind,

    In IVIDDEC2, the reference buffers and display buffers are merged mainly to avoid extra frame copies and to save on DDR footprint.

    The number of buffers to be allocated by the application must be "max_number_of_reference_frames + 1", where max_number_of_reference_frames is returned by the codec after the first process call (when the decoder learns the resolution, level, etc.), and the +1 buffer is for the current frame being decoded.
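
    As a rough illustration of that rule, the sketch below allocates the frame pool once the count is known. The query step, the maxNumRefFrames name, and the malloc-based allocation are placeholders; a real application would typically obtain the count via the codec's status/control path and use contiguous memory (e.g. CMEM) instead of malloc.

        /* Hypothetical sketch: allocate (maxNumRefFrames + 1) frame buffers
         * after the first process call has reported the frame size and the
         * reference-frame count.  Plain malloc stands in for whatever
         * contiguous-memory allocator the platform actually requires. */
        #include <stdlib.h>

        typedef struct {
            void   *data;
            size_t  size;
        } FrameBuf;

        static FrameBuf *allocFramePool(int maxNumRefFrames, size_t frameSize,
                                        int *numBufs)
        {
            int n = maxNumRefFrames + 1;           /* +1 for the current frame */
            FrameBuf *pool = calloc(n, sizeof(FrameBuf));
            if (pool == NULL)
                return NULL;

            for (int i = 0; i < n; i++) {
                pool[i].data = malloc(frameSize);
                pool[i].size = frameSize;
                if (pool[i].data == NULL) {
                    while (i-- > 0)                /* roll back on failure */
                        free(pool[i].data);
                    free(pool);
                    return NULL;
                }
            }
            *numBufs = n;
            return pool;
        }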

    The buffer allocation is not really governed by max_display_delay. For a given content, we can set max_display_delay to 1 and get a frame displayed every process call, or set max_display_delay to 5 and get frames displayed one by one from the sixth process call onward. When the decoder encounters an IDR frame or a DPB flush command, and more than one reference frame is locked in the DPB, the codec outputs multiple frames in a single process call and resets the DPB state.
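
    To make the multi-frame output concrete, here is a minimal sketch of how an application might drain the decoder's output after one process call, assuming the zero-terminated outputID[]/freeBufID[] arrays of IVIDDEC2_OutArgs; the display and release helpers are hypothetical application code, and error handling is omitted.

        #include <ti/xdais/dm/ividdec2.h>

        /* Hypothetical application helpers, not part of IVIDDEC2. */
        extern void appDisplayFrame(XDAS_Int32 bufId, IVIDEO1_BufDesc *buf);
        extern void appReleaseFrame(XDAS_Int32 bufId);

        static void handleOutArgs(IVIDDEC2_OutArgs *outArgs)
        {
            /* On an IDR or DPB flush, several displayable frames may be
             * reported in one process call; normally it is zero or one. */
            for (int i = 0; outArgs->outputID[i] != 0; i++) {
                appDisplayFrame(outArgs->outputID[i], &outArgs->displayBufs[i]);
            }

            /* Buffers the codec no longer references go back to the pool. */
            for (int i = 0; outArgs->freeBufID[i] != 0; i++) {
                appReleaseFrame(outArgs->freeBufID[i]);
            }
        }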

    Shyam