Hi all,
In the IVIDDEC2 interface, the output buffer and the decoded picture buffer (DPB) are merged, and buffer management is handed over to the application.
Does the application need to allocate "max_number_of_reference_frames" buffers (the count the codec returns based on the stream's level), or should it be "max_number_of_reference_frames + max_display_delay"?
In IVIDDEC2 the codec can output multiple frames per process call. Is it because of this that the number of buffers the application needs to allocate comes down to "max_number_of_reference_frames"?
If so, how does the codec know how many buffers it can output during that process call?
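For reference, here is roughly how I picture the application-side loop, with the pool sized as "max_number_of_reference_frames + max_display_delay". This is only a sketch of my current understanding: the app_* helpers and both pool-size constants are mine, not from the spec, and I am assuming the zero-terminated outputID[]/freeBufID[] arrays in IVIDDEC2_OutArgs work the way the comments describe. Please correct anything that is wrong:

#include <xdc/std.h>
#include <ti/xdais/dm/ividdec2.h>

/* The app_* names and both constants below are mine, not from the spec. */
#define APP_NUM_REF_FRAMES     5   /* max_number_of_reference_frames (from level) */
#define APP_MAX_DISPLAY_DELAY  2   /* worst-case reordering delay (assumed)       */
#define APP_POOL_SIZE (APP_NUM_REF_FRAMES + APP_MAX_DISPLAY_DELAY)

/* Hypothetical app-side frame pool and display sink. */
extern XDM_BufDesc *app_acquire_frame(XDAS_Int32 *id); /* get a free frame, id > 0 */
extern void app_release_frame(XDAS_Int32 id);          /* return frame to the pool */
extern void app_display_frame(IVIDEO1_BufDesc *buf);   /* send frame to display    */

XDAS_Int32 app_decode_one_au(IVIDDEC2_Handle h, XDM1_BufDesc *inBufs,
                             IVIDDEC2_InArgs *inArgs, IVIDDEC2_OutArgs *outArgs)
{
    XDAS_Int32 i, id, status;
    XDM_BufDesc *outBufs = app_acquire_frame(&id);

    /* Tag this buffer set so the codec can hand the same ID back later. */
    inArgs->inputID = id;

    status = h->fxns->process(h, inBufs, outBufs, inArgs, outArgs);

    /* The codec may emit several display frames in one call;
     * outputID[] is zero-terminated as I understand it. */
    for (i = 0; outArgs->outputID[i] != 0; i++) {
        app_display_frame(&outArgs->displayBufs[i]);
    }

    /* freeBufID[] (also zero-terminated, as I understand it) names the
     * buffers the codec no longer references; only these may be
     * recycled into the pool. */
    for (i = 0; outArgs->freeBufID[i] != 0; i++) {
        app_release_frame(outArgs->freeBufID[i]);
    }

    return status;
}

If freeBufID[] is indeed how the codec hands buffers back to the application, that would answer my third question, but I am still unsure about the total pool size.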
Thanks
Aravind