How to measure decoding time?

Hi,

I want to measure the decoding time of each channel in RDK 02.00.00.23. I set a time stamp in the decLink_common.c function DecLink_codecSubmitData() and another in DecLink_codecGetProcessedData(), and I take the difference as the decoding time. Is that right? I know the DEC_LINK_CMD_GET_PROCESSED_DATA message is sent at roughly a 4 ms period, so the result can be off by about 4 ms, but I sometimes see errors larger than that. Can anyone suggest a better way to measure decoding time?
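
For reference, here is roughly the instrumentation I mean, as a minimal sketch. Utils_getCurTimeInMsec() is assumed to be available from the RDK utils as the millisecond clock; the per-channel array is my own illustrative storage, not an existing field, and DEC_LINK_MAX_CH and chId are taken from the surrounding link code:

    /* decLink_common.c -- illustrative per-channel bookkeeping */
    static UInt32 gDecSubmitTimeMs[DEC_LINK_MAX_CH];

    /* In DecLink_codecSubmitData(), when channel chId's frame is queued: */
    gDecSubmitTimeMs[chId] = Utils_getCurTimeInMsec();

    /* In DecLink_codecGetProcessedData(), when that frame comes back: */
    {
        UInt32 decodeTimeMs;

        decodeTimeMs = Utils_getCurTimeInMsec() - gDecSubmitTimeMs[chId];
        Vps_printf("CH%d: submit-to-done %d ms\n", chId, decodeTimeMs);
    }

The measured value includes queueing delay plus the 4 ms polling granularity, which is why I suspect it over-estimates the pure decode time.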

Thanks!

Jacson

  • Do you want to measure the IVA_HD processing time for a channel, or the M3 portion plus the IVA_HD portion?

    RDK supports processN APIs, which decode multiple channels in a single process call. So, to measure the decoding time of a single channel, you first have to disable the processN APIs in system_debug.h (see the sketch below).
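
    The exact flag name in system_debug.h differs between RDK versions, so the macro below is only a placeholder, not the real define; check your tree for the multi-channel process switch:

        /* system_debug.h -- DEC_LINK_ENABLE_PROCESS_N is a placeholder name.
         * With multi-channel processN disabled, each process() call decodes
         * exactly one channel, so per-call timing is per-channel timing. */
        /* #define DEC_LINK_ENABLE_PROCESS_N */   /* comment out to disable */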

    To measure the pure IVA_HD processing time, take time stamps inside HDVICP_wait, before and after the Semaphore_pend call. This function is in the file hdvicp_fw_config.c.
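
    As a sketch of what that instrumentation could look like, assuming the SYS/BIOS xdc.runtime.Timestamp module is available on the M3 (variable names are illustrative):

        #include <xdc/std.h>
        #include <xdc/runtime/Timestamp.h>
        #include <xdc/runtime/Types.h>

        /* Convert timestamp ticks to microseconds */
        static UInt32 ticksToUs(UInt32 ticks)
        {
            Types_FreqHz freq;

            Timestamp_getFreq(&freq);   /* tick rate in Hz (freq.lo) */
            return (UInt32)(((UInt64)ticks * 1000000u) / freq.lo);
        }

        /* Inside HDVICP_wait() in hdvicp_fw_config.c:
         *
         *   UInt32 t0 = Timestamp_get32();
         *   Semaphore_pend(sem, BIOS_WAIT_FOREVER);   // existing wait
         *   UInt32 t1 = Timestamp_get32();
         *   Vps_printf("IVA_HD busy: %d us\n", ticksToUs(t1 - t0));
         */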

    To measure the M3 + IVA_HD processing time, take time stamps before and after the process() call in the decLink_h264.c file.
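
    And for the M3 + IVA_HD total, a sketch along these lines; the exact process() invocation in decLink_h264.c depends on the RDK version, and Utils_getCurTimeInMsec() is again assumed as the clock (Timestamp_get32() gives finer resolution if needed):

        UInt32 t0, t1;

        t0 = Utils_getCurTimeInMsec();
        /* existing IVIDDEC3 process() call in decLink_h264.c, e.g.
         * status = ...->process(handle, inBufs, outBufs, inArgs, outArgs); */
        t1 = Utils_getCurTimeInMsec();

        /* t1 - t0 covers the M3 driver overhead plus the IVA_HD decode,
         * but not the queueing delay before submission */
        Vps_printf("decode (M3 + IVA_HD): %d ms\n", t1 - t0);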

  • Hi Badri Narayanan,

    Thanks for your response, I will try it later.

    Jacson