Video display delay in DM365 decode

h264dec version: H264DEC.version.01.00.00.08.wizardversion.0.5.3
dvsdk version: dvsdk_2_10_00_17

The main parameters in the program:

VIDDEC2_Params  *params:
params->maxFrameRate = 25000;
params->maxWidth = 720;
params->maxHeight = 576;
params->maxBitRate = 1048576;
params->forceChromaFormat = XDM_YUV_420SP;
params->dataEndianness = XDM_BYTE;
params->size = sizeof(IH264VDEC_Params);

VIDDEC2_DynamicParams  *dynParams:
dynParams->size = sizeof(VIDDEC2_DynamicParams);
dynParams->decodeHeader = XDM_DECODE_AU;
dynParams->displayWidth = 0;
dynParams->frameSkipMode = IVIDEO_NO_SKIP;
dynParams->frameOrder = IVIDDEC2_DECODE_ORDER;
dynParams->newFrameFlag = 0;
dynParams->mbDataFlag = 0;

IH264VDEC_Params extnParams:
extnParams.displayDelay = 0;
extnParams.hdvicpHandle = (void*) NULL;
extnParams.resetHDVICPeveryFrame = 0;
extnParams.disableHDVICPeveryFrame = 1;
extnParams.viddecParams = *params;

BufferGfx_Attrs gfxAttrs = BufferGfx_Attrs_DEFAULT;
gfxAttrs.bAttrs.useMask = CODEC_FREE | DISPLAY_FREE;
gfxAttrs.colorSpace = ColorSpace_YUV420PSEMI;
gfxAttrs.dim.width = params->maxWidth;
gfxAttrs.dim.height = params->maxHeight;
gfxAttrs.dim.lineLength = BufferGfx_calcLineLength(gfxAttrs.dim.width, gfxAttrs.colorSpace);
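
For context, these pieces are wired together the same way as in the decode demo, roughly like this (a simplified sketch rather than my full code; hEngine, the codec name string and NUM_DISPLAY_BUFS follow the demo):

/* Create the decoder with the extended H.264 params. The base viddecParams are
 * the first member of IH264VDEC_Params, so the extended struct is passed cast
 * to the base type (params->size above already gives the extended size). */
hVd2 = Vdec2_create(hEngine, "h264dec", (VIDDEC2_Params *) &extnParams, dynParams);
if (hVd2 == NULL) {
    ERR("Failed to create video decoder\n");
    cleanup(THREAD_FAILURE);
}

/* Allocate the shared decode/display buffers described by gfxAttrs above */
hBufTab = BufTab_create(NUM_DISPLAY_BUFS, Vdec2_getOutBufSize(hVd2),
                        BufferGfx_getBufferAttrs(&gfxAttrs));
if (hBufTab == NULL) {
    ERR("Failed to create BufTab for decoder\n");
    cleanup(THREAD_FAILURE);
}
Vdec2_setBufTab(hVd2, hBufTab);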

In the program I added a time print around the decode call, like below:

gettimeofday(&uTimeStart, NULL);
ret = Vdec2_process(hVd2, hInBuf, hDstBuf);
if (ret < 0) {
    ERR("Failed to decode video buffer\n");
    cleanup(THREAD_FAILURE);
}
gettimeofday(&uTimeStop, NULL);
timeDelta = (unsigned long)((uTimeStop.tv_sec-uTimeStart.tv_sec)*1000000+(uTimeStop.tv_usec-uTimeStart.tv_usec));
fprintf(stderr,"videoThread decode time: %ld ms,%ld us\n",timeDelta/1000, timeDelta);

The time print results are:
videoThread decode time: 11 ms,11938 us
videoThread decode time: 12 ms,12230 us
videoThread decode time: 11 ms,11805 us
videoThread decode time: 12 ms,12171 us
....
videoThread decode time: 13 ms,13790 us
videoThread decode time: 15 ms,15033 us
videoThread decode time: 13 ms,13749 us
videoThread decode time: 14 ms,14163 us
videoThread decode time: 13 ms,13908 us
...
videoThread decode time: 11 ms,11765 us
videoThread decode time: 12 ms,12190 us
videoThread decode time: 15 ms,15088 us
videoThread decode time: 11 ms,11741 us
videoThread decode time: 11 ms,11934 us
videoThread decode time: 12 ms,12241 us
...

I want to decode the video streams that I receive from an RTSP thread and display them on an LCD TV/monitor. (The program's decode flow is the same as the demo's.)
My problem is:
The program runs correctly in the beginning, with no display delay, but after several hours the display lags by 1~3 seconds (as seen on the monitor).
I really don't know how to debug it. As the time prints show, is there something abnormal with "Vdec2_process"?
Please help me; any advice will be appreciated.

Best regards,

Katee
 

  • Hi:

    We have also met the same problem of video display delay using the DM368. In my application, I receive an H.264 stream using VLC on a PC, and the delay exists right from the beginning.

    Does anybody have a good idea?

    Thanks a lot.

     

  • Katee,
    I have decreased display delay by decreasing the DISPLAY_PIPE size define in video.c, from the default 18 down to 4 (see the sketch below).
    I'm not sure if this can somehow help your problem. Since the problem occurs only after some time, I think the delay might be related to the RTSP receiving code.
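
    For reference, the change is just the pipe-depth define near the top of video.c (a rough sketch; the macro name is from my copy of the demo and may differ in your DVSDK):

    /* video.c: how many decoded frames may be queued ahead of the display.
     * A smaller pipe means fewer frames of latency between decode and display. */
    #define DISPLAY_PIPE_SIZE  4   /* default was 18 */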

    Gomo,
    You can decrease the delay in VLC by changing Tools->Preferences->Input/Codecs->Demuxers->RTP/RTSP->Caching value
    to about 300 ms, or keep decreasing that value until it works properly.

    Regards,
    Marko

  • Marko,

    Thank you so much for your help! I will try your advice later.

    But I am still confused about my parameter settings. As I showed above, are they correct? Especially dynParams->frameOrder = IVIDDEC2_DECODE_ORDER:
    the default value is IVIDDEC2_DISPLAY_ORDER, and since I need to decode the video from RTSP, which one should I set?

    Regards,

    Katee

  • Recently, I found something abnormal while the program was running.

    I added a time print like below:

    video.c:

    /* Main loop */
    while (!gblGetQuit()) {

        if (ret != Dmai_EFIRSTFIELD) {

            gettimeofday(&uTimeStart, NULL);

            /* Get a displayed frame from the display thread */
            fifoRet = Fifo_get(envp->hDisplayOutFifo, &hDispBuf);

            gettimeofday(&uTimeStop, NULL);
            timeDelta = (unsigned long)((uTimeStop.tv_sec - uTimeStart.tv_sec) * 1000000 +
                                        (uTimeStop.tv_usec - uTimeStart.tv_usec));
            fprintf(stderr, "Fifo_get time: %ld ms,%ld us\n", timeDelta / 1000, timeDelta);

            if (fifoRet < 0) {
                cleanup(THREAD_FAILURE);
            }

            /* Did the display thread flush the fifo? */
            if (fifoRet == Dmai_EFLUSH) {
                cleanup(THREAD_SUCCESS);
            }

            /* The display thread is no longer using the buffer */
            hDstBuf = BufTab_getBuf(hBufTab, Buffer_getId(hDispBuf));
            Buffer_freeUseMask(hDstBuf, DISPLAY_FREE);

            /* Keep track of the number of buffers sent to the display thread */
            numDisplayBufs--;

            /* Get a free buffer from the BufTab to give to the codec */
            hDstBuf = BufTab_getFreeBuf(hBufTab);

            if (hDstBuf == NULL) {
                ERR("Failed to get free buffer from BufTab\n");
                BufTab_print(hBufTab);
                cleanup(THREAD_FAILURE);
            }
        }

    display.c:

     

    while (!gblGetQuit()) {
        /* Pause processing? */
        Pause_test(envp->hPauseProcess);

        /* Pause for priming? */
        Pause_test(envp->hPausePrime);

        /* Get decoded video frame */
        fifoRet = Fifo_get(envp->hInFifo, &hSrcBuf);

        if (fifoRet < 0) {
            ERR("Failed to get buffer from video thread\n");
            cleanup(THREAD_FAILURE);
        }

        /* Did the video thread flush the fifo? */
        if (fifoRet == Dmai_EFLUSH) {
            cleanup(THREAD_SUCCESS);
        }

        BufferGfx_getDimensions(hSrcBuf, &srcDim);

        gettimeofday(&uTimeStart, NULL);

        /* Get a buffer from the display device driver */
        if (Display_get(hDisplay, &hDstBuf) < 0) {
            ERR("Failed to get display buffer\n");
            cleanup(THREAD_FAILURE);
        }

        gettimeofday(&uTimeStop, NULL);
        timeDelta = (unsigned long)((uTimeStop.tv_sec - uTimeStart.tv_sec) * 1000000 +
                                    (uTimeStop.tv_usec - uTimeStart.tv_usec));
        fprintf(stderr, "Display_get time: %ld ms,%ld us\n", timeDelta / 1000, timeDelta);

     

    I decode video received over RTSP, captured by a camera. When I run the program, in the beginning the display is normal, with no delay, and the time prints are:

    Fifo_get time: 0 ms
    Display_get time: 0 ms,57 us
    Fifo_get time: 0 ms
    Display_get time: 0 ms,60 us
    Fifo_get time: 0 ms
    Display_get time: 0 ms,60 us
    Fifo_get time: 0 ms
    Display_get time: 0 ms,58 us
    Fifo_get time: 0 ms
    Display_get time: 0 ms,57 us
    Fifo_get time: 0 ms

    When I point the camera at a white/bright or black/dark scene, the prints become abnormal:

     

    Display_get time: 0 ms,590 us
    Fifo_get time: 0 ms
    Display_get time: 0 ms,271 us
    Fifo_get time: 0 ms
    Display_get time: 3 ms,3036 us
    Fifo_get time: 0 ms
    Display_get time: 4 ms,4663 us
    Fifo_get time: 0 ms
    Display_get time: 5 ms,5209 us
    .....
    Display_get time: 31 ms,31360 us
    Fifo_get time: 22 ms
    Display_get time: 31 ms,31091 us
    Fifo_get time: 22 ms
    Display_get time: 31 ms,31296 us
    Fifo_get time: 22 ms
    Display_get time: 31 ms,31177 us
    Fifo_get time: 22 ms

     

    And then the display lags seriously (by 1~4 seconds or even more). But when I point the camera back at a normal scene, the time prints return to 0 ms and the display shows no delay.

     

    Can anybody tell me what is happening?

     

    Regards,

    Katee

  • Anybody who can help me? Please, any advice will be appreciated!

  • Hi,

    This points to the fact that your input camera is using a longer exposure time in some conditions. I assume you are not doing any camera controls, like exposure time, through your DM36x software.

    One way to confirm it is to probe the VD signals going into the DM36x. If they show a longer VD interval during bright scenes, then the input device itself is doing longer exposures.

    Regards,

    Anshuman

    PS: Please mark this post as verified, if you think it has answered your questions. Thanks.

  • Hi, thank you for your help.

    But I am still a little confused about your answer. Did you mean that if my input camera uses a longer exposure time in some conditions, then I can do nothing about my display delay problem (by changing the decode code) except change the camera code?

    Regards,

    Katee

  • Why does "Display_get(hDisplay, &hDstBuf)" cost more than 30 ms?

    Inside this function, the code is just like below:

     

    Int Display_v4l2_get(Display_Handle hDisplay, Buffer_Handle *hBufPtr)
    {
        struct v4l2_buffer v4l2buf;

        assert(hDisplay);
        assert(hBufPtr);

        Dmai_clear(v4l2buf);
        v4l2buf.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
        v4l2buf.memory = hDisplay->userAlloc ? V4L2_MEMORY_USERPTR :
                                               V4L2_MEMORY_MMAP;

        /* Get a frame buffer with captured data */
        if (ioctl(hDisplay->fd, VIDIOC_DQBUF, &v4l2buf) < 0) {
            Dmai_err1("VIDIOC_DQBUF failed (%s)\n", strerror(errno));
            return Dmai_EFAIL;
        }

        *hBufPtr = hDisplay->bufDescs[v4l2buf.index].hBuf;
        hDisplay->bufDescs[v4l2buf.index].used = TRUE;

        return Dmai_EOK;
    }

     

    When the display lags, the "Display_get time" keeps changing between 0 ms and 37 ms (or more).

    Can anybody tell me under what conditions the "Display_get time" can increase?
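
    For reference, one way to check whether the wait really happens inside the driver is to poll() the display fd before calling Display_get: on a V4L2 output device, VIDIOC_DQBUF blocks until the hardware has finished scanning out one of the queued buffers, so a 30+ ms wait would mean the driver is pacing us to the display refresh. A rough sketch (assuming the davinci display driver supports poll(); the helper name is mine, and hDisplay->fd is the descriptor used in the DMAI code above):

    #include <poll.h>
    #include <stdio.h>

    /* Probe sketch: wait up to timeoutMs for the V4L2 output driver to release
     * a displayed buffer, i.e. for VIDIOC_DQBUF (Display_get) to stop blocking.
     * Returns >0 when Display_get would return immediately, 0 on timeout
     * (the driver is still holding every queued buffer), <0 on error. */
    static int waitForDisplayBuffer(int fd, int timeoutMs)
    {
        struct pollfd pfd;
        int ret;

        pfd.fd = fd;
        pfd.events = POLLOUT;

        ret = poll(&pfd, 1, timeoutMs);
        if (ret == 0) {
            fprintf(stderr, "display driver held all buffers for %d ms\n", timeoutMs);
        }
        return ret;
    }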

    And I find that if I reduce the frame rate of the camera video (which I need to decode) from 30 fps to 25 fps for 640x480 and 720x480, and from 25 fps to 20 fps for 720x576, then the display has no delay and "Display_get time" stays at 0 ms all the time. But that is not what I want.

    Does it mean that the decode capability is not good enough to handle high-frame-rate video? How can I improve it?

    Is it the decoder's fault, the display's fault, or both?

    I really need your help, please.

     

    Regards,

    Katee