Big delay difference between 720p and 576i with H.264 codec

Hello everybody,

When measuring the delay between "video in" on one EVMDM365 board running an H.264 encoder app and "video out" on another EVMDM365 board running an H.264 decoder app, I get very different delay times depending on the video settings. With 720p the delay between video in and video out is about 150 ms, but with 576i it is 500 ms and more!

The structure of the apps is like the demo apps in DVSDK 4.02, except that instead of writing the encoded data to a file I transmit it to the decoder board. I use the same codec settings (e.g. video bitrate), so the amount of data should be the same. I also use the same content type (IVIDEO_PROGRESSIVE) for both 720p and 576i. Would it be better to change the content type for 576i to IVIDEO_INTERLACED? I will test it, but the apps need to be updated to the other content type first. My codec versions are 2.30.00.01 for the H.264 encoder and 2.00.00.13 for the H.264 decoder.

Thank you for your answers or suggestions!
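For reference, a minimal sketch of what the change I want to test could look like, assuming the encoder app uses the DMAI Venc1 wrapper like the DVSDK demos; the codec name "h264enc", the engine handle, and the use of the DMAI default params are placeholders from the demo code, not the actual app:

```c
#include <ti/sdo/ce/Engine.h>
#include <ti/sdo/dmai/Dmai.h>
#include <ti/sdo/dmai/ce/Venc1.h>

/* Sketch: create the H.264 encoder with interlaced input content type
 * for 576i. Field names follow VIDENC1 (XDM 1.x); codec name and engine
 * handle come from the existing encoder app. */
Venc1_Handle createInterlacedEncoder(Engine_Handle hEngine)
{
    VIDENC1_Params        params    = Venc1_Params_DEFAULT;
    VIDENC1_DynamicParams dynParams = Venc1_DynamicParams_DEFAULT;

    params.maxWidth         = 720;
    params.maxHeight        = 576;
    params.maxFrameRate     = 25000;              /* 25 fps * 1000, per XDM convention */
    params.inputContentType = IVIDEO_INTERLACED;  /* instead of IVIDEO_PROGRESSIVE */

    dynParams.inputWidth    = 720;
    dynParams.inputHeight   = 576;
    dynParams.targetBitRate = params.maxBitRate;  /* keep the same bitrate as before */

    return Venc1_create(hEngine, "h264enc", &params, &dynParams);
}
```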

Regards,

Matthias

  • Is it even possible to use interlaced video input? I use DMAI for video capture. In the capture module I can set the video standard to VideoStd_D1_PAL, but I think the capture driver creates a video buffer with the complete frame (progressive) at a frame rate of 25 Hz. With the interlaced content type I would need buffers from the capture driver that hold either the first or the second field, at a field rate of 50 Hz. Should I use VideoStd_576P_50 instead? But that would be a progressive scan at 50 Hz.
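For context, a minimal sketch of the DMAI capture setup described above; the DM365 default attributes, the composite input and the buffer table handle are assumptions taken from the DVSDK demo code rather than from the actual app:

```c
#include <stdio.h>
#include <ti/sdo/dmai/Dmai.h>
#include <ti/sdo/dmai/VideoStd.h>
#include <ti/sdo/dmai/BufTab.h>
#include <ti/sdo/dmai/Capture.h>

/* Sketch of the capture configuration in question. With VideoStd_D1_PAL
 * the driver (as far as I can tell) returns one buffer per complete frame,
 * i.e. both fields weaved together, every 40 ms. */
Capture_Handle createPalCapture(BufTab_Handle hBufTab)
{
    Capture_Attrs cAttrs = Capture_Attrs_DM365_DEFAULT;
    Int32 width, height;

    cAttrs.videoStd   = VideoStd_D1_PAL;         /* 720x576 at 25 Hz frame rate */
    cAttrs.videoInput = Capture_Input_COMPOSITE; /* assumption: composite input */

    VideoStd_getResolution(cAttrs.videoStd, &width, &height);
    printf("capture buffer size: %ldx%ld\n", (long) width, (long) height);

    return Capture_create(hBufTab, &cAttrs);
}
```

As noted above, VideoStd_576P_50 would describe a progressive scan at 50 Hz, so it would only help if the capture source actually delivers progressive frames.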

    Regards,

    Matthias