
Best way to implement HDMI input

Dear TI experts,

Happy new year!

Could you please help me choose a DaVinci processor that fulfills the requirements below (listed in order of priority):

- A video capture input port able to receive a single video stream from the output of an external HDMI receiver chip (up to 1080p 60fps);

- LCD output interface (e.g. LVDS);

- Latency between signal capture and video playback on the LCD as small as possible (<10ms desired);

- Support for Android ICS is very much needed.

What would be the smallest achievable latency between signal acquisition and display of the video stream? (Only resizing of the HDMI video to the LCD resolution is needed, plus possibly frame rotation or flip; no other processing is required.)

Thanks a lot for any kind of help!

  • Eugeniu,

    The best solution is the DM8148. It can do most of the things you expect. If you use slice mode for capture and display, you will be able to achieve roughly 12-16ms of latency from the capture pipeline to the display. It also supports an LCD interface. LVDS conversion is not available on-chip; you can use an external LVDS transmitter for that.

    If you can describe your use case in detail, the best possible latencies can be calculated easily. 

  • Hi Renjith,

    Thanks so much for the feedback. I had doubts anyone would answer my questions, as I am new to video.

    Basically, at this stage I am trying to design the concept of a portable wireless monitor for a video camera: it would allow the live video preview from the camera to be displayed remotely, with no wires and no delay. This second aspect (zero delay) is of tremendous importance for a videographer or a video assistant, especially when the remote monitor is used for remote focus pulling (with the help of another wireless device, a "follow focus"). As you can imagine, if the video stream is delayed on the remote monitor, your focus adjustments will be performed too late. Experimentally, I can say that 100ms is a very disturbing latency for a wireless focus puller.

    At this moment I have the uncompressed video stream transmitted wirelessly from the camera's HDMI output to the remote side with a latency of 1ms (yes, 1ms). I plan to convert this stream from HDMI TMDS (up to 1080p 60fps) to a parallel format using a simple HDMI receiver. The RGB data then has to enter the DaVinci video input (BT.656/BT.1120 format?), and from that moment on it is important for me to know how much time is needed for the video feed to be displayed on the LCD. 12-16ms sounds fairly acceptable, but it is still worse than the performance of a regular wired HDMI monitor (AFAIK such monitors are built either around dedicated LCD-controller SoCs with a buffer-less architecture or, in the case of professional field monitors, around a dedicated FPGA).

    To sum up, here are the questions that interest me most; it would be great if you could help with them:

    - In order to support 1080p 60fps what should be the RGB format of the video stream entering the DM8148 video input (RGB888 with external syncs / BT656 / BT1120)?

    - In case BT.656 is supported, what is the required minimum bus width (8, 10, 12, 14, 16 bits)? (just to confirm that the HDMI receiver can output this format)

    - In case BT.1120 is supported, what is the required minimum bus width (8, 10, 12, 14, 16 bits)? (to confirm with the HDMI receiver)

    - What would be the resulting clock frequency for the 1080p 60fps capture use-case? (to confirm with the HDMI receiver)

    - Does the capture-to-display latency depend on the captured data format? Which format introduces the minimum latency?

    - Is there any way to lower the delay below 10ms?

    - What would be the DM8148 power consumption for a simple capture-display scenario?

    - Can the DM8148 apply basic on-the-fly processing to a 1080p60 video stream (image scaling, flip, rotate)? Would this degrade latency? By how much?

    - Can the DM8148 apply heavier on-the-fly processing to a 1080p60 video stream (monochrome, filters)? Would this degrade latency? By how much?


    I don't really know whether you will answer so many questions, but at least I tried my luck; this is information I can hardly hope to get on another forum.

    Happy new year!

    Eugeniu.

  • Eugeniu,

    I have tried to answer whatever I could. 

    Eugeniu Rosca said:

    - In order to support 1080p 60fps what should be the RGB format of the video stream entering the DM8148 video input (RGB888 with external syncs / BT656 / BT1120)?

    - In case BT.656 is supported, what is the required minimum bus width (8, 10, 12, 14, 16 bits)? (just to confirm that the HDMI receiver can output this format)

    - In case BT.1120 is supported, what is the required minimum bus width (8, 10, 12, 14, 16 bits)? (to confirm with the HDMI receiver)

    The DM8148 has an HDVPSS module. HDVPSS supports two capture interfaces, VIP0 and VIP1, and it supports display as well. VIP0 supports a data bus of up to 24 bits and can accept RGB888 with external syncs, BT.656, BT.1120 and BT.601. Various bus widths (8-16 bits) are also supported. If your receiver outputs 24-bit RGB, that should be sufficient, I guess.
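
    Below is a minimal configuration sketch, assuming the capture side is exposed through a standard Linux V4L2 capture driver; the device node name and the RGB888 pixel format request are placeholders, since the formats actually exposed depend on the TI capture driver in use.

        /* Hypothetical sketch: requesting a 1080p RGB888 capture format from
         * a V4L2 capture device.  "/dev/video0" and V4L2_PIX_FMT_RGB24 are
         * assumptions; substitute whatever the DM8148 capture driver exposes. */
        #include <fcntl.h>
        #include <string.h>
        #include <unistd.h>
        #include <sys/ioctl.h>
        #include <linux/videodev2.h>

        int open_capture_1080p(const char *dev)      /* e.g. "/dev/video0" */
        {
            int fd = open(dev, O_RDWR);
            if (fd < 0)
                return -1;

            struct v4l2_format fmt;
            memset(&fmt, 0, sizeof(fmt));
            fmt.type                = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            fmt.fmt.pix.width       = 1920;
            fmt.fmt.pix.height      = 1080;
            fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_RGB24;  /* 24-bit RGB888 */
            fmt.fmt.pix.field       = V4L2_FIELD_NONE;     /* progressive   */

            if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) {       /* format refused */
                close(fd);
                return -1;
            }
            return fd;  /* buffers would then be set up with VIDIOC_REQBUFS */
        }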

    Eugeniu Rosca said:

    - What would be the resulting clock frequency for the 1080p 60fps capture use-case? (to confirm with the HDMI receiver)

    The 16-bit input port supports input data rates up to 1920x1200 @ 60 Hz (a 160 MHz pixel clock).
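
    As a back-of-the-envelope check (my own arithmetic, assuming standard SMPTE 1080p60 timing with 2200 x 1125 total samples per frame), the required clock sits just under that limit in 16-bit mode:

        /* Rough pixel clock estimate for 1080p60, standard timing assumed. */
        #include <stdio.h>

        int main(void)
        {
            const double h_total = 2200.0;   /* 1920 active + blanking */
            const double v_total = 1125.0;   /* 1080 active + blanking */
            const double fps     = 60.0;

            double pix_clk = h_total * v_total * fps;          /* Hz */
            printf("pixel clock: %.1f MHz\n", pix_clk / 1e6);  /* ~148.5 MHz */
            /* In a 16-bit (BT.1120-style) interface Y and C travel on separate
             * byte lanes, so the interface clock stays at ~148.5 MHz; an 8-bit
             * multiplexed interface would need roughly twice that. */
            return 0;
        }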

    Eugeniu Rosca said:

    - Does the capture-to-display latency depend on the captured data format? Which format introduces the minimum latency?

    No. If the VIP-supported color formats (RGB / YUV420T / 422P) are used, there won't be any additional latency.

    Eugeniu Rosca said:

    - Is there any way to lower the delay below 10ms?

    If slice-mode capture/display is enabled, then assuming you have 4 slices per frame, capturing each slice will take about 4ms and displaying it will take another 4ms. If everything syncs well, you can achieve less than 10ms.
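
    A rough model of that estimate (my own sketch of the arithmetic above, not measured numbers):

        /* Slice-mode latency estimate for 1080p60 with 4 slices per frame. */
        #include <stdio.h>

        int main(void)
        {
            const double frame_ms = 1000.0 / 60.0;  /* ~16.7 ms per frame */
            const int    slices   = 4;

            double slice_ms = frame_ms / slices;    /* ~4.2 ms per slice  */
            /* Display of a slice can start once that slice is captured, so
             * the pipeline delay is roughly one slice of capture plus one
             * slice of display, not a whole frame of each. */
            printf("capture + display of one slice: ~%.1f ms\n", 2.0 * slice_ms);
            return 0;
        }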

    Eugeniu Rosca said:

    - What would be the DM8148 power consumption for a simple capture-display scenario?

    I don't have the exact data points; I believe this information is available somewhere. Is your device battery powered?

    Eugeniu Rosca said:

    - Can the DM8148 apply basic on-the-fly processing to a 1080p60 video stream (image scaling, flip, rotate)? Would this degrade latency? By how much?

    The VIP0 port has a color space converter, a scaler and a chroma down-sampler unit inside, so you can perform these operations on the fly during capture itself; you can capture and scale the video down to the LCD panel resolution from the start. Rotation and mirroring are not directly supported by HDVPSS; they can be done using a hardware unit called the Tiler, which gives you rotated (90/180/270 degrees) and mirrored views of the memory written by HDVPSS.
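
    For what it's worth, here is a purely conceptual illustration of what a 90-degree rotated "view" of a frame means; the Tiler exposes such views as aliased address ranges in hardware, so no CPU copy like this is performed in practice:

        /* Conceptual only: the coordinate mapping a 90-degree clockwise
         * rotated view performs.  One byte per pixel assumed. */
        #include <stdint.h>

        void rotate90_cw(const uint8_t *src, uint8_t *dst, int w, int h)
        {
            /* src is w x h; dst becomes h x w. */
            for (int y = 0; y < h; y++)
                for (int x = 0; x < w; x++)
                    dst[x * h + (h - 1 - y)] = src[y * w + x];
        }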

    Eugeniu Rosca said:

    - Can the DM8148 apply heavier on-the-fly processing to a 1080p60 video stream (monochrome, filters)? Would this degrade latency? By how much?

    Yes, this will introduce latency. If slice mode is used with 4 slices per frame, processing each slice will take about 4ms.

  • Dear Renjith,

    I really appreciate your effort in providing the answers!

    Yes, my device is intended to be battery powered. The DM814x power consumption document gives about 800mW for the 1080p capture use-case (CPU idle) and about 1.5-2W with CPU load included. That's probably OK, but it could be better.

    Regarding the slices you mention, what is the maximum number of slices a frame can be divided into for video capture?

    I see you are speaking about 4 slices. I understand that in a 1080p60 scenario the duration of a frame is roughly 16ms. Dividing the frame time (16ms) by the number of slices (4), the duration of a slice would be 4ms, which is acceptable to me. However, there could also be 1080p30/24 use-cases; at 30fps the slice duration doubles (8ms). So I am curious whether the number of slices can be increased.
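
    Here is the quick arithmetic behind my question (just my own numbers):

        /* Slice duration as a function of frame rate, with 4 slices per frame. */
        #include <stdio.h>

        int main(void)
        {
            const double rates[] = { 60.0, 30.0, 24.0 };
            const int    slices  = 4;

            for (int i = 0; i < 3; i++) {
                double frame_ms = 1000.0 / rates[i];
                printf("%gfps: frame %.1f ms, slice %.1f ms\n",
                       rates[i], frame_ms, frame_ms / slices);
            }
            /* 60fps -> ~4.2 ms per slice, 30fps -> ~8.3 ms, 24fps -> ~10.4 ms,
             * hence the interest in allowing more slices per frame. */
            return 0;
        }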

    Thank you!

  • Hi,

    Slice-based display is not supported; it is supported only on the capture and mem2mem components. Also, if you use components like the scaler or the chroma up-sampler, you will need to provide extra lines from the top or bottom slices; please refer to the HDVPSS user guide for more details on this.

    If I remember correctly, you can have at most 16 slices in the capture.

    Thanks,

    Brijesh Jadav

  • Brijesh,

    I have a question: is slice-mode support on the display side impossible to achieve, or is it just not currently implemented?

  • Hi,

    For display, it is practically impossible to implement slice mode.

  • Hi, Renjith,

    It is not possible because of hardware and software restrictions. Also, the input comes from the user, so a lot of things need to be taken care of, e.g. what to do if the input gets delayed. This is why it is not implemented for display. Leaving aside slice-based operation, the driver does not even support field-based display for interlaced displays.

    Thanks,

    Brijesh Jadav

  • Guys, thanks a lot for your support!

    Unfortunately, I think I will not be able to get a negligible capture-to-display delay unless I stick to an FPGA solution for my video application. One more possibility is to use a regular LCD-controller SoC that accepts HDMI input and outputs RGB/LVDS for the connected LCD (the regular desktop-monitor solution). However, my application also needs to overlay different image streams to provide a pleasant GUI (a sort of Android feel), which is not possible with a regular monitor LCD controller. Battery life would also be poor in that case (these ICs are not designed for portable use).

    So, I will have to think about it some more. Many thanks for all your answers! The TI community is really a great place for engineering discussion.

    Best regards.