
AM62A7: How to write overlay to VIDL Pipeline in DSS

Part Number: AM62A7
Other Parts Discussed in Thread: AM62P

Hi,

I want to stream a video on the VID pipeline and draw overlays (resolution, recording status, and a few other indicators) on top of it.

I don't want the stream to lag because of the overlay computations.
I came across the DSS section in the TRM, which describes an overlay manager and two different pipelines (VID and VIDL):
https://software-dl.ti.com/processor-sdk-linux/esd/AM62AX/10_00_00/exports/docs/linux/Foundational_Components/Kernel/Kernel_Drivers/Display/DSS7.html

How do I stream my main video to the VID pipeline and my overlay to the VIDL pipeline?
Let's say my video pipeline is the one below, and my overlay pipeline is another test source.

gst-launch-1.0 videotestsrc pattern=snow ! video/x-raw,format=UYVY,width=1920,height=1080 ! kmssink driver-name=tidss

How do I stream to the overlay pipeline?

Note:
I am using Processor-SDK-Linux 9.2.0

  • Hi,

    A colleague of mine has tried the following:

    On AM62A, I tried the pipeline below:

    Pipeline:

    gst-launch-1.0 -v videotestsrc ! video/x-raw, width=1920, height=1080, framerate=30/1, format=NV12 ! queue ! gdkpixbufoverlay location=planets.png offset-x=400 offset-y=400 ! queue ! clockoverlay time-format="%D %H:%M:%S" ! queue ! kmssink driver-name=tidss force-modesetting=true sync=false

    Attached is the planets.png referenced in the pipeline.

    Screenshot from the display after running the pipeline:

    Regards,
    Krunal

  • Hi Krunal,

    I understand gdkpixbufoverlay gives us an image overlay. I wanted to make sure it really uses the VIDL pipeline of the display subsystem, rather than GStreamer doing CPU-based rendering over the video and pushing the result into the buffer.

    My end application is somewhat different: it creates frames with a lot of text information using CV and plays them. I am using appsrc for that, and for now I am using kmssink itself, but I can't play any other video on another plane. To replicate this situation, you can try two videotestsrc pipelines with different patterns and treat one as the overlay, for example:
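
    For example, a minimal sketch of the two streams (patterns and caps here are arbitrary placeholders):

    # Terminal 1: the main video
    gst-launch-1.0 videotestsrc pattern=smpte ! video/x-raw,format=UYVY,width=1920,height=1080 ! kmssink driver-name=tidss sync=false

    # Terminal 2: the "overlay" stream
    gst-launch-1.0 videotestsrc pattern=ball ! video/x-raw,format=UYVY,width=1920,height=1080 ! kmssink driver-name=tidss sync=false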

    Also, I wanted to ask: how is the transparency information of a pixel stored?

  • Hi,

    Based on my internal discussion, it does use the CPU. I am not aware of any GStreamer plugins for hardware overlays. Typically, we have the following example: https://software-dl.ti.com/processor-sdk-linux/esd/AM62PX/10_01_10_04/exports/docs/system/Demo_User_Guides/Display_Cluster_User_Guide.html. In this example, the R5 core writes the tell-tales and the A53 performs the 3D rendering. Both contents are sent to the display, and the overlay manager combines the two framebuffers. We don't have such an example using GStreamer.

    Regards,
    Krunal

  • Hi Krunal,

    I don't have an AM62P board to test that display cluster demo. Will it also be applicable to AM62A?

    Also, I want to know how the transparency information of a pixel is handled on AM62A.

  • Yes, the same concept applies since the DSS controller is identical. Could you please elaborate on the transparency request?

    Regards,
    Krunal 

  • Could you please elaborate on the transparency request?

    Basically, the pixel is sent in UYVY or RGB format on the DPI lines, right?
    Neither of these two formats carries any transparency information.

    But before sending the pixel, the overlay manager blends the color values using the transparency information as well.
    You can try this by running edgeai-gui-app in the background and the following GStreamer test command in a terminal:

    gst-launch-1.0 videotestsrc pattern=solid-color foreground-color=0x80FF0000 ! video/x-raw,height=1080,width=1920,format=BGRA,pixel-aspect-ratio=16/9 ! kmssink driver-name=tidss sync=false
    
    where
     0x80FF0000 is in ARGB format.

    This will give you a transparent red overlay with 50% alpha. You can try half the width and play with the alpha value to see the difference, for example:
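
    For example, a variation of the command above with half the width and a lower alpha (the values here are arbitrary):

    # ~25% alpha (0x40) red, covering half the screen width
    gst-launch-1.0 videotestsrc pattern=solid-color foreground-color=0x40FF0000 ! video/x-raw,height=1080,width=960,format=BGRA ! kmssink driver-name=tidss sync=false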

    Now, my question is: how is this transparency information given to the overlay in terms of bytes? Is it a direct 32 bits (A-8, R-8, G-8, B-8) of information for 1 pixel, or is RGB sent as 24 bits with the transparency information fed separately?

  • Hi Krunal,

    FYI, whenever I use the following pipelines, I can stream to a particular plane, but never to both at the same time:

    # FOR PLANE 31
    gst-launch-1.0 videotestsrc pattern=snow ! video/x-raw,format=UYVY,width=1920,height=1080 ! kmssink driver-name=tidss sync=false plane-id=31
    
    # FOR PLANE 41
    gst-launch-1.0 videotestsrc pattern=snow ! video/x-raw,format=UYVY,width=1920,height=1080 ! kmssink driver-name=tidss sync=false plane-id=41

    Now, my question is: how is this transparency information given to the overlay in terms of bytes? Is it a direct 32 bits (A-8, R-8, G-8, B-8) of information for 1 pixel, or is RGB sent as 24 bits with the transparency information fed separately?

    With respect to this question, please look at the input and output formats in the following flow diagram:

    Each pipeline output is connected to an individual overlay manager.

    Quoting the following from the TRM:

    "For ARGB source data with less than/equal to 10-bit component data size the replication logic (ARGB expansion) converts the data to ARGB48 by replicating the MSBs into the LSBs:

    • When scaling is disabled (or no scaler supported), the resulting ARGB48 data is directly provided to the pipeline output;

    • When vertical scaling is engaged, the resulting ARGB48 data is first truncated to ARGB8888, and then converted to ARGB10101010 (by MSBs replication into LSBs), before being fed to the vertical scaler input;

    • When vertical scaling is disabled, but horizontal scaling is engaged, the resulting ARGB48 data is directly provided to the horizontal scaler input;"

    The expansion above is applied according to the input format, as shown here:
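
    As a quick illustration of the MSB-into-LSB replication rule quoted above (plain shell arithmetic, not the actual hardware path):

    # Expand an 8-bit component (e.g. 0xAB) to the 12-bit ARGB48 component width
    # by replicating its MSBs into the LSBs: 0xAB -> 0xABA
    printf '0x%03X\n' $(( (0xAB << 4) | (0xAB >> 4) ))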

    FYI, whenever I use the following pipelines, I can stream to a particular plane, but never to both at the same time

    We don't have an example showing this using GStreamer, but we have a C code example using modetest. You may try something like the following:

    root@am62pxx-evm:~# systemctl stop weston
    .
    .
    .
    root@am62pxx-evm:~# kmsprint
    Connector 0 (40) LVDS-1 (connected)
      Encoder 0 (39) NONE
        Crtc 0 (38) 1920x1200@60.00 150.275 1920/32/52/24/? 1200/24/8/3/? 60 (60.00) 0x0 0x48
          Plane 0 (31) fb-id: 55 (crtcs: 0 1) 0,0 1920x1200 -> 0,0 1920x1200 (AR12 AB12 RA12 RG16 BG16 AR15 AB15 AR24 AB24 RA24 BA24 RG24 BG24 AR30 AB30 XR12 XB12 RX12 XR15 XB15 XR24 XB24 RX24 BX24 XR30 XB30 YUYV UYVY NV12)
            FB 55 1920x1200
    Connector 1 (50) HDMI-A-1 (connected)
      Encoder 1 (49) NONE
        Crtc 1 (48) 1920x1080@59.93 138.500 1920/48/32/80/+ 1080/3/5/23/- 60 (59.93) 0x9 0x48
          Plane 1 (41) fb-id: 56 (crtcs: 0 1) 0,0 1920x1080 -> 0,0 1920x1080 (AR12 AB12 RA12 RG16 BG16 AR15 AB15 AR24 AB24 RA24 BA24 RG24 BG24 AR30 AB30 XR12 XB12 RX12 XR15 XB15 XR24 XB24 RX24 BX24 XR30 XB30 YUYV UYVY NV12)
            FB 56 1920x1080
    root@am62pxx-evm:~# modetest -M tidss -s 50@48:1920x1080 -s 40@38:1920x1200

  • Hi Divyansh,

    The output you have shown uses two different displays, but in my case I have to access both planes with a single display connected (HDMI-A-1).

  • Hi,

    I have achieved streaming to the two different overlay components (VID pipeline and VIDL pipeline) at the same time. Here are the details.

    VID Pipeline

    Whenever streaming happens normally through kmssink, it uses the VID pipeline (in my case: plane 41) [Refer Here]. To stream on this pipeline, you can use the following GStreamer commands.

    ## GST COMMAND
    gst-launch-1.0 videotestsrc pattern=snow ! video/x-raw,format=UYVY,width=1920,height=1080 ! kmssink driver-name=tidss sync=false
    
    ## GST COMMAND WITH SPECIFIC PLANE
    #  though both do the same, it is better to specify the plane when using two pipelines
    gst-launch-1.0 videotestsrc pattern=snow ! video/x-raw,format=UYVY,width=1920,height=1080 ! kmssink driver-name=tidss sync=false plane-id=41
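
    Plane IDs differ from board to board; you can find yours with kmsprint (as shown earlier in this thread) or by listing the CRTCs and planes with modetest:

    # List CRTCs and planes to find the plane IDs on your setup
    modetest -M tidss -p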

    VIDL Pipeline

    The tidss driver creates a virtual framebuffer which is normally used to run GUI applications. This virtual framebuffer is connected to the VIDL pipeline (in my case: plane 31). To stream on this pipeline, you can use the following GStreamer command.

    gst-launch-1.0 videotestsrc pattern=ball ! video/x-raw,format=BGRx,width=1920,height=1080 ! fbdevsink sync=false

    Important Note

    1. Plane Position

    You can configure which plane should overlay the other. Since the framebuffer's sync capabilities are not that good, it is better to use the framebuffer for the GUI.
    You can configure the overlapping of the planes using the zpos property of the plane. [Refer]

    modetest -M tidss -w 31:zpos:1 -w 41:zpos:0
    
    # In tisdk-edgeai-image, there is an init script (/opt/edgeai-gst-apps/init_script.sh) that changes the z-position. Either remove the modetest line there or apply your changes in that script.

    2. Transparency

    The framebuffer currently can't be configured for pixel-wise transparency. To make the GUI transparent, you can use the alpha property of the plane; it will be applied to the whole plane.

    modetest -M tidss -w 31:alpha:6553  
    # The above command sets 10% alpha
    # values range from 0 to 65535
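
    Since the alpha value is a fraction of 65535, other percentages can be derived the same way. For example, 50%:

    # 50% alpha: 65535 * 0.5 ≈ 32767
    modetest -M tidss -w 31:alpha:32767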

  • Hi Divyansh and Krunal,

    Though I achieved what I asked in the original question, I need further support on pixel-wise transparency of the framebuffer.

    Should I continue the discussion in the same thread, or should I open a new one?

    It is better to start a new one, since your query is different from this thread's subject.