PROCESSOR-SDK-AM68A: Edgeai-gst-apps: remote display not working

Part Number: PROCESSOR-SDK-AM68A
Other Parts Discussed in Thread: SK-AM68

Hi,

I am working on the remote display demo from edgeai-gst-apps in PSDKL (v8.6). I am following the guide linked below to run the demo.

The remote_display.yaml file configuration is:

root@j721s2-evm:/opt/edgeai-gst-apps# cat ./configs/remote_display.yaml
title: "Remote Display"
# If output is set to display, it runs the pipeline with udpsink as the output
# To view the output on web browser, run the streamlit server using
# root@soc:/opt/edgeai-gst-apps> streamlit run scripts/udp_vis.py -- --port *port_number* [Default is 8081]
# This will start streamlit webserver and generate a link which you can open in browser
log_level: 2
inputs:
    input0:
        source: /dev/video2
        format: h264
        width: 640
        height: 480
        framerate: 30
    input1:
        source: /opt/edgeai-test-data/videos/video_0000_h264.h264
        format: h264
        width: 1280
        height: 720
        framerate: 30
        loop: True
    input2:
        source: /opt/edgeai-test-data/images/%04d.jpg
        width: 640
        height: 480
        index: 0
        framerate: 1
        loop: True
models:
    model0:
        model_path: /opt/model_zoo/ONR-SS-8610-deeplabv3lite-mobv2-ade20k32-512x512
        alpha: 0.4
    model1:
        model_path: /opt/model_zoo/TFL-OD-2010-ssd-mobV2-coco-mlperf-300x300
        viz_threshold: 0.6
    model2:
        model_path: /opt/model_zoo/TFL-CL-0000-mobileNetV1-mlperf
        topN: 5

outputs:
    output0:
        sink: remote
        width: 640
        height: 480
        port: 8081
        host: 0.0.0.0
flows:
    flow0: [input0,model1,output0]

With Gst:

To view the stream on the remote PC with GStreamer, I used this command as mentioned in the guide https://software-dl.ti.com/jacinto7/esd/processor-sdk-linux-edgeai/AM68A/08_06_01/exports/docs/common/configuration_file.html

sudo gst-launch-1.0 udpsrc port=8081 ! application/x-rtp,encoding=H264 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink

It produces the following output, and I can see nothing on the remote PC:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
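
Before digging further, a quick sanity check (a generic networking step, not from the TI guide) is to confirm on the PC that UDP packets for the stream are arriving at all:

# On the remote PC: print any UDP packets arriving on the stream port
sudo tcpdump -i any -n udp port 8081

If nothing shows up here, the problem is the destination address or the network path rather than the receiving GStreamer pipeline.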

With Streamlit:

Upon running the streamlit server and opening the web browser with the specified port number (8081), I get the following logs:

root@j721s2-evm:/opt/edgeai-gst-apps# streamlit run scripts/udp_vis.py -- --port 8081         

(streamlit:1765): Gdk-CRITICAL **: 08:47:53.424: gdk_cursor_new_for_display: assertion 'GDK_IS_DISPLAY (display)' failed

Collecting usage statistics. To deactivate, set browser.gatherUsageStats to False.


  You can now view your Streamlit app in your browser.

  Network URL: http://XX.XX.XX.XX:8501
  External URL: http://XX.XX.XX.XX:8501

Listening to port 8081 for jpeg frames
Starting GST Pipeline...
Listening to port 8081 for jpeg frames
Starting GST Pipeline...

I see nothing in the web browser, while the remote display demo keeps running in the background and printing logs.

Is there any step that I missed?

Thanks,

Ahmed

  • Hi Ahmed,

    What camera sensor are you using for this? The issue may be with the input format being set to h264 for your camera source. Based on the type of camera you're using, you will need to configure your input accordingly. See the section for camera sources for more details: https://software-dl.ti.com/jacinto7/esd/processor-sdk-linux-edgeai/AM68A/08_06_01/exports/docs/common/configuration_file.html#camera-sources-v4l2
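
    For example, a typical USB camera that outputs MJPEG would be configured along these lines (a sketch following the config file format above; the device node and resolution depend on your camera):

    inputs:
        input0:
            source: /dev/video2
            format: jpeg
            width: 1280
            height: 720
            framerate: 30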

    Thank you,

    Fabiana

  • Hi Fabiana,

    What camera sensor are you using for this?

    I am using a USB camera. I configured the format to "jpeg" in the "remote_display.yaml" file, but it still does not work.

    The input and output GStreamer pipelines after setting the above-mentioned configuration are:

    root@j721s2-evm:/opt/edgeai-gst-apps# ./apps_python/app_edgeai.py ./configs/remote_display.yaml 
    
     Number of subgraphs:1 , 34 nodes delegated out of 34 nodes 
     
    APP: Init ... !!!
    MEM: Init ... !!!
    MEM: Initialized DMA HEAP (fd=5) !!!
    MEM: Init ... Done !!!
    IPC: Init ... !!!
    IPC: Init ... Done !!!
    REMOTE_SERVICE: Init ... !!!
    REMOTE_SERVICE: Init ... Done !!!
      1296.597003 s: GTC Frequency = 200 MHz
    APP: Init ... Done !!!
      1296.597309 s:  VX_ZONE_INIT:Enabled
      1296.597414 s:  VX_ZONE_ERROR:Enabled
      1296.597492 s:  VX_ZONE_WARNING:Enabled
      1296.598329 s:  VX_ZONE_INIT:[tivxInitLocal:130] Initialization Done !!!
      1296.599605 s:  VX_ZONE_INIT:[tivxHostInitLocal:93] Initialization Done for HOST !!!
    ==========[INPUT PIPELINE(S)]==========
    
    [PIPE-0]
    
    v4l2src device=/dev/video2 brightness=128 contrast=128 saturation=128 ! capsfilter caps="image/jpeg, width=(int)640, height=(int)480;" ! jpegdec ! tiovxdlcolorconvert ! capsfilter caps="video/x-raw, format=(string)NV12;" ! tiovxmultiscaler name=split_01
    split_01. ! queue ! capsfilter caps="video/x-raw, width=(int)640, height=(int)480;" ! tiovxdlcolorconvert out-pool-size=4 ! capsfilter caps="video/x-raw, format=(string)RGB;" ! appsink max-buffers=2 drop=True name=sen_0
    split_01. ! queue ! capsfilter caps="video/x-raw, width=(int)340, height=(int)256;" ! tiovxdlcolorconvert out-pool-size=4 ! capsfilter caps="video/x-raw, format=(string)RGB;" ! videobox qos=True left=58 right=58 top=16 bottom=16 ! tiovxdlpreproc out-pool-size=4 channel-order=1 data-type=3 ! capsfilter caps="application/x-tensor-tiovx;" ! appsink max-buffers=2 drop=True name=pre_0
    
    
    ==========[OUTPUT PIPELINE]==========
    
    appsrc do-timestamp=True format=3 block=True name=post_0 ! tiovxdlcolorconvert ! capsfilter caps="video/x-raw, format=(string)NV12, width=(int)640, height=(int)480;" ! v4l2h264enc bitrate=10000000 gop-size=30 ! h264parse ! rtph264pay ! udpsink sync=False clients=0.0.0.0:8081 host=0.0.0.0 port=8081
    
      1317.272502 s:  VX_ZONE_INIT:[tivxHostDeInitLocal:107] De-Initialization Done for HOST !!!
      1317.277097 s:  VX_ZONE_INIT:[tivxDeInitLocal:193] De-Initialization Done !!!
    APP: Deinit ... !!!
    REMOTE_SERVICE: Deinit ... !!!
    REMOTE_SERVICE: Deinit ... Done !!!
    IPC: Deinit ... !!!
    IPC: DeInit ... Done !!!
    MEM: Deinit ... !!!
    DDR_SHARED_MEM: Alloc's: 55 alloc's of 22240962 bytes 
    DDR_SHARED_MEM: Free's : 55 free's  of 22240962 bytes 
    DDR_SHARED_MEM: Open's : 0 allocs  of 0 bytes 
    DDR_SHARED_MEM: Total size: 536870912 bytes 
    MEM: Deinit ... Done !!!
    APP: Deinit ... Done !!!

    It seems that the output pipeline encodes the video to H.264, parses it, and sends it to the RTP payload encapsulator for H.264 streams.

    The client IP is kind of confusing here, as the host is the TI board itself.
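
    If I understand udpsink correctly, the host/clients properties should name the receiving machine rather than the board, so a minimal sender/receiver sanity check (a sketch; 192.168.1.100 stands in for the PC's address) would be:

    # On the board: stream a JPEG-over-RTP test pattern to the PC
    gst-launch-1.0 videotestsrc is-live=true ! video/x-raw,width=640,height=480 ! jpegenc ! rtpjpegpay ! udpsink host=192.168.1.100 port=8081

    # On the PC: receive and display the test pattern
    gst-launch-1.0 udpsrc port=8081 ! application/x-rtp,media=video,encoding-name=JPEG,payload=26,clock-rate=90000 ! rtpjpegdepay ! jpegdec ! autovideosink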

    Thanks,

    Ahmed

  • Hi Ahmed,

    Are the board and the PC connected to the same network?

    This method of streaming remotely was changed after SDK v8.6. The other day I used the remote_display config file and an IMX219 sensor to stream remotely to my PC and experienced no issues. If possible, I recommend updating to the latest SDK. If you would rather stay on SDK v8.6, I can try this on my end tomorrow using that version and debug. My SK-AM68 is being used for a demo today, but I will share my configuration file as an example from another board running the Edge AI stack on the latest SDK to stream an IMX219 to my PC:

    root@j722s-evm:/opt/edgeai-gst-apps# cat configs/remote_display.yaml
    title: "Remote Display"
    # If output is set to remote, it runs the pipeline with udpsink as the output
    # To view the output on web browser, run the node server
    # root@soc:/opt/edgeai-gst-apps> node scripts/remote_streaming/server.js
    # This will start node webserver and generate a link which you can open in browser
    log_level: 2
    inputs:
        input0:
            source: /dev/video-usb-cam0
            format: jpeg
            width: 1280
            height: 720
            framerate: 30
        input1:
            source: /opt/edgeai-test-data/videos/video0_1280_768.h264
            format: h264
            width: 1280
            height: 768
            framerate: 30
            loop: True
        input2:
            source: /opt/edgeai-test-data/images/%04d.jpg
            width: 1280
            height: 720
            index: 0
            framerate: 1
            loop: True
        input3:
            source: /dev/video-imx219-cam0
            subdev-id: /dev/v4l-imx219-subdev0
            width: 1920
            height: 1080
            format: rggb
            framerate: 30
    models:
        model0:
            model_path: /opt/model_zoo/TFL-CL-0000-mobileNetV1-mlperf
            topN: 5
        model1:
            model_path: /opt/model_zoo/ONR-OD-8200-yolox-nano-lite-mmdet-coco-416x416
            viz_threshold: 0.6
        model2:
            model_path: /opt/model_zoo/ONR-SS-8610-deeplabv3lite-mobv2-ade20k32-512x512
            alpha: 0.4
    outputs:
        # Jpeg encode and stream
        output0:
            sink: remote
            width: 1280
            height: 720
            port: 8081
            host: 127.0.0.1
            encoding: jpeg
            overlay-perf-type: graph
        # mp4 encode and stream
        output1:
            sink: remote
            width: 1280
            height: 720
            port: 8081
            host: 127.0.0.1
            encoding: mp4
            bitrate: 1000000
            overlay-perf-type: graph
    
    flows:
        flow0: [input3,model1,output0]

    Run the application:

    root@j722s-evm:/opt/edgeai-gst-apps# ./apps_python/app_edgeai.py ./configs/remote_display.yaml

    Run node server to view the stream on your web browser:

    root@j722s-evm:/opt/edgeai-gst-apps# node scripts/remote_streaming/server.js

    Finally, you can view the JPEG-encoded frames at the web page provided.
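
    (The server prints the exact URLs to open when it starts; a sketch of what you would open from the PC's browser, with <board-ip> standing in for the board's address, is http://<board-ip>:8080/jpeg for the JPEG stream.)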

    See the 9.2 SDK for more details on this: https://software-dl.ti.com/jacinto7/esd/processor-sdk-linux-am68a/09_02_00/exports/edgeai-docs/common/configuration_file.html#remote-sinks

    Thank you,

    Fabiana

  • Hi Fabiana,

    Are the board and the PC connected to the same network?

    Yes, I can ping the PC from the board and vice versa.

    Unfortunately, I do not have an IMX219 camera sensor or an SK-AM68 to test the above config. I am using a USB camera on a J721S2 EVM board.

    I tried testing the remote display demo with the latest v9.2 SDK. However, the latest SDK (v9.2) for PROCESSOR-SDK-J721S2 does not have a pre-built Edge AI image. It contains a pre-built ADAS image (tisdk-adas-image-j721s2-evm.tar.xz) that has "edgeai-tiovx-apps" and "vision_apps". In fact, all of the 9.x SDKs for J721S2 ship a pre-built ADAS image.

    The v8.6 SDK for J721S2, on the other hand, has a pre-built Edge AI image that contains "edgeai-gst-apps", which is why I am using it. However, the v8.6 SDK does not ship "server.js" at "/opt/edgeai-gst-apps/scripts/remote_streaming/server.js".

    Therefore, I downloaded the script from the edgeai-gst-apps repository and ran it on the board with the v8.6 SDK Edge AI image, roughly as sketched below.
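
    For completeness, this is roughly how the script can be copied over (a sketch; <board-ip> stands in for the board's address):

    # From a PC checkout of the edgeai-gst-apps repository:
    ssh root@<board-ip> "mkdir -p /opt/edgeai-gst-apps/scripts/remote_streaming"
    scp scripts/remote_streaming/server.js root@<board-ip>:/opt/edgeai-gst-apps/scripts/remote_streaming/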

    The remote_display.yaml file is:

    title: "Remote Display"
    # If output is set to display, it runs the pipeline with udpsink as the output
    # To view the output on web browser, run the streamlit server using
    # root@soc:/opt/edgeai-gst-apps> streamlit run scripts/udp_vis.py -- --port *port_number* [Default is 8081]
    # This will start streamlit webserver and generate a link which you can open in browser
    log_level: 2
    inputs:
        input0:
            source: /dev/video2
            format: jpeg
            width: 640
            height: 480
            framerate: 30
        input1:
            source: /opt/edgeai-test-data/videos/video_0000_h264.h264
            format: h264
            width: 1280
            height: 720
            framerate: 30
            loop: True
        input2:
            source: /opt/edgeai-test-data/images/%04d.jpg
            width: 640
            height: 480
            index: 0
            framerate: 1
            loop: True
    models:
        model0:
            model_path: /opt/model_zoo/ONR-SS-8610-deeplabv3lite-mobv2-ade20k32-512x512
            alpha: 0.4
        model1:
            model_path: /opt/model_zoo/TFL-OD-2010-ssd-mobV2-coco-mlperf-300x300
            viz_threshold: 0.6
        model2:
            model_path: /opt/model_zoo/TFL-CL-0000-mobileNetV1-mlperf
            topN: 5
    
    outputs:
        output0:
            sink: remote
            width: 640
            height: 480
            port: 8081
            host: 0.0.0.0
            encoding: jpeg
    flows:
        flow0: [input0,model0,output0]
    

    The logs from server.js upon accessing the URL:

    root@j721s2-evm:/opt/edgeai-gst-apps# node scripts/remote_streaming/server.js 
    
    View Jpeg encoded frames at http://XX.XX.XX.XX:8080/jpeg
    
    View H264 encoded frames at http://XX.XX.XX.XX:8080/mp4
    
    UDP Server listening at port 8081
    undefined
    returning...

    It would be better if I had guidelines for running the remote display demo using streamlit on the v8.6 SDK, as specified here: https://software-dl.ti.com/jacinto7/esd/processor-sdk-linux-edgeai/AM68A/08_06_01/exports/docs/common/configuration_file.html#:~:text=5.3.4.-,Remote%20sinks,-The%20JPEG%20compressed

    Thanks,

    Ahmed

  • Hi Ahmed,

    The IMX219 sensor is not required; it was only used as an example. Simply change your input configuration to use a USB camera, as shown below.
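
    For example, with the configuration file I shared above, pointing flow0 at the USB camera entry (input0) instead of the IMX219 (input3) should be the only change needed:

    flows:
        flow0: [input0,model1,output0]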

    The SDK 9.2 Edge AI image you are looking for can be found here: PROCESSOR-SDK-LINUX-AM68A

    Please try this and let me know if you run into any issues.

    Thank you,

    Fabiana

  • Hi Fabiana,

    I tried flashing the SD card with the "tisdk-edgeai-image-j721s2-evm.tar.xz" image provided in "ti-processor-sdk-linux-edgeai-j721s2-evm-09_02_00_05/filesystem" on the PROCESSOR-SDK-LINUX-AM68A page. After flashing the SD card, I was not able to boot the J721S2 EVM with it.

    Nevertheless, I found one solution.

    I copied the image from PROCESSOR-SDK-LINUX-AM68A (tisdk-edgeai-image-j721s2-evm.tar.xz) into the PROCESSOR-SDK-LINUX-J721S2 v9.2 ADAS SDK ("ti-processor-sdk-linux-adas-j721s2-evm-09_02_00_05/filesystem/" directory). Then I ran the "create-sdcard.sh" script, which automatically detected the "tisdk-edgeai-image-j721s2-evm.tar.xz" image in the filesystem directory and flashed it. Before booting, I changed the dtbo file in uEnv.txt to point to "k3-j721s2-edgeai-apps.dtbo", like this:

    # This uEnv.txt file can contain additional environment settings that you
    # want to set in U-Boot at boot time.  This can be simple variables such
    # as the serverip or custom variables.  The format of this file is:
    #    variable=value
    # NOTE: This file will be evaluated after the bootcmd is run and the
    #       bootcmd must be set to load this file if it exists (this is the
    #       default on all newer U-Boot images.  This also means that some
    #       variables such as bootdelay cannot be changed by this file since
    #       it is not evaluated until the bootcmd is run.
    
    # Update the Linux hostname based on board_name
    # The SK also requires an additional dtbo to boot. Prepend it to name_overlays depending on board_name
    uenvcmd=if test ${boot_fit} -eq 1; then  setenv name_overlays $name_overlays_fit; fi; if test "$board_name" = "am68-sk"; then ; setenv args_all $args_all systemd.hostname=am68a-sk ; if test ${boot_fit} -eq 1; then setenv name_overlays ti/k3-am68-sk-som-ddr-mem-carveout.dtbo $name_overlays; else; setenv name_overlays k3-am68-sk-som-ddr-mem-carveout.dtbo $name_overlays ; fi; fi
    
    # Setting the right U-Boot environment variables
    dorprocboot=1
    name_overlays=k3-j721s2-edgeai-apps.dtbo
    
    # Name overlays when booting from fit image
    name_overlays_fit=ti/k3-j721s2-edgeai-apps.dtbo

    And it worked! I can now successfully run the Edge AI v9.2 image on the J721S2 EVM. I tested the remote display demo with it using the node "server.js" script, and it worked seamlessly.

    However, I am still curious about running the remote display demo using GStreamer and streamlit with the v8.6 Edge AI SDK.

    Thanks,

    Ahmed

  • Hi Ahmed,

    I'm glad you found a workaround! As for the remote_display application on SDK v8.6, have you tried changing the host field in your output from 0.0.0.0 to your PC's IP address? For a USB camera, this was the only change I made to remote_display.yaml.

    output3:
        sink: remote
        width: 1280
        height: 720
        port: 8081
        host: 0.0.0.0 # Set this to the IP of the remote PC

    On the PC, I ran the following:

    gst-launch-1.0 udpsrc port=8081 ! application/x-rtp,encoding=H264 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink

    A window with a real-time stream from the USB camera connected to the board opened after running the previous command.
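
    If no window appears, a generic GStreamer debugging step (not specific to this SDK) is to replace the display sink with a buffer dump, which shows whether RTP packets are arriving at all:

    gst-launch-1.0 -v udpsrc port=8081 ! application/x-rtp,encoding=H264 ! fakesink dump=true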

    Thank you,

    Fabiana

  • Hi Fabiana,

    It works!

    I made the following changes:

    1. I changed the IP address in my config file to my remote PC's address.
    2. I disabled the firewall in Linux (sudo ufw disable); the firewall was blocking the UDP stream (see the note after this list).
    3. [Optional] In some cases, if conda is enabled in your ".bashrc" file, deactivate it, because it can also cause issues with gst-launch udpsrc.
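
    For point 2, a narrower alternative to disabling the firewall entirely (assuming ufw is the firewall in use) is to allow just the stream port:

    sudo ufw allow 8081/udp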

    Closing this thread with thanks!

    Cheers,

    Ahmed