
SK-TDA4VM: Streaming using rtpsession

Part Number: SK-TDA4VM
Other Parts Discussed in Thread: TDA4VM

Hello,

I am trying to stream the IMX219 camera image from the TDA4VM to my PC with the following commands:
Host:
gst-launch-1.0 v4l2src device=/dev/video-imx219-cam0 ! queue leaky=2 ! video/x-bayer, width=1920, height=1080, framerate=30/1, format=rggb ! tiovxisp sink_0::device=/dev/v4l-subdev2 sensor-name="SENSOR_SONY_IMX219_RPI" dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss_1920x1080.bin sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a_1920x1080.bin format-msb=7 ! tiovxmultiscaler ! video/x-raw, width=1920, height=1080, framerate=30/1 ! videoconvert ! v4l2h264enc extra-controls="controls, video_bitrate=999999" ! h264parse ! rtph264pay ! udpsink host=<MY_PC_IP_ADDR> port=8801 sync=false async=false

Client:
gst-launch-1.0 udpsrc port=8801 ! application/x-rtp,encoding=H264 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
No window appears. The output on the PC is the following:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayWayland\)\ gldisplaywayland0";
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Any help is appreciated!
Regards,
Tamas
  • Hi Tamas,

    The easiest way to achieve remote streaming from your IMX219 to your PC would be to use the remote_display.yaml configuration file. For an IMX219 input, the file has to be changed slightly to look like this:

    root@j722s-evm:/opt/edgeai-gst-apps# cat configs/remote_display.yaml
    title: "Remote Display"
    # If output is set to remote, it runs the pipeline with udpsink as the output
    # To view the output on web browser, run the node server
    # root@soc:/opt/edgeai-gst-apps> node scripts/remote_streaming/server.js
    # This will start node webserver and generate a link which you can open in browser
    log_level: 2
    inputs:
        input0:
            source: /dev/video-usb-cam0
            format: jpeg
            width: 1280
            height: 720
            framerate: 30
        input1:
            source: /opt/edgeai-test-data/videos/video0_1280_768.h264
            format: h264
            width: 1280
            height: 768
            framerate: 30
            loop: True
        input2:
            source: /opt/edgeai-test-data/images/%04d.jpg
            width: 1280
            height: 720
            index: 0
            framerate: 1
            loop: True
        input3:
            source: /dev/video-imx219-cam0
            subdev-id: /dev/v4l-imx219-subdev0
            width: 1920
            height: 1080
            format: rggb
            framerate: 30
    models:
        model0:
            model_path: /opt/model_zoo/TFL-CL-0000-mobileNetV1-mlperf
            topN: 5
        model1:
            model_path: /opt/model_zoo/ONR-OD-8200-yolox-nano-lite-mmdet-coco-416x416
            viz_threshold: 0.6
        model2:
            model_path: /opt/model_zoo/ONR-SS-8610-deeplabv3lite-mobv2-ade20k32-512x512
            alpha: 0.4
    outputs:
        # Jpeg encode and stream
        output0:
            sink: remote
            width: 1280
            height: 720
            port: 8081
            host: 127.0.0.1
            encoding: jpeg
            overlay-perf-type: graph
        # mp4 encode and stream
        output1:
            sink: remote
            width: 1280
            height: 720
            port: 8081
            host: 127.0.0.1
            encoding: mp4
            bitrate: 1000000
            overlay-perf-type: graph
    
    flows:
        flow0: [input3,model1,output0]
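    Each entry under flows is simply an [input, model, output] triple referencing the sections defined above, so flow0 here runs the IMX219 capture (input3) through the YOLOX detection model (model1) and out via the JPEG remote sink (output0). A quick illustrative check of that wiring (plain Python mirroring the config structure, not the app's actual parser):

    ```python
    # Illustrative only: mirrors the structure of remote_display.yaml above.
    config = {
        "inputs": {"input0": {}, "input1": {}, "input2": {}, "input3": {}},
        "models": {"model0": {}, "model1": {}, "model2": {}},
        "outputs": {"output0": {}, "output1": {}},
        "flows": {"flow0": ["input3", "model1", "output0"]},
    }

    def check_flows(cfg):
        """Verify every flow references a defined input, model, and output."""
        for name, (inp, model, out) in cfg["flows"].items():
            assert inp in cfg["inputs"], f"{name}: unknown input {inp}"
            assert model in cfg["models"], f"{name}: unknown model {model}"
            assert out in cfg["outputs"], f"{name}: unknown output {out}"
        return True

    print(check_flows(config))  # → True
    ```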

    Run the application:

    root@j722s-evm:/opt/edgeai-gst-apps# ./apps_python/app_edgeai.py ./configs/remote_display.yaml

    Run node server to view the stream on your web browser:

    root@j722s-evm:/opt/edgeai-gst-apps# node scripts/remote_streaming/server.js

    View the JPEG-encoded frames at the web page it provides.
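    Alternatively, if you'd rather view the stream directly on your PC without the node server, one option (an untested sketch) is to change host: 127.0.0.1 in output0 to your PC's IP address, then receive the multipart JPEG stream on the PC with GStreamer:

    ```shell
    # On the PC: receive and decode the multipart-muxed JPEG stream sent by udpsink
    # (assumes output0's host field was changed to the PC's address).
    gst-launch-1.0 udpsrc port=8081 ! multipartdemux ! jpegdec ! videoconvert ! autovideosink
    ```

    This bypasses the browser entirely; the node server route is still the simplest way to check the stream from any machine on the network.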

    To see the generated GStreamer pipeline, exit the application with Ctrl-C and scroll up in your terminal; you'll see the input and output pipelines, like these:

    ==========[INPUT PIPELINE(S)]==========
    
    [PIPE-0]
    
    v4l2src device=/dev/video-imx219-cam0 io-mode=5 pixel-aspect-ratio=None ! queue leaky=2 ! capsfilter caps="video/x-bayer, width=(int)1920, height=(int)1080, format=(string)rggb;" ! tiovxisp dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss.bin sensor-name=SENSOR_SONY_IMX219_RPI ! capsfilter caps="video/x-raw, format=(string)NV12;" ! tiovxmultiscaler name=split_01
    split_01. ! queue ! capsfilter caps="video/x-raw, width=(int)1280, height=(int)720;" ! tiovxdlcolorconvert out-pool-size=4 ! capsfilter caps="video/x-raw, format=(string)RGB;" ! appsink max-buffers=2 drop=True name=sen_0
    split_01. ! queue ! capsfilter caps="video/x-raw, width=(int)1168, height=(int)748;" ! tiovxmultiscaler target=1 ! capsfilter caps="video/x-raw, width=(int)416, height=(int)416;" ! tiovxdlpreproc out-pool-size=4 data-type=3 tensor-format=1 ! capsfilter caps="application/x-tensor-tiovx;" ! appsink max-buffers=2 drop=True name=pre_0
    
    
    ==========[OUTPUT PIPELINE]==========
    
    appsrc do-timestamp=True format=3 block=True name=post_0 ! tiovxdlcolorconvert ! capsfilter caps="video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720;" ! queue ! tiperfoverlay title=Remote Display ! jpegenc ! multipartmux boundary=spionisto ! rndbuffersize max=65000 ! udpsink sync=False clients=127.0.0.1:8081 host=127.0.0.1 port=8081
    

    More on configuring applications: https://software-dl.ti.com/jacinto7/esd/processor-sdk-linux-sk-tda4vm/09_02_00/exports/edgeai-docs/common/configuration_file.html

    I hope this answered your question.

    Thank you,

    Fabiana