Linux/AM5728: Gstreamer command for DVR functionality on AM572x GP EVM and on Ubuntu OS

Part Number: AM5728

Tool/software: Linux

We are using:
a. Processor SDK : ti-processor-sdk-linux-am57xx-evm-05.01.00.11-Linux-x86-Install.bin
b. Hardware : AM572x General purpose EVM

We are working on digital video recorder (DVR) functionality, where we need to display live video from the camera and simultaneously record/save the video to a file on:

Case 1.  AM572x General Purpose EVM
Case 2.  Laptop (with a built-in webcam) running Ubuntu Linux

We are using the GStreamer framework to achieve this, but we seem to be missing something in constructing the GStreamer pipelines.

So kindly support us by providing the correct GStreamer pipelines for both cases.

Thanks in advance,

Ankush

  • Hello,

    In this case you need the GStreamer tee element to split the pipeline into two branches: one to save to a file on the EVM and one for network streaming, so that the second file is saved on the PC.
    Please refer to the guide below:
    software-dl.ti.com/.../Foundational_Components_Multimedia.html

    There is an example with the tee element, but it is encode -> save to file and display at the same time. For your use case you would replace the display branch with a streaming element.

    Notes: I would recommend splitting the pipeline with the tee after the parser element. If you will mux the files into a container, you could try connecting the tee after the muxer element.
    This way you avoid having 2 encoders or 2 parser elements. This does not apply if you are planning 2 differently encoded files, for example one H.264 file and one MPEG-4 file.
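
    For example, here is a rough (untested) sketch of a capture pipeline with the tee placed after the parser, so the video is encoded only once and the same stream is both saved to a file and streamed out over the network. The streaming branch (rtpmp4vpay/udpsink) and the host/port values are only placeholders for illustration:

    target # gst-launch-1.0 -e v4l2src device=/dev/video1 io-mode=4 ! 'video/x-raw, \
    format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 ! \
    ducatimpeg4enc bitrate=4000 ! mpeg4videoparse ! tee name=t ! \
    queue ! qtmux ! filesink location=record.mp4 \
    t. ! queue ! rtpmp4vpay ! udpsink host=192.168.1.10 port=5000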


    Hope this helps.

    BR
    Margarita
  • Hi Margarita,

    Thanks for your valuable suggestion....

    The two cases are different from one another. We are not streaming the video over any network as of now.

    Case 1: capturing video from the camera, displaying it, and saving it simultaneously on the EVM itself

    Case 2: capturing video from the laptop webcam, displaying it, and saving it simultaneously on the laptop itself

    So can you suggest a GStreamer command for each of the above two cases?

    Regards,

    Ankush

  • Hello,

    As I said in my previous answer, the guide has an example of capture, encode, and display in parallel.

    "Capture and Encode and Display in parallel.
    target # gst-launch-1.0 -e v4l2src device=/dev/video1 num-buffers=1000 io-mode=4 ! 'video/x-raw, \
    format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 ! tee name=t ! \
    queue ! ducatimpeg4enc bitrate=4000 ! queue ! mpeg4videoparse ! qtmux ! filesink location=x.mp4 t. ! queue ! kmssink
    "

    You could change the encoder and parser if there is a need.
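
    For example, an H.264 variant of the same pipeline might look like this (untested sketch; it assumes the ducatih264enc and h264parse elements are available in the SDK, as listed in the guide):

    target # gst-launch-1.0 -e v4l2src device=/dev/video1 num-buffers=1000 io-mode=4 ! 'video/x-raw, \
    format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 ! tee name=t ! \
    queue ! ducatih264enc bitrate=4000 ! queue ! h264parse ! qtmux ! filesink location=x.mp4 t. ! queue ! kmssink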

    The difference with GStreamer on a PC is that there is no vpe element, so you must replace it with videoconvert and videoscale (if the GStreamer version on your PC is 1.0 or above), if there is a need. The encoder will be different, and so will the display element (kmssink). You could check the available encoders with
    gst-inspect-1.0 | grep "enc"
    and the display elements with
    gst-inspect-1.0 | grep "sink"
    and pick the elements that you need for your use case.
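
    For example, a PC pipeline for Case 2 might look roughly like this (untested sketch; it assumes the webcam is /dev/video0, that it can output 640x480 raw video, and that the x264enc, h264parse, mp4mux and autovideosink elements are installed on the Ubuntu machine):

    gst-launch-1.0 -e v4l2src device=/dev/video0 ! 'video/x-raw, width=(int)640, height=(int)480, framerate=(fraction)30/1' ! \
    tee name=t ! queue ! videoconvert ! x264enc tune=zerolatency bitrate=4000 ! h264parse ! mp4mux ! filesink location=webcam.mp4 \
    t. ! queue ! videoconvert ! autovideosink

    Stop it with Ctrl+C; the -e option sends EOS so that mp4mux can finalize the file properly.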

    BR
    Margarita
  • Note: if you are using the kmssink element as the display element, you must stop Weston on the board first.
    /etc/init.d/weston stop


    BR
    Margarita
  • Hello,

    If this answers your question, please click the "This resolved my issue" button.
    Thank you!

    If you have a new question/issue, you could open a new E2E thread.

    BR
    Margarita
  • Hi ,

    Now I have a sample video of 1280x720 resolution in .mp4 format copied onto the SD card.

    Using the GStreamer command that you suggested, in place of the camera input I am giving the video file path in the command, to play it on the display and copy it to another file simultaneously.

    The GStreamer command is:

    gst-launch-1.0 -e filesrc location=/home/root/SampleVideo_1280x720_10mb.mp4 ! 'video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 ! tee name=t  ! queue ! ducatimpeg4enc bitrate=4000 ! queue ! mpeg4videoparse ! qtmux ! filesink location=x.mp4 t. ! queue ! kmssink

    The error I got is:

    Setting pipeline to PAUSED ...
    Pipeline is PREROLLING ...
    ERROR: from element /GstPipeline:pipeline0/GstCapsFilter:capsfilter0: Filter caps do not completely specify the output format
    Additional debug info:
    /home/gtbldadm/processor-sdk-linux-fido-build/build-CORTEX_1/arago-tmp-external-linaro-toolchain/work/cortexa15hf-vfp-neon-linux-gnueabi/gstreamer1.0/1.2.3-r0/gstreamer-1.2.3/plugins/elements/gstcapsfilter.c(348): gst_capsfilter_prepare_buf (): /GstPipeline:pipeline0/GstCapsFilter:capsfilter0:
    Output caps are unfixed: video/x-raw, format=(string){ NV12, YUYV, YUY2 }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
    ERROR: pipeline doesn't want to preroll.
    Setting pipeline to NULL ...
    Freeing pipeline ...

    On executing the command below, the sample video is played correctly on the display of the EVM:

    gst-launch-1.0 playbin uri=file:///home/root/SampleVideo_1280x720_10mb.mp4

     

    Before executing the above command, I execute the two commands below on the EVM:

    /etc/init.d/matrix-gui-2.0 stop

    /etc/init.d/weston stop

    Can you guide me on this issue?

    Thanks & Regards,

    Ankush

  • Hello,


    "gst-launch-1.0 -e filesrc location=/home/root/SampleVideo_1280x720_10mb.mp4 ! 'video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 ! tee name=t ! queue ! ducatimpeg4enc bitrate=4000 ! queue ! mpeg4videoparse ! qtmux ! filesink location=x.mp4 t. ! queue ! kmssink"

    This pipeline is wrong when you are trying to decode a video file.

    When you are using filesrc you must have a demuxer, a parser, a decoder, etc.
    Please refer to the decoding pipeline in the guide:
    target # gst-launch-1.0 -v filesrc location=example_h264.mp4 ! qtdemux ! h264parse ! \
    ducatih264dec ! vpe ! 'video/x-raw, format=(string)NV12, width=(int)720, height=(int)480' ! kmssink
    This is decode -> display. If you want to re-encode the video, you must use the tee element again.
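
    For example, decode -> re-encode to file + display in parallel might look roughly like this (untested sketch; it assumes the .mp4 contains H.264 video, so adjust the parser and decoder to match your file):

    target # gst-launch-1.0 -e filesrc location=/home/root/SampleVideo_1280x720_10mb.mp4 ! qtdemux ! h264parse ! \
    ducatih264dec ! vpe num-input-buffers=8 ! 'video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720' ! tee name=t ! \
    queue ! ducatimpeg4enc bitrate=4000 ! queue ! mpeg4videoparse ! qtmux ! filesink location=copy.mp4 t. ! queue ! kmssink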

    BR
    Margarita
  • Hello,

    If this answers your question, please click the "This resolved my issue" button.
    Thank you!

    If you have a new question/issue, you could open a new E2E thread.

    BR
    Margarita