
AM62A7: Issue with Saving YUV Frame Data to MP4 via GStreamer Pipeline

Part Number: AM62A7



Hello,

I am working on saving frame data to an MP4 file using appsink, as shown in the attached code. My goal is to receive frame data via appsink and then save it to MP4 on demand, rather than handling everything in a single pipeline.

I've confirmed that the YUV frame data is being received correctly. However, when attempting to save the YUV data to an MP4 file through the pipeline, the file is created, but its attributes are not set correctly.

The pipeline string I am using is as follows:

----------------------------------
std::string pipeline_str =
    "appsrc name=src is-live=true block=true format=GST_FORMAT_TIME do-timestamp=true ! "
    "video/x-raw,format=UYVY,width=1920,height=1080,framerate=30/1 ! "
    "videoconvert ! video/x-raw,format=NV12 ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=" + fileName;
----------------------------------

Could you help me identify the issue with the pipeline and the saving process? I need guidance on whether my approach with the pipeline configuration is correct or if there's something I'm missing in the procedure.

Thank you for your help!

gstsave.cpp

  • Hi Hyunwoo,

    What SDK version are you using to verify the above test application? Are you trying to build this application on the actual EVM?

    Also, can you elaborate on what you mean by the attributes not being set correctly? Are you unable to play the dumped MP4 file?

    Does the pipeline work without having to use the appsrc/appsink elements like below:

    gst-launch-1.0 -v v4l2src device=/dev/video3  io-mode=5 ! video/x-raw, format=UYVY, width=1920, height=1080, framerate=30/1 ! videoconvert ! video/x-raw, format=NV12 ! v4l2h264enc !  h264parse ! mp4mux ! filesink location=out.mp4

    Just for your information: with Linux kernel 6.6 (which will be part of the SDK 10.0 release), you should be able to apply the patch below and remove the videoconvert element, as YUV422 formats are supported in the encoder.

    https://e2e.ti.com/cfs-file/__key/communityserver-discussions-components-files/791/6835.v0_2D00_0001_2D00_media_2D00_chips_2D00_media_2D00_wave5_2D00_Support_2D00_one_2D00_planar_2D00_YUV422.patch

    Best Regards,

    Suren

  • I am using the AM62A and TI SDK 9.2.

    In my application, I apply precise timestamps to the frames for synchronization with other sensors. Therefore, I receive the frames through appsink, manage the timestamps within the appsink callback, and then want to save the frames received from appsink to MP4 via a second GStreamer pipeline.

    Could you help me review the pipeline for this? I would appreciate your advice on the following pipeline:
    -------------------
    std::string pipeline_str = "appsrc name=src is-live=true block=true format=GST_FORMAT_TIME do-timestamp=true ! "
    "video/x-raw,format=UYVY,width=1920,height=1080,framerate=30/1 ! "
    "videoconvert ! video/x-raw,format=NV12 ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=" + fileName;
    ---------------------

    I have also attached the full sample code for this.

    This is an issue I have been struggling with for a long time. I would greatly appreciate your valuable advice.

  • I would like advice on how to correctly connect the mapped data buffer received from appsink to the pipeline for MP4 saving.

    I am managing the timestamps and other metadata within the appsink function and then trying to pass the received frame data to a GStreamer pipeline for encoding and saving to MP4.

    Specifically, I need guidance on how to take the mapped frame buffer from appsink and correctly feed it into the pipeline for processing.

    I am unsure how to properly map the data from the buffer received in appsink and push it to appsrc for encoding and saving. Any advice or example on how to correctly manage this data flow would be greatly appreciated.

    Thank you for your help.

  • gst-launch-1.0 -v v4l2src device=/dev/video3  io-mode=5 ! video/x-raw, format=UYVY, width=1920, height=1080, framerate=30/1 ! videoconvert ! video/x-raw, format=NV12 ! v4l2h264enc !  h264parse ! mp4mux ! filesink location=out.mp4

    ---------------------------------------------------------------------------------------------------------

    Running the above pipeline produced the error below.

    (For reference, obtaining frame data through appsink, and saving JPEG captures through a pipeline, were both confirmed to work normally.)

    -----------------------------------------------------------------------------------------------------------

    root@am62axx-s1hs:/opt/s1hs-apps# gst-launch-1.0 -v v4l2src device=/dev/video3 io-mode=5 ! video/x-raw, format=UYVY, width=1920, height=1080, framerate=30/1 ! videoconvert ! video/x-raw, format=NV12 ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=out.mp4
    Setting pipeline to PAUSED ...
    Pipeline is live and does not need PREROLL ...
    Pipeline is PREROLLED ...
    Setting pipeline to PLAYING ...
    New clock: GstSystemClock
    /GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
    /GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, interlace-mode=(string)progressive, format=(string)NV12, colorimetry=(string)bt601
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, interlace-mode=(string)progressive, format=(string)NV12, colorimetry=(string)bt601
    /GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)1, profile=(string)baseline, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt601
    /GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)1, profile=(string)baseline, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt601
    Redistribute latency...
    /GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, interlace-mode=(string)progressive, format=(string)NV12, colorimetry=(string)bt601
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, interlace-mode=(string)progressive, format=(string)NV12, colorimetry=(string)bt601
    /GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
    ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Failed to allocate required memory.
    Additional debug info:
    ../gst-plugins-good-1.20.7/sys/v4l2/gstv4l2src.c(777): gst_v4l2src_decide_allocation (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
    Buffer pool activation failed
    Execution ended after 0:00:00.006930916
    Setting pipeline to NULL ...
    ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
    Additional debug info:
    ../gstreamer-1.20.7/libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
    streaming stopped, reason not-negotiated (-4)

    (gst-launch-1.0:1812): GStreamer-CRITICAL **: 22:35:13.616: gst_mini_object_copy: assertion 'mini_object != NULL' failed

    (gst-launch-1.0:1812): GStreamer-CRITICAL **: 22:35:13.616: gst_mini_object_unref: assertion 'mini_object != NULL' failed

    (gst-launch-1.0:1812): GStreamer-CRITICAL **: 22:35:13.617: gst_caps_get_structure: assertion 'GST_IS_CAPS (caps)' failed

    (gst-launch-1.0:1812): GStreamer-CRITICAL **: 22:35:13.617: gst_structure_set_value: assertion 'structure != NULL' failed

    (gst-launch-1.0:1812): GStreamer-CRITICAL **: 22:35:13.617: gst_mini_object_unref: assertion 'mini_object != NULL' failed
    Freeing pipeline ...

  • gst-launch-1.0 -v v4l2src device=/dev/video3 io-mode=5 num-buffers=10 ! video/x-raw, format=UYVY, width=1920, height=1080, framerate=30/1 ! tiovxmemalloc pool-size=8 ! videoconvert ! video/x-raw, format=NV12 ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=out.mp4

    With the "io-mode=5" setting, adding "tiovxmemalloc pool-size=8" makes the MP4 save normally.

    As in my original question, I would still appreciate any advice on a pipeline for saving the frame data collected through appsink to MP4.