This thread has been locked.

If you have a related question, please click the "Ask a related question" button in the top right corner. The newly created question will be automatically linked to this question.

Linux/AM5728: GStreamer encode problem

Part Number: AM5728

Tool/software: Linux

Hello,

I want to use GStreamer to encode a video file. Here is my code:

#include <gst/gst.h> 

int main(int argc, char *argv[])
{	
	GstElement *pipeline, *src, *video_parse, *myvpe, *enc, *sink, *capsfilter1;
	GstCaps *caps1;
	GstBus *bus;
	GstMessage *msg;
	GstStateChangeReturn ret;
	gst_init(&argc, &argv);
	pipeline = gst_pipeline_new("videoencoder");
	src = gst_element_factory_make("filesrc", "src");
	video_parse = gst_element_factory_make("videoparse", "video_parse");
	myvpe = gst_element_factory_make("videoconvert", "myvpe");
	capsfilter1 = gst_element_factory_make("capsfilter", "capsfilter1");
	enc = gst_element_factory_make("ducatih264enc", "enc");
	sink = gst_element_factory_make("filesink", "sink");

	if (!pipeline || !src || !video_parse || !myvpe || !capsfilter1 || !enc || !sink)
	{
		g_printerr("One element could not be created.Exiting.\n");
		return -1;
	}
	g_object_set(G_OBJECT(src), "location", "/mnt/akiyo_qcif.yuv", NULL);
	g_object_set(G_OBJECT(video_parse), "width", "176", NULL);
	g_object_set(G_OBJECT(video_parse), "height", "144", NULL);
	g_object_set(G_OBJECT(video_parse), "format", 2, NULL);
	g_object_set(G_OBJECT(video_parse), "framerate", 25,1, NULL);
	caps1 = gst_caps_new_simple("video/x-raw",
		"format", G_TYPE_STRING, "NV12",
		"width", G_TYPE_INT, 176,
		"height", G_TYPE_INT, 144,
		"framerate", GST_TYPE_FRACTION, 25, 1,
		NULL);
	g_object_set(G_OBJECT(capsfilter1), "caps", caps1, NULL);
	gst_object_unref(caps1);
	g_object_set(G_OBJECT(sink), "location", "/mnt/b2b.h264", NULL);
	gst_bin_add_many(GST_BIN(pipeline), src, video_parse, myvpe, capsfilter1, enc, sink, NULL);
	if (gst_element_link_many(src, video_parse, myvpe, capsfilter1, enc, sink) != TRUE)
	{ 
		g_printerr("Elements could not be linked.\n");
		gst_object_unref(pipeline);
		return -1;
	}
	ret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
	if (ret == GST_STATE_CHANGE_FAILURE) {
		g_printerr("Unable to set the pipeline to the playing state.\n");
		gst_object_unref(pipeline);
		return -1;
	}

	bus = gst_element_get_bus(pipeline);
	msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

	if (msg != NULL)
		gst_message_unref(msg);
	gst_object_unref(bus);
	gst_element_set_state(pipeline, GST_STATE_NULL);
	gst_object_unref(pipeline);
	return 0;
}

Then I ran this code, and it hit a problem:

(gst-test:1185): GLib-GObject-CRITICAL **: g_object_unref: assertion 'G_IS_OBJECT (object)' failed

Elements could not be linked.

However, the same pipeline runs successfully from the command line:

gst-launch-1.0 filesrc location=/mnt/akiyo_qcif.yuv ! videoparse width=176 height=144 format=i420 ! videoconvert ! 'video/x-raw, format=(string)NV12, framerate=(fraction)25/1, width=(int)176, height=(int)144'  ! ducatih264enc ! filesink location=x.h264

So could you tell me how to deal with this problem? Thank you.

  • Hello,

    I will check the code and get back to you.
    Could you also add a debug log with --gst-debug=2?

    BR
    Margarita
  • Hello,
    Sorry, I do not know how to add a function in my code to output the log. I know I can add "--gst-debug=2" to my command line to get the debug log. However, my problem is that the same pipeline encodes I420 video into H.264 from the command line, but when I build the same pipeline in code it fails with the message: (gst-test:1185): GLib-GObject-CRITICAL **: g_object_unref: assertion 'G_IS_OBJECT (object)' failed.
  • Hello,
    Thank you for your suggestions. I tried: ./gst-test --gst-debug=2,capsfilter*:6, and it output:
    0:00:00.084635141 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:213:gst_capsfilter_set_property:<capsfilter1> set new caps video/x-raw, format=(string)NV12, width=(int)176, height=(int)144, framerate=(fraction)25/1

    (gst-test:1252): GLib-GObject-CRITICAL **: g_object_unref: assertion 'G_IS_OBJECT (object)' failed
    0:00:00.085345507 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:297:gst_capsfilter_transform_caps:<capsfilter1> input: ANY
    0:00:00.085386824 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:298:gst_capsfilter_transform_caps:<capsfilter1> filter: (NULL)
    0:00:00.085420821 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:300:gst_capsfilter_transform_caps:<capsfilter1> caps filter: video/x-raw, format=(string)NV12, width=(int)176, height=(int)144, framerate=(fraction)25/1
    0:00:00.085477917 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:301:gst_capsfilter_transform_caps:<capsfilter1> intersect: video/x-raw, format=(string)NV12, width=(int)176, height=(int)144, framerate=(fraction)25/1
    0:00:00.085586579 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:297:gst_capsfilter_transform_caps:<capsfilter1> input: ANY
    0:00:00.085621064 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:298:gst_capsfilter_transform_caps:<capsfilter1> filter: (NULL)
    0:00:00.085651808 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:300:gst_capsfilter_transform_caps:<capsfilter1> caps filter: video/x-raw, format=(string)NV12, width=(int)176, height=(int)144, framerate=(fraction)25/1
    0:00:00.085703861 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:301:gst_capsfilter_transform_caps:<capsfilter1> intersect: video/x-raw, format=(string)NV12, width=(int)176, height=(int)144, framerate=(fraction)25/1
    0:00:00.085843267 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:297:gst_capsfilter_transform_caps:<capsfilter1> input: video/x-raw, format=(string)NV12, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-raw, format=(string)YUYV, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-raw, format=(string)YUY2, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
    0:00:00.085935987 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:298:gst_capsfilter_transform_caps:<capsfilter1> filter: (NULL)
    0:00:00.085966731 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:300:gst_capsfilter_transform_caps:<capsfilter1> caps filter: video/x-raw, format=(string)NV12, width=(int)176, height=(int)144, framerate=(fraction)25/1
    0:00:00.086018459 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:301:gst_capsfilter_transform_caps:<capsfilter1> intersect: video/x-raw, format=(string)NV12, width=(int)176, height=(int)144, framerate=(fraction)25/1
    0:00:00.086158189 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:297:gst_capsfilter_transform_caps:<capsfilter1> input: video/x-raw, format=(string)NV12, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-raw, format=(string)YUYV, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-raw, format=(string)YUY2, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
    0:00:00.086244077 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:298:gst_capsfilter_transform_caps:<capsfilter1> filter: (NULL)
    0:00:00.436301279 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:300:gst_capsfilter_transform_caps:<capsfilter1> caps filter: video/x-raw, format=(string)NV12, width=(int)176, height=(int)144, framerate=(fraction)25/1
    0:00:00.436362442 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:301:gst_capsfilter_transform_caps:<capsfilter1> intersect: video/x-raw, format=(string)NV12, width=(int)176, height=(int)144, framerate=(fraction)25/1
    0:00:00.436513396 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:297:gst_capsfilter_transform_caps:<capsfilter1> input: video/x-raw, format=(string)NV12, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-raw, format=(string)YUYV, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-raw, format=(string)YUY2, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
    0:00:00.436592290 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:298:gst_capsfilter_transform_caps:<capsfilter1> filter: (NULL)
    0:00:00.436619618 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:300:gst_capsfilter_transform_caps:<capsfilter1> caps filter: video/x-raw, format=(string)NV12, width=(int)176, height=(int)144, framerate=(fraction)25/1
    0:00:00.436662562 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:301:gst_capsfilter_transform_caps:<capsfilter1> intersect: video/x-raw, format=(string)NV12, width=(int)176, height=(int)144, framerate=(fraction)25/1
    0:00:00.436860527 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:297:gst_capsfilter_transform_caps:<capsfilter1> input: video/x-raw, format=(string)NV12, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-raw, format=(string)YUYV, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-raw, format=(string)YUY2, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
    0:00:00.436940071 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:298:gst_capsfilter_transform_caps:<capsfilter1> filter: (NULL)
    0:00:00.436965935 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:300:gst_capsfilter_transform_caps:<capsfilter1> caps filter: video/x-raw, format=(string)NV12, width=(int)176, height=(int)144, framerate=(fraction)25/1
    0:00:00.437009367 1252 0x13c780 DEBUG capsfilter gstcapsfilter.c:301:gst_capsfilter_transform_caps:<capsfilter1> intersect: video/x-raw, format=(string)NV12, width=(int)176, height=(int)144, framerate=(fraction)25/1
    Elements could not be linked.

    So, do you know what problem I may have hit in my code? Thank you for your help!
  • Hello,

    Could you comment out this line:
    g_object_set(G_OBJECT(video_parse), "framerate", 25,1, NULL);
    I do not see this property being set in the pipeline that you are running.

    BR
    Margarita
  • Hello,

    You could check this e2e thread also:
    e2e.ti.com/.../518974

    BR
    Margarita
  • Hello,
    Oh, sorry, I had set that property on the "videoparse" element, but you did not notice it. So could you give me some other advice? I am going to the website you gave me to look for useful information. Thanks anyway.
  • Hello,

    I know that the framerate property is set for the videoparse element. I just do not see this property being set in the pipeline that you executed in the console. The debug log that you provided is from capsfilter; please provide the full debug log.

    BR
    Margarita
  • Hello,

    One more point: I would recommend you to check videoparse's properties.
    gstreamer.freedesktop.org/.../gst-plugins-bad-plugins-videoparse.html
    Could you try to remove the quotation marks around "176" and "144"? The height and width properties are of type gint.
    I would also recommend you to recheck the capsfilter settings.

    BR
    Margarita
  • Hello,

    I tried what you suggested, but it didn't work. I then tried the following code, which captures video and encodes it into an AVI file.

    At the beginning, I used the command line to test the pipeline, and it did capture and encode into an AVI file:

    gst-launch-1.0 -e v4l2src device=/dev/video1 num-buffers=1000 io-mode=4 ! 'video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 ! queue ! ducatih264enc bitrate=4000 ! queue ! h264parse ! qtmux ! filesink location=x.avi

    However, the same problem occurred when I wrote my own app. I ran the code shown below:

    #include <gst/gst.h> 
    
    /************************** working test pipeline ****************************
    gst-launch-1.0 -e v4l2src device=/dev/video1 num-buffers=1000 io-mode=4 ! 'video/x-raw,
    format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 !
    queue ! ducatimpeg4enc bitrate=4000 ! queue ! mpeg4videoparse ! qtmux ! filesink location=x.mp4
    
    root@am57xx-evm:~# gst-launch-1.0 -e v4l2src device=/dev/video1 num-buffers=1000 io-mode=4 ! 'video/x-raw,
    format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 !
    queue ! ducatih264enc bitrate=4000 ! queue ! h264parse ! qtmux ! filesink location=x.avi
    
    mount -t nfs -o nolock,nfsvers=3 -o proto=tcp 10.8.198.148:/home/nfs /mnt
    
    gcc gst-test.c -o gst-test `pkg-config --cflags --libs gstreamer-1.0`
    *******************************************************************/
    
    int main(int argc, char *argv[])
    {
    	GstElement *pipeline, *src, *filter, *myvpe, *q1, *enc, *q2, *parse, *mux, *sink; // a capsfilter is needed
    	GstCaps *caps1;
    	GstBus *bus;
    	GstMessage *msg;
    	GstStateChangeReturn ret;
    	gst_init(&argc, &argv);
    
    	pipeline = gst_pipeline_new("videoencoder");
    
    	src = gst_element_factory_make("v4l2src", "src");
    	filter = gst_element_factory_make("capsfilter", "filter");
    	myvpe = gst_element_factory_make("vpe", "myvpe");
    	q1 = gst_element_factory_make("queue", "q1");
    	enc = gst_element_factory_make("ducatih264enc", "enc");
    	q2 = gst_element_factory_make("queue", "q2");
    	parse = gst_element_factory_make("h264parse", "parse");
    	mux = gst_element_factory_make("qtmux", "mux");
    	sink = gst_element_factory_make("filesink", "sink");
    
    	if (!pipeline || !src || !filter || !myvpe || !q1 || !enc || !q2 || !parse || !mux || !sink)
    	{
    		g_printerr("One element could not be created.Exiting.\n");
    		return -1;
    	}
    
    	g_object_set(G_OBJECT(src), "device", "/dev/video1", NULL);
    	g_object_set(G_OBJECT(src), "num-buffers", 1000, NULL);
    	g_object_set(G_OBJECT(src), "io-mode", 4, NULL);
    
    	caps1 = gst_caps_new_simple("video/x-raw",
    		"format", G_TYPE_STRING, "YUY2",
    		"width", G_TYPE_INT, 1280,
    		"height", G_TYPE_INT, 720,
    		"framerate", GST_TYPE_FRACTION, 30, 1,
    		NULL);
    
    	g_object_set(G_OBJECT(filter), "caps", caps1, NULL);
    	gst_object_unref(caps1);
    
    	g_object_set(G_OBJECT(myvpe), "num-input-buffers", 8, NULL);
    
    	g_object_set(G_OBJECT(enc), "bitrate", 4000, NULL);
    
    	g_object_set(G_OBJECT(sink), "location", "/home/root/81.avi", NULL);
    
    	gst_bin_add_many(GST_BIN(pipeline), src, filter, myvpe, q1, enc, q2, parse, mux, sink, NULL);
    	if (gst_element_link_many(src, filter, myvpe, q1, enc, q2, parse, mux, sink) != TRUE)
    	{
    		g_printerr("Elements could not be linked.\n");
    		gst_object_unref(pipeline);
    		return -1;
    	}
    	ret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
    	if (ret == GST_STATE_CHANGE_FAILURE) {
    		g_printerr("Unable to set the pipeline to the playing state.\n");
    		gst_object_unref(pipeline);
    		return -1;
    	}
    
    	bus = gst_element_get_bus(pipeline);
    	msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
    
    	if (msg != NULL)
    		gst_message_unref(msg);
    	gst_object_unref(bus);
    	gst_element_set_state(pipeline, GST_STATE_NULL);
    	gst_object_unref(pipeline);
    	return 0;
    }

    It showed the same problem:

    (capture:1264): GLib-GObject-CRITICAL **: g_object_unref: assertion 'G_IS_OBJECT (object)' failed
    Elements could not be linked.

    This confuses me. Could you advise me of any other ways to solve the problem? Thanks so much!

  • Hello,

    Please try this code on your side. The pipeline it builds uses videotestsrc.

    #include <stdio.h>
    #include <gst/gst.h>

    /* BUS function for ERROR */
    static gboolean bus_call (GstBus *bus, GstMessage *msg, gpointer data)
    {
      GMainLoop *loop = (GMainLoop *) data;

      switch (GST_MESSAGE_TYPE (msg))
      {
        case GST_MESSAGE_EOS:
          g_print ("End of stream\n");
          g_main_loop_quit (loop);
          break;

        case GST_MESSAGE_ERROR:
        {
          gchar *debug;
          GError *error;

          gst_message_parse_error (msg, &error, &debug);
          g_free (debug);
          g_printerr ("Error: %s\n", error->message);
          g_error_free (error);
          g_main_loop_quit (loop);
          break;
        }

        default:
          break;
      }

      return TRUE;
    }

    static gboolean link_source_element_with_filter (GstElement *element1,
        GstElement *element2)
    {
      /* Caps to be linked:
       * 'video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, framerate=5/1'
       */
      gboolean link_ok;
      GstCaps *caps;

      caps = gst_caps_new_simple ("video/x-raw",
          "format", G_TYPE_STRING, "NV12",
          "width", G_TYPE_INT, 800,
          "height", G_TYPE_INT, 600,
          "framerate", GST_TYPE_FRACTION, 5, 1,
          NULL);

      link_ok = gst_element_link_filtered (element1, element2, caps);
      gst_caps_unref (caps);

      if (!link_ok) {
        g_warning ("Failed to link element1 and element2!(v4l2src->convert)");
      }
      return link_ok;
    }

    /* Main function does the actual processing */
    //gst-launch-1.0 -e videotestsrc ! 'video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 ! queue ! ducatih264enc bitrate=4000 ! queue ! h264parse ! qtmux ! filesink location=x.mov
    int main (int argc, char *argv[])
    {
      GstElement *pipeline, *src, *convert, *q1, *enc, *q2, *parse, *mux, *sink;
      GstBus *bus;
      GstMessage *msg;

      gst_init (&argc, &argv);

      /* Create the empty pipeline */
      pipeline = gst_pipeline_new ("test-pipeline");

      /* Create the elements */
      src = gst_element_factory_make ("videotestsrc", "video_source");
      g_object_set (G_OBJECT (src), "num-buffers", 50, NULL);
      convert = gst_element_factory_make ("vpe", "convert");
      g_object_set (G_OBJECT (convert), "num-input-buffers", 8, NULL); /* note: originally set on src, but this property belongs to vpe */
      q1 = gst_element_factory_make ("queue", "q1");
      enc = gst_element_factory_make ("ducatih264enc", "enc");
      q2 = gst_element_factory_make ("queue", "q2");
      parse = gst_element_factory_make ("h264parse", "parser");
      mux = gst_element_factory_make ("qtmux", "mux");
      sink = gst_element_factory_make ("filesink", "sink");
      g_object_set (G_OBJECT (sink), "location", "/home/root/x.mov", NULL);

      /* Link all elements that can be automatically linked because they have "Always" pads */
      gst_bin_add_many (GST_BIN (pipeline), src, convert, q1, enc, q2, parse, mux, sink, NULL);

      link_source_element_with_filter (src, convert);
      gst_element_link_many (convert, q1, enc, q2, parse, mux, sink, NULL);

      /* Start playing the pipeline */
      gst_element_set_state (pipeline, GST_STATE_PLAYING);

      /* Wait until error or EOS */
      bus = gst_element_get_bus (pipeline);
      msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

      /* Free resources */
      if (msg != NULL)
      {
        gst_message_unref (msg);
      }
      gst_object_unref (bus);
      gst_element_set_state (pipeline, GST_STATE_NULL);
      gst_object_unref (pipeline);
      return 0;
    }

    Let me know the result.

    BR
    Margarita
  • Hello,
    I tried your code. It ran without any errors, but it produced no visible output on the AM5728. However, your code has given me a new idea for changing my own, and I will try it as soon as possible. Thanks a lot!
  • Hello,
    It just shows the message: [ 770.342944] omap-iommu 55082000.mmu: 55082000.mmu: version 2.1.
    But I do not know what it means.
  • Hello,

    The video file will be /home/root/x.mov; please check there for the output file.


    Here is also example code which works on my PC.
    #include <stdio.h>
    #include <gst/gst.h>

    /* BUS function for ERROR */
    static gboolean bus_call (GstBus *bus, GstMessage *msg, gpointer data)
    {
      GMainLoop *loop = (GMainLoop *) data;

      switch (GST_MESSAGE_TYPE (msg))
      {
        case GST_MESSAGE_EOS:
          g_print ("End of stream\n");
          g_main_loop_quit (loop);
          break;

        case GST_MESSAGE_ERROR:
        {
          gchar *debug;
          GError *error;

          gst_message_parse_error (msg, &error, &debug);
          g_free (debug);
          g_printerr ("Error: %s\n", error->message);
          g_error_free (error);
          g_main_loop_quit (loop);
          break;
        }

        default:
          break;
      }

      return TRUE;
    }

    static gboolean link_source_element_with_filter (GstElement *element1,
        GstElement *element2)
    {
      /* Caps to be linked:
       * 'video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, framerate=5/1'
       */
      gboolean link_ok;
      GstCaps *caps;

      caps = gst_caps_new_simple ("video/x-raw",
          "format", G_TYPE_STRING, "NV12",
          "width", G_TYPE_INT, 800,
          "height", G_TYPE_INT, 600,
          "framerate", GST_TYPE_FRACTION, 5, 1,
          NULL);

      link_ok = gst_element_link_filtered (element1, element2, caps);
      gst_caps_unref (caps);

      if (!link_ok) {
        g_warning ("Failed to link element1 and element2!(v4l2src->convert)");
      }
      return link_ok;
    }

    /* Main function does the actual processing */
    //gst-launch-1.0 -e videotestsrc ! 'video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! videoconvert ! queue ! x264enc ! queue ! h264parse ! qtmux ! filesink location=x.mov
    int main (int argc, char *argv[])
    {
      GstElement *pipeline, *src, *convert, *q1, *enc, *q2, *parse, *mux, *sink;
      GstBus *bus;
      GstMessage *msg;

      gst_init (&argc, &argv);

      /* Create the empty pipeline */
      pipeline = gst_pipeline_new ("test-pipeline");

      /* Create the elements */
      src = gst_element_factory_make ("videotestsrc", "video_source");
      g_object_set (G_OBJECT (src), "num-buffers", 50, NULL);
      convert = gst_element_factory_make ("videoconvert", "convert");
      //g_object_set (G_OBJECT (src), "num-input-buffers", 8, NULL);
      q1 = gst_element_factory_make ("queue", "q1");
      enc = gst_element_factory_make ("x264enc", "enc");
      q2 = gst_element_factory_make ("queue", "q2");
      parse = gst_element_factory_make ("h264parse", "parser");
      mux = gst_element_factory_make ("qtmux", "mux");
      sink = gst_element_factory_make ("filesink", "sink");
      g_object_set (G_OBJECT (sink), "location", "/home/mms/Desktop/gsttests/x.mov", NULL);

      /* Link all elements that can be automatically linked because they have "Always" pads */
      gst_bin_add_many (GST_BIN (pipeline), src, convert, q1, enc, q2, parse, mux, sink, NULL);

      link_source_element_with_filter (src, convert);
      gst_element_link_many (convert, q1, enc, q2, parse, mux, sink, NULL);

      /* Start playing the pipeline */
      gst_element_set_state (pipeline, GST_STATE_PLAYING);

      /* Wait until error or EOS */
      bus = gst_element_get_bus (pipeline);
      msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

      /* Free resources */
      if (msg != NULL)
      {
        gst_message_unref (msg);
      }
      gst_object_unref (bus);
      gst_element_set_state (pipeline, GST_STATE_NULL);
      gst_object_unref (pipeline);
      return 0;
    }

  • Hello,
    I tried your code. It encodes successfully. So I learned from your way of building a pipeline and wrote this:

    #include <gst/gst.h>

    /************************** working test pipeline ****************************
    gst-launch-1.0 -e v4l2src device=/dev/video1 num-buffers=1000 io-mode=4 ! 'video/x-raw,
    format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 !
    queue ! ducatimpeg4enc bitrate=4000 ! queue ! mpeg4videoparse ! qtmux ! filesink location=x.mp4

    root@am57xx-evm:~# gst-launch-1.0 -e v4l2src device=/dev/video1 num-buffers=1000 io-mode=4 ! 'video/x-raw,
    format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 !
    queue ! ducatih264enc bitrate=4000 ! queue ! h264parse ! qtmux ! filesink location=x.avi

    mount -t nfs -o nolock,nfsvers=3 -o proto=tcp 10.8.198.148:/home/nfs /mnt

    gcc gst-test.c -o gst-test `pkg-config --cflags --libs gstreamer-1.0`
    *******************************************************************/

    static gboolean link_source_element_with_filter (GstElement *element1,
        GstElement *element2)
    {
      /* Caps to be linked:
       * 'video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, framerate=30/1'
       */
      gboolean link_ok;
      GstCaps *caps;

      caps = gst_caps_new_simple ("video/x-raw",
          "format", G_TYPE_STRING, "YUY2",
          "width", G_TYPE_INT, 1280,
          "height", G_TYPE_INT, 720,
          "framerate", GST_TYPE_FRACTION, 30, 1,
          NULL);

      link_ok = gst_element_link_filtered (element1, element2, caps);
      gst_caps_unref (caps);

      if (!link_ok) {
        g_warning ("Failed to link element1 and element2!(v4l2src->convert)");
      }
      return link_ok;
    }

    int main (int argc, char *argv[])
    {
      GstElement *pipeline, *src, *myvpe, *q1, *enc, *q2, *parse, *mux, *sink; // a capsfilter is needed
      GstCaps *caps1;
      GstBus *bus;
      GstMessage *msg;
      GstStateChangeReturn ret;

      gst_init (&argc, &argv);

      pipeline = gst_pipeline_new ("videoencoder");

      src = gst_element_factory_make ("v4l2src", "src");

      //filter = gst_element_factory_make("capsfilter", "filter");

      myvpe = gst_element_factory_make ("vpe", "myvpe");
      q1 = gst_element_factory_make ("queue", "q1");
      enc = gst_element_factory_make ("ducatih264enc", "enc");
      q2 = gst_element_factory_make ("queue", "q2");
      parse = gst_element_factory_make ("h264parse", "parse");
      mux = gst_element_factory_make ("qtmux", "mux");
      sink = gst_element_factory_make ("filesink", "sink");

      if (!pipeline || !src || !myvpe || !q1 || !enc || !q2 || !parse || !mux || !sink)
      {
        g_printerr ("One element could not be created.Exiting.\n");
        return -1;
      }

      g_object_set (G_OBJECT (src), "device", "/dev/video1", NULL);
      g_object_set (G_OBJECT (src), "num-buffers", 1000, NULL);
      g_object_set (G_OBJECT (src), "io-mode", 4, NULL);

      g_object_set (G_OBJECT (myvpe), "num-input-buffers", 8, NULL);

      g_object_set (G_OBJECT (enc), "bitrate", 4000, NULL);

      g_object_set (G_OBJECT (sink), "location", "/home/root/81.avi", NULL);

      gst_bin_add_many (GST_BIN (pipeline), src, myvpe, q1, enc, q2, parse, mux, sink, NULL);

      link_source_element_with_filter (src, myvpe);

      if (gst_element_link_many (myvpe, q1, enc, q2, parse, mux, sink) != TRUE)
      {
        g_printerr ("Elements could not be linked.\n");
        gst_object_unref (pipeline);
        return -1;
      }
      ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
      if (ret == GST_STATE_CHANGE_FAILURE) {
        g_printerr ("Unable to set the pipeline to the playing state.\n");
        gst_object_unref (pipeline);
        return -1;
      }

      bus = gst_element_get_bus (pipeline);
      msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

      if (msg != NULL)
        gst_message_unref (msg);
      gst_object_unref (bus);
      gst_element_set_state (pipeline, GST_STATE_NULL);
      gst_object_unref (pipeline);
      return 0;
    }

    However, it showed the message: Elements could not be linked.
    What I do not understand is that I only changed the source (videotestsrc -> v4l2src), and it stopped working. Did I miss something in the pipeline, or use the wrong settings?
  • Hello,

    1. Have you tested this pipeline in the console?
    gst-launch-1.0 -e v4l2src device=/dev/video1 num-buffers=1000 io-mode=4 ! 'video/x-raw,
    format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 !
    queue ! ducatih264enc bitrate=4000 ! queue ! h264parse ! qtmux ! filesink location=x.avi

    2. I would recommend you to add gst-debug so you could check which elements are not linked.

    BR
    Margarita
  • Hello,
    Thanks for your reply! I have tested the pipeline and it worked successfully.
    I added gst-debug like this: --gst-debug=2,ducatih264enc*:6
    and it showed:
    0:00:00.105707306 1388 0x143ce0 LOG qtmux gstqtmux.c:4624:gst_qt_mux_register: Registering muxers
    0:00:00.106825314 1388 0x143ce0 LOG qtmux gstqtmux.c:4662:gst_qt_mux_register: Finished registering muxers
    0:00:00.106858173 1388 0x143ce0 LOG qtmux gstqtmux.c:4668:gst_qt_mux_register: Registering tags
    0:00:00.106889079 1388 0x143ce0 LOG qtmux gstqtmux.c:4674:gst_qt_mux_register: Finished registering tags
    0:00:00.109600895 1388 0x143ce0 DEBUG qtmux gstqtmux.c:4368:gst_qt_mux_request_new_pad:<mux> Requested pad: video_0
    Elements could not be linked.
    0:00:00.110273197 1388 0x143ce0 DEBUG qtmux gstqtmux.c:4308:gst_qt_mux_release_pad:<mux> Releasing mux:video_0
    0:00:00.110310122 1388 0x143ce0 DEBUG qtmux gstqtmux.c:4312:gst_qt_mux_release_pad: Checking mux:video_0

    Then I ran the command line: gst-inspect-1.0 qtmux
    and learned that qtmux needs a request pad, because its sink pad isn't an always pad.

    What confuses me is that you didn't add a request pad, yet your code worked with no problem, so I would really like to know the reason.

    And could you give me your QQ or some other way to contact you conveniently, if you don't mind? Thanks a lot!
  • Hello,

    Could you try to put a queue element in here:

    ... ! h264parse ! queue ! qtmux ! ...

    Let me know the result.

    BR
    Margarita

  • Hello,


    You could also manually request the qtmux pads if the above suggestion does not work in your case.
    In fact, the log line gst_qt_mux_request_new_pad:<mux> Requested pad: video_0 suggests that the pad is being requested.
    One more thing you could try is to link the pads of all the elements manually with the gst_element_link_pads function.

    Let me know the results.

    BR
    Margarita

  • Hello,
    It seems I have to add a request pad to the "qtmux" element. Here is my code:
    #include <gst/gst.h>

    static gboolean link_source_element_with_filter (GstElement *element1,
        GstElement *element2)
    {
      gboolean link_ok;
      GstCaps *caps;

      caps = gst_caps_new_simple ("video/x-raw",
          "format", G_TYPE_STRING, "YUY2",
          "width", G_TYPE_INT, 1280,
          "height", G_TYPE_INT, 720,
          "framerate", GST_TYPE_FRACTION, 30, 1,
          NULL);

      link_ok = gst_element_link_filtered (element1, element2, caps);
      gst_caps_unref (caps);

      if (!link_ok) {
        g_warning ("Failed to link element1 and element2!(v4l2src->convert)");
      }
      return link_ok;
    }

    int main (int argc, char *argv[])
    {
      GstElement *pipeline, *src, *myvpe, *q1, *enc, *q2, *parse, *mux, *sink; // a capsfilter is needed
      GstCaps *caps1;
      GstBus *bus;

      GstPadTemplate *mux_sink_pad_template;
      GstPad *pad1, *pad2;
      gchar *name;

      GstMessage *msg;
      GstStateChangeReturn ret;

      gst_init (&argc, &argv);

      pipeline = gst_pipeline_new ("videoencoder");

      src = gst_element_factory_make ("v4l2src", "src");

      //filter = gst_element_factory_make("capsfilter", "filter");

      myvpe = gst_element_factory_make ("vpe", "myvpe");
      q1 = gst_element_factory_make ("queue", "q1");
      enc = gst_element_factory_make ("ducatih264enc", "enc");
      q2 = gst_element_factory_make ("queue", "q2");
      parse = gst_element_factory_make ("h264parse", "parse");
      mux = gst_element_factory_make ("qtmux", "mux");
      sink = gst_element_factory_make ("filesink", "sink");

      if (!pipeline || !src || !myvpe || !q1 || !enc || !q2 || !parse || !mux || !sink)
      {
        g_printerr ("One element could not be created.Exiting.\n");
        return -1;
      }

      g_object_set (G_OBJECT (src), "device", "/dev/video1", NULL);
      g_object_set (G_OBJECT (src), "num-buffers", 1000, NULL);
      g_object_set (G_OBJECT (src), "io-mode", 4, NULL);

      g_object_set (G_OBJECT (myvpe), "num-input-buffers", 8, NULL);

      g_object_set (G_OBJECT (enc), "bitrate", 4000, NULL);

      /******************* one way to add the request pad **************************/
      /*mux_sink_pad_template = gst_element_class_get_pad_template(GST_ELEMENT_GET_CLASS(mux), "sink_%d");
      if (mux_sink_pad_template == NULL)
        g_printerr("hsg\n");
      pad1 = gst_element_request_pad(mux, mux_sink_pad_template, NULL, NULL);
      pad2 = gst_element_get_static_pad(parse, "src");

      if (pad2 == NULL)
        g_printerr("hsg1\n");

      gst_pad_link(pad2, pad1);*/

      /********************* another way to add the request pad ***************************/
      pad2 = gst_element_get_static_pad (parse, "src");
      pad1 = gst_element_get_compatible_pad (mux, pad2, NULL);
      gst_pad_link (pad2, pad1);

      g_object_set (G_OBJECT (sink), "location", "/home/root/81.avi", NULL);

      gst_bin_add_many (GST_BIN (pipeline), src, myvpe, q1, enc, q2, parse, mux, sink, NULL);

      link_source_element_with_filter (src, myvpe);

      if (gst_element_link_many (myvpe, q1, enc, q2, parse, mux, sink) != TRUE)
      {
        g_printerr ("Elements could not be linked.\n");
        gst_object_unref (pipeline);
        return -1;
      }
      ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
      if (ret == GST_STATE_CHANGE_FAILURE) {
        g_printerr ("Unable to set the pipeline to the playing state.\n");
        gst_object_unref (pipeline);
        return -1;
      }

      bus = gst_element_get_bus (pipeline);
      msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

      if (msg != NULL)
        gst_message_unref (msg);
      gst_object_unref (bus);

      gst_object_unref (GST_OBJECT (pad1));

      gst_element_set_state (pipeline, GST_STATE_NULL);
      gst_object_unref (pipeline);
      return 0;
    }

    The result of the code is:
    0:00:00.103270722 1457 0x143ee0 LOG qtmux gstqtmux.c:4624:gst_qt_mux_register: Registering muxers
    0:00:00.104316832 1457 0x143ee0 LOG qtmux gstqtmux.c:4662:gst_qt_mux_register: Finished registering muxers
    0:00:00.104350504 1457 0x143ee0 LOG qtmux gstqtmux.c:4668:gst_qt_mux_register: Registering tags
    0:00:00.104381085 1457 0x143ee0 LOG qtmux gstqtmux.c:4674:gst_qt_mux_register: Finished registering tags
    0:00:00.105626298 1457 0x143ee0 DEBUG qtmux gstqtmux.c:4368:gst_qt_mux_request_new_pad:<mux> Requested pad: video_0
    Elements could not be linked.
    0:00:00.107839866 1457 0x143ee0 DEBUG qtmux gstqtmux.c:4308:gst_qt_mux_release_pad:<mux> Releasing mux:video_0
    0:00:00.107876954 1457 0x143ee0 DEBUG qtmux gstqtmux.c:4312:gst_qt_mux_release_pad: Checking mux:video_0

    I don't know whether the way I added the requested pad was wrong.
    Could you tell me if there is a problem in my code? (I tried the two ways of requesting the pad shown above.)
    Thanks a lot!
  • Hello,

    1. Could you change video_%d to video_%u and try?
    With gst-inspect-1.0 qtmux you can check what pad templates the mux has on its sink side.

    2. If the above does not work, could you please comment out the qtmux element and connect h264parse directly to filesink, just to verify that the issue comes from qtmux.
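    For example, a sketch of this isolation step (the pipeline and device path follow the ones used earlier in this thread; the output file name is illustrative):

    ```shell
    # Bypass qtmux to isolate it: dump the raw H.264 elementary stream instead.
    gst-launch-1.0 -e v4l2src device=/dev/video1 io-mode=4 num-buffers=1000 \
      ! 'video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' \
      ! vpe num-input-buffers=8 ! queue ! ducatih264enc bitrate=4000 ! queue \
      ! h264parse ! filesink location=/home/root/test.h264
    ```

    If this file plays back (e.g. with a raw H.264 decoder), the encode path is fine and the problem is in the mux stage.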

    Let me know.

    BR
    Margarita
  • Hello,

    Please check this tutorial about the pads:
    gstreamer.freedesktop.org/.../pads.html

    BR
    Margarita
  • Hello,

    I am sorry, I only now saw both versions of the requested-pad code in your post.
    I will try on my side and I will post the code here when I am done.

    BR
    Margarita
  • Hello,

    You could try the gst code below.
    As you can see, the mux is linked manually (link_to_multiplexer function). I also linked the other gst elements' pads manually.
    One more queue is added between the parse and mux elements.
    Please change the caps filter resolution, framerate and format per your use case.

    #include <stdio.h>
    #include <gst/gst.h>


    /* BUS function for ERROR */

    static gboolean bus_call(GstBus *bus, GstMessage *msg, gpointer data)
    {
        GMainLoop *loop = (GMainLoop *) data;

        switch (GST_MESSAGE_TYPE (msg))
        {
        case GST_MESSAGE_EOS:
            g_print ("End of stream\n");
            g_main_loop_quit (loop);
            break;

        case GST_MESSAGE_ERROR:
        {
            gchar *debug;
            GError *error;

            gst_message_parse_error (msg, &error, &debug);
            g_free (debug);
            g_printerr ("Error: %s\n", error->message);
            g_error_free (error);
            g_main_loop_quit (loop);
            break;
        }

        default:
            break;
        }

        return TRUE;
    }


    static gboolean link_source_element_with_filter (GstElement *element1,
                                                     GstElement *element2)
    {
        /* Caps to be linked:
         * 'video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, framerate=5/1'
         */
        gboolean link_ok;
        GstCaps *caps;

        caps = gst_caps_new_simple ("video/x-raw",
                                    "format", G_TYPE_STRING, "NV12",
                                    "width", G_TYPE_INT, 800,
                                    "height", G_TYPE_INT, 600,
                                    "framerate", GST_TYPE_FRACTION, 5, 1,
                                    NULL);

        link_ok = gst_element_link_filtered (element1, element2, caps);
        gst_caps_unref (caps);

        if (!link_ok) {
            g_warning ("Failed to link element1 and element2! (v4l2src->convert)");
        }
        return link_ok;
    }


    void link_to_multiplexer(GstElement *tolink_element, GstElement *mux)
    {
        GstCaps *pad_caps = NULL;
        GstPad *pad;
        GstPad *tolink_pad;
        GstPadLinkReturn ret;

        tolink_pad = gst_element_get_static_pad(tolink_element, "src");
        pad_caps = gst_pad_query_caps(tolink_pad, NULL);
        pad = gst_element_get_compatible_pad(mux, tolink_pad, pad_caps);
        gst_caps_unref(pad_caps);

        ret = gst_pad_link(tolink_pad, pad);
        /* print before unreffing the pad, so the name is still valid */
        g_print("A new pad %s was created and linked to %s\n",
                GST_PAD_NAME(tolink_pad), GST_PAD_NAME(pad));
        gst_object_unref(GST_OBJECT(pad));
    }


    /* Main function does the actual processing.
     * Equivalent gst-launch pipeline:
     * gst-launch-1.0 -e videotestsrc ! 'video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 ! queue ! ducatih264enc bitrate=4000 ! queue ! h264parse ! qtmux ! filesink location=x.mov
     */
    int main (int argc, char *argv[])
    {
        GstElement *pipeline, *src, *convert, *q1, *enc, *q2, *parse, *q3, *mux, *sink;
        GstBus *bus;
        GstMessage *msg;

        gst_init (&argc, &argv);

        /* Create the empty pipeline */
        pipeline = gst_pipeline_new ("test-pipeline");

        /* Create the elements */
        src = gst_element_factory_make ("v4l2src", "video_source");
        g_object_set (G_OBJECT (src), "device", "/dev/video1", NULL);
        g_object_set (G_OBJECT (src), "io-mode", 4, NULL);
        g_object_set (G_OBJECT (src), "num-buffers", 50, NULL);
        convert = gst_element_factory_make ("vpe", "convert");
        g_object_set (G_OBJECT (convert), "num-input-buffers", 8, NULL); /* set on the vpe element, not on src */
        q1 = gst_element_factory_make ("queue", "q1");
        enc = gst_element_factory_make ("ducatih264enc", "enc");
        q2 = gst_element_factory_make ("queue", "q2");
        parse = gst_element_factory_make ("h264parse", "parser");
        q3 = gst_element_factory_make ("queue", "q3");
        mux = gst_element_factory_make ("qtmux", "mux");
        sink = gst_element_factory_make ("filesink", "sink");
        g_object_set (G_OBJECT (sink), "location", "/home/root/x.mov", NULL);

        gst_bin_add_many (GST_BIN (pipeline), src, convert, q1, enc, q2, parse, q3, mux, sink, NULL);

        link_source_element_with_filter (src, convert);
        //gst_element_link_pads (src, "src", convert, "sink");
        gst_element_link_pads (convert, "src", q1, "sink");
        gst_element_link_pads (q1, "src", enc, "sink");
        gst_element_link_pads (enc, "src", q2, "sink");   /* q2 was added to the bin but left unlinked */
        gst_element_link_pads (q2, "src", parse, "sink");
        gst_element_link_pads (parse, "src", q3, "sink");
        link_to_multiplexer (q3, mux);
        gst_element_link_pads (mux, "src", sink, "sink");

        /* Start playing the pipeline */
        gst_element_set_state (pipeline, GST_STATE_PLAYING);

        /* Wait until error or EOS */
        bus = gst_element_get_bus (pipeline);
        msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

        /* Free resources */
        if (msg != NULL)
        {
            gst_message_unref (msg);
        }
        gst_object_unref (bus);
        gst_element_set_state (pipeline, GST_STATE_NULL);
        gst_object_unref (pipeline);
        return 0;
    }


    BR
    Margarita

  • Hello,

    Thanks for your code! I tried it, but a problem occurred:

    A new pad src was created and linked to video_0
    0:00:00.148453016 1536 0x148a90 WARN basesrc gstbasesrc.c:2948:gst_base_src_loop:<video_source> error: Internal data flow error.
    0:00:00.148537928 1536 0x148a90 WARN basesrc gstbasesrc.c:2948:gst_base_src_loop:<video_source> error: streaming task paused, reason not-negotiated (-4)

    And I found that the file x.mov was 0 bytes.

    I have been troubled by this "Internal data flow error" problem, because I could find neither any information about what causes it nor any solutions to it.

    So could you help me if you know a way to solve this problem? If you have some information about it, could you share it with me? Thanks a lot!

    I looked up the pad templates of the element "qtmux":

    Pad Templates:

    SINK template: 'video_%u'
    Availability: On request
    Has request_new_pad() function: gst_qt_mux_request_new_pad
    Capabilities:
    video/x-raw
    format: { RGB, UYVY, v210 }
    width: [ 16, 2147483647 ]
    height: [ 16, 2147483647 ]
    video/mpeg
    mpegversion: 4
    systemstream: false
    width: [ 16, 2147483647 ]
    height: [ 16, 2147483647 ]
    video/x-divx
    divxversion: 5
    width: [ 16, 2147483647 ]
    height: [ 16, 2147483647 ]
    video/x-prores
    variant: { standard, lt, hq, proxy }
    width: [ 16, 2147483647 ]
    height: [ 16, 2147483647 ]
    video/x-h263
    width: [ 16, 2147483647 ]
    height: [ 16, 2147483647 ]
    video/x-h264
    stream-format: avc
    alignment: au
    width: [ 16, 2147483647 ]
    height: [ 16, 2147483647 ]
    video/x-svq
    svqversion: 3
    width: [ 16, 2147483647 ]
    height: [ 16, 2147483647 ]
    video/x-dv
    systemstream: false
    width: [ 16, 2147483647 ]
    height: [ 16, 2147483647 ]
    image/jpeg
    width: [ 16, 2147483647 ]
    height: [ 16, 2147483647 ]
    video/x-vp8
    width: [ 16, 2147483647 ]
    height: [ 16, 2147483647 ]
    video/x-dirac
    width: [ 16, 2147483647 ]
    height: [ 16, 2147483647 ]
    video/x-qt-part
    width: [ 16, 2147483647 ]
    height: [ 16, 2147483647 ]

    And the message showed: "A new pad src was created and linked to video_0". Does that mean that the capability of qtmux's sink pad is:

    video/x-raw
    format: { RGB, UYVY, v210 }
    width: [ 16, 2147483647 ]
    height: [ 16, 2147483647 ]    ?

    However, the stream in my pipeline is an H.264 stream. This is just my guess, and I'll have a try at setting the capability of qtmux's sink pad.

    Thanks a lot!

  • Hello,

    As I said above, in the code you must change the resolution, framerate and format in the link_source_element_with_filter function per your use case.

    The error "<video_source> error: streaming task paused, reason not-negotiated (-4)" in most cases means that something is wrong with the caps filter, like the format, framerate, etc.
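    For completeness, a sketch (untested): the qtmux pad template listed earlier in this thread accepts video/x-h264 only as stream-format=avc, alignment=au, so that form can be pinned explicitly after h264parse:

    ```shell
    # Force h264parse to output the stream form qtmux's video_%u template accepts.
    gst-launch-1.0 -e v4l2src device=/dev/video1 io-mode=4 num-buffers=1000 \
      ! 'video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' \
      ! vpe num-input-buffers=8 ! queue ! ducatih264enc bitrate=4000 ! queue \
      ! h264parse ! 'video/x-h264, stream-format=(string)avc, alignment=(string)au' \
      ! qtmux ! filesink location=/home/root/x.mov
    ```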


    BR
    Margarita

  • Hello,
    I'm so sorry, I made a mistake in understanding your reply. I have tried it now: it can capture and encode into the file, and the video can be decoded.
    Thanks a lot!
  • Hello,

    I want to ask you another question. I want to capture and encode videos one by one; I mean that after capturing and encoding the first video, the next one follows, then the third one, and so on.

    And the videos are different. So what should I do after the pipeline is released? Could you give me some advice?

  • Hello,

    I am sorry, I have a few questions to make sure I understand yours correctly:

    1. When the first video is finished, do you want to destroy the pipeline and start it again for the second video?
    2. Or do you want the pipeline to stay in the Playing state and just have the "second video" saved to a different file (with a different name)?
    3. Is the capture source the same for the first, second, etc. video?
    4. When you say the videos are different, is that by name, container, or something else?

    BR
    Margarita
  • Hello,
    I'm sorry that I didn't express my questions clearly. I want the pipeline to stay in the playing state and want the "second video" to be saved with a different name (can I change the resolution and framerate of the video at the same time?). I use the same camera to capture the video.
    If that cannot be achieved, can I destroy the pipeline, set the properties of the elements, and start it again for the following videos?
    Could you give me some advice? Thanks a lot!
  • Hello,

    You can unlink/link gstreamer elements while the pipeline is in the playing state; this is possible.
    But keep in mind that if you unlink/relink an element (the last one), frames will be lost between the two videos; this is caused by the unlinking/relinking.
    If you want to change the resolution of the capture source, I do not think that will work in this case; you should destroy the pipeline.

    BR
    Margarita

    In addition, you could check this tutorial:
    gstreamer.freedesktop.org/.../pipeline-manipulation.html
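    If you go the destroy-and-rebuild route, a minimal sketch (not tested on the board; it only builds the gst_parse_launch() description string, with the element names and properties taken from the pipelines in this thread, and the actual GStreamer calls outlined in the comment):

    ```c
    #include <stdio.h>

    /* Build a gst_parse_launch()-style description for one capture run.
     * The output file name, resolution and framerate can differ per run,
     * because the whole pipeline is torn down and rebuilt each time. */
    static void build_pipeline_desc(char *buf, size_t len, int width, int height,
                                    int fps, const char *location)
    {
        snprintf(buf, len,
                 "v4l2src device=/dev/video1 io-mode=4 num-buffers=1000 ! "
                 "video/x-raw,format=YUY2,width=%d,height=%d,framerate=%d/1 ! "
                 "vpe num-input-buffers=8 ! queue ! ducatih264enc bitrate=4000 ! "
                 "queue ! h264parse ! qtmux ! filesink location=%s",
                 width, height, fps, location);
    }

    /* Per recording:
     *   pipeline = gst_parse_launch(desc, NULL);
     *   gst_element_set_state(pipeline, GST_STATE_PLAYING);
     *   ...wait for EOS/ERROR on the bus...
     *   gst_element_set_state(pipeline, GST_STATE_NULL);
     *   gst_object_unref(pipeline);
     */
    ```

    Each loop iteration can pass a new location (and a new resolution/framerate) before the pipeline is created again.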

    BR
    Margarita
  • Hello,

    I used this command from the website processors.wiki.ti.com/.../Processor_Training:_Multimedia to capture and encode into an MP4 file:

    gst-launch-1.0 -e v4l2src device=/dev/video1 num-buffers=1000 io-mode=4 ! 'video/x-raw,
    format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 !
    queue ! ducatimpeg4enc bitrate=4000 ! queue ! mpeg4videoparse ! qtmux ! filesink location=y.mp4

    But the properties of the MP4 file showed that its framerate was 23, and the bitrate of the file was 4058 kbps.

    However, when I used this command:

    gst-launch-1.0 videotestsrc num-buffers=500 ! 'video/x-raw,format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! vpe num-input-buffers=8 ! queue ! ducatimpeg4enc bitrate=4000 ! queue ! mpeg4videoparse ! qtmux ! filesink location=x.mp4

    it worked fine and had no problem with framerate or bitrate.

    So could you tell me the reason for the problem and give me some advice on how to solve it?

    Thanks & regards!

  • Hello,

    Are you sure that your video source can provide 1280x720 at 30 fps?
    Could you try a lower resolution at the same fps?
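    As a rough sanity check on those numbers (simple arithmetic, assuming no duplicated frames): if the container reports ~23 fps for the same 1000 captured buffers, the source must have delivered them over a longer interval than 1000/30 s:

    ```c
    #include <stdio.h>

    int main(void)
    {
        /* Numbers from the commands above: num-buffers=1000, requested 30 fps,
         * while the MP4 properties reported ~23 fps. */
        int frames = 1000;
        double requested_fps = 30.0;
        double reported_fps = 23.0;

        double expected_s = frames / requested_fps; /* ~33.3 s if nothing is dropped */
        double implied_s  = frames / reported_fps;  /* ~43.5 s actually spent capturing */

        printf("expected %.1f s, implied %.1f s\n", expected_s, implied_s);
        return 0;
    }
    ```

    So the camera or the capture path was effectively running at ~23 fps, which would explain both the reported framerate and the bitrate discrepancy.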

    BR
    Margarita
  • Hello,
    I'm sure that my camera can provide 1280x720 at 30 fps.
  • Hello,

    Could you try with:
    v4l2src device=/dev/video1 num-buffers=1000 io-mode=2...
    Let me know the result.

    BR
    Margarita
  • Hello,

    I tried that, but it did not work, and it showed:

  • Hello,

    What is the PSDK version that you are using?

    BR
    Margarita
  • Hello,
    I used the 4.0 SDK.
  • Hello,

    I suspect that this is not an encoder problem.
    However, in the gst h264 there is a memory leak issue which is already fixed; please refer to this thread, copy the gst lib as described there, and give it a try.
    e2e.ti.com/.../620539

    I would recommend you check this wiki page (Debugging chapter):
    processors.wiki.ti.com/.../Linux_Core_VIP_User's_Guide

    Since the pipeline with videotestsrc works without a problem, please check whether v4l2src is dropping frames.

    Hope this helps.

    BR
    Margarita
  • Hello,

    Please check this answer:
    e2e.ti.com/.../2328878

    Hope this helps.

    BR
    Margarita