This thread has been locked.

TDA4VM: How to set extra-controls for v4l2h264enc using Gstreamer API

Part Number: TDA4VM

Tool/software:

Hi TI Expert,

I'm implementing a GStreamer app on SDK 9.2.

I can set extra-controls for v4l2h264enc with the command line below:

gst-launch-1.0 v4l2src device=/dev/video5 ! capssetter caps="video/x-raw,width=1280,height=800,framerate=30/1,format=NV12,colorimetry=bt601" ! v4l2h264enc extra-controls="s,video_bitrate=1000000,h264_8x8_transform_enable=true,h264_entropy_mode=true,max_number_of_reference_pics=1" ! h264parse ! rtph264pay ! udpsink host=192.168.31.171 port=5001

But how should I set extra-controls using the GStreamer API in C?

	// Method 1
	/*GstCaps *encoder_caps = gst_caps_from_string("s,video_bitrate=1000000,h264_8x8_transform_enable=true,h264_entropy_mode=true,max_number_of_reference_pics=1");
    g_object_set(encoder, "extra-controls", encoder_caps, NULL);
    gst_caps_unref(caps);*/
	
	// Method 2
    //g_object_set(encoder, "extra-controls", "s,video_bitrate=1000000,h264_8x8_transform_enable=true,h264_entropy_mode=true,max_number_of_reference_pics=1", NULL);
	
	//Method 3
    /*GstStructure *extra_controls = gst_structure_new_empty("extra-controls");
    gst_structure_set(extra_controls, "video_bitrate", G_TYPE_INT, 1000000,
                      "h264_8x8_transform_enable", G_TYPE_BOOLEAN, TRUE,
                      "h264_entropy_mode", G_TYPE_BOOLEAN, TRUE,
                      "max_number_of_reference_pics", G_TYPE_INT, 1, NULL);
    gchar *extra_controls_str = gst_structure_to_string(extra_controls);
    g_object_set(encoder, "extra-controls", extra_controls_str, NULL);
    g_free(extra_controls_str);
    gst_structure_free(extra_controls);*/

I've tried all three methods above, and each results in the error below:

[  120.878054] __vm_enough_memory: pid: 1238, comm: gst_pipeline, no enough memory for the allocation
[  120.887301] __vm_enough_memory: pid: 1238, comm: gst_pipeline, no enough memory for the allocation
[  120.896359] __vm_enough_memory: pid: 1238, comm: gst_pipeline, no enough memory for the allocation

(gst_pipeline:1238): GLib-ERROR **: 07:31:29.226: ../glib-2.72.3/glib/gmem.c:161: failed to allocate 28030946600 bytes
[  120.906274] audit: type=1701 audit(1717486289.223:19): auid=4294967295 uid=0 gid=0 ses=4294967295 pid=1238 comm="gst_pipeline" exe="/home/root/gst_pipeline" sig=5 res=1
[  120.950272] audit: type=1334 audit(1717486289.267:20): prog-id=15 op=LOAD
[  120.957484] audit: type=1334 audit(1717486289.275:21): prog-id=16 op=LOAD
Trace/breakpoint trap (core dumped)

I also attach my full test code for your reference:

#include <gst/gst.h>

int main(int argc, char *argv[]) {
    GstElement *pipeline, *source, *capssetter, *encoder, *parser, *payloader, *sink;
    GstCaps *caps;
    GError *error = NULL;
    GstStateChangeReturn ret;
    GstStructure *extra_controls;
    gchar *extra_controls_str;

    // Initialize GStreamer
    gst_init(&argc, &argv);

    // Create elements
    pipeline = gst_pipeline_new("video-pipeline");
    source = gst_element_factory_make("v4l2src", "source");
    capssetter = gst_element_factory_make("capssetter", "capssetter");
    encoder = gst_element_factory_make("v4l2h264enc", "encoder");
    parser = gst_element_factory_make("h264parse", "parser");
    payloader = gst_element_factory_make("rtph264pay", "payloader");
    sink = gst_element_factory_make("udpsink", "sink");

    if (!pipeline || !source || !capssetter || !encoder || !parser || !payloader || !sink) {
        g_printerr("Not all elements could be created.\n");
        return -1;
    }

    // Set element properties
    g_object_set(source, "device", "/dev/video5", NULL);
    caps = gst_caps_from_string("video/x-raw,width=1280,height=800,framerate=30/1,format=NV12,colorimetry=bt601");
    g_object_set(capssetter, "caps", caps, NULL);
    gst_caps_unref(caps);

    // Method 1
    /*GstCaps *encoder_caps = gst_caps_from_string("s,video_bitrate=1000000,h264_8x8_transform_enable=true,h264_entropy_mode=true,max_number_of_reference_pics=1");
    g_object_set(encoder, "extra-controls", encoder_caps, NULL);
    gst_caps_unref(caps);*/

    // Method 2
    //g_object_set(encoder, "extra-controls", "s,video_bitrate=1000000,h264_8x8_transform_enable=true,h264_entropy_mode=true,max_number_of_reference_pics=1", NULL);

    // Method 3
    /*extra_controls = gst_structure_new_empty("extra-controls");
    gst_structure_set(extra_controls, "video_bitrate", G_TYPE_INT, 1000000,
                      "h264_8x8_transform_enable", G_TYPE_BOOLEAN, TRUE,
                      "h264_entropy_mode", G_TYPE_BOOLEAN, TRUE,
                      "max_number_of_reference_pics", G_TYPE_INT, 1, NULL);
    extra_controls_str = gst_structure_to_string(extra_controls);
    g_object_set(encoder, "extra-controls", extra_controls_str, NULL);
    g_free(extra_controls_str);
    gst_structure_free(extra_controls);*/

    g_object_set(sink, "host", "192.168.31.171", "port", 5001, NULL);

    // Build the pipeline
    gst_bin_add_many(GST_BIN(pipeline), source, capssetter, encoder, parser, payloader, sink, NULL);
    if (gst_element_link_many(source, capssetter, encoder, parser, payloader, sink, NULL) != TRUE) {
        g_printerr("Elements could not be linked.\n");
        gst_object_unref(pipeline);
        return -1;
    }

    // Start playing
    ret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
    if (ret == GST_STATE_CHANGE_FAILURE) {
        g_printerr("Unable to set the pipeline to the playing state.\n");
        gst_object_unref(pipeline);
        return -1;
    }

    // Wait until error or EOS
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

    // Parse message
    if (msg != NULL) {
        GError *err;
        gchar *debug_info;

        switch (GST_MESSAGE_TYPE(msg)) {
            case GST_MESSAGE_ERROR:
                gst_message_parse_error(msg, &err, &debug_info);
                g_printerr("Error received from element %s: %s\n", GST_OBJECT_NAME(msg->src), err->message);
                g_printerr("Debugging information: %s\n", debug_info ? debug_info : "none");
                g_clear_error(&err);
                g_free(debug_info);
                break;
            case GST_MESSAGE_EOS:
                g_print("End-Of-Stream reached.\n");
                break;
            default:
                // Should not reach here
                g_printerr("Unexpected message received.\n");
                break;
        }
        gst_message_unref(msg);
    }

    // Free resources
    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}

Thanks,

Josh

  • Hi Josh,

    You can reference the app-multi-cam-codec demo that is a part of the Vision Apps SDK (Linux+RTOS). You can find "Getting Started" steps here. The path for this demo app is located in ti-processor-sdk-rtos-j721e-evm-<sdk_version>/vision_apps/apps/basic_demos/app_multi_cam_codec. 

    In the main file you can see how we utilize the GStreamer API in C to run a camera capture, encode, decode, and display process.

    BR,
    Sarabesh S.
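
    For the extra-controls property itself, note that upstream GStreamer declares it as a GstStructure-typed property, not a string, so passing a C string to g_object_set() makes GLib treat the pointer as a structure (which would explain the absurd allocation size and crash). A minimal sketch of the two usual approaches follows. It is untested against the TDA4VM SDK build, and assumes the SDK's v4l2h264enc keeps the upstream property type; the structure name "controls" is arbitrary.

    ```c
    #include <gst/gst.h>

    int main(int argc, char *argv[]) {
        gst_init(&argc, &argv);

        /* Approach 1: build the controls as a GstStructure and pass the
           structure itself (not its string form) to g_object_set(). */
        GstStructure *ctrls = gst_structure_new("controls",
            "video_bitrate", G_TYPE_INT, 1000000,
            "h264_8x8_transform_enable", G_TYPE_BOOLEAN, TRUE,
            "h264_entropy_mode", G_TYPE_BOOLEAN, TRUE,
            "max_number_of_reference_pics", G_TYPE_INT, 1,
            NULL);

        GstElement *encoder = gst_element_factory_make("v4l2h264enc", "encoder");
        if (encoder) {
            g_object_set(encoder, "extra-controls", ctrls, NULL);

            /* Approach 2: let GStreamer deserialize a gst-launch-style
               string into the property's real type. */
            gst_util_set_object_arg(G_OBJECT(encoder), "extra-controls",
                "controls,video_bitrate=1000000,max_number_of_reference_pics=1");

            gst_object_unref(encoder);
            g_print("extra-controls set as GstStructure\n");
        } else {
            /* The element only exists on targets with the V4L2 encoder. */
            g_print("v4l2h264enc not available on this machine\n");
        }

        gst_structure_free(ctrls);
        return 0;
    }
    ```

    Either approach should drop into the attached test code in place of Methods 1-3.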