Hello,
I have some queries regarding the decoder and VPE. I have listed them below:
1) What will be the output image format of the Ducati decoder?
2) If I want to convert the decoded format to BGR24 using VPE, how can I do that?
Regards,
Kishan Patel.
kishan patel14 said: 1) What will be the output image format of the Ducati decoder?
kishan patel14 said: 2) If I want to convert the decoded format to BGR24 using VPE, how can I do that?
Hello,
Okay.
1. First, you could try testvpem2m. There is an example of scaling/colorspace-converting a progressive 640x480 NV12 clip to a smaller-resolution RGB clip:
$ ./testvpem2m 640_480p.nv12 640 480 nv12 360_240p.rgb24 360 240 rgb24 0 3
You can find more information about this example here:
software-dl.ti.com/.../Foundational_Components_Kernel_Drivers.html
2. Here is an E2E thread that you could check:
e2e.ti.com/.../660340
That thread is about the capturevpedisplay demo and how to add RGB24 support to it.
3. For the GStreamer case, this support would have to be added, since the default GStreamer VPE element does not support it.
Hope this information helps you.
BR
Margarita
Hello,
You asked how you could test NV12->RGB via VPE. test-v4l2-m2m is just a demo to test VPE. You could check the resulting .rgb file on a PC if you want.
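For example, on the PC you could view the raw file with ffplay (just an illustration, assuming FFmpeg is installed on the PC; the file name and the 360x240 RGB24 geometry below match the testvpem2m example above):
ffplay -f rawvideo -pixel_format rgb24 -video_size 360x240 360_240p.rgb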
You could also try the capturevpedisplay demo. This demo is under /usr/bin.
But first, please check Manisha's answer in this thread:
https://e2e.ti.com/support/arm/sitara_arm/f/791/t/660340
and apply the changes.
Here is how you could build this demo:
MACHINE=am57xx-evm bitbake omapdrmtest
You could try to use the GStreamer VPE plugin, but at the moment this plugin does not support RGB on the output. However, since the VPE low-level driver supports it, it is possible to add this support to the GStreamer VPE plugin.
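If you want to confirm what the low-level driver exposes on your board, one option (assuming v4l2-ctl from v4l-utils is installed on the target) is to list the capture-side formats of the VPE device node, for example:
v4l2-ctl -d /dev/v4l/by-path/platform-489d0000.vpe-video-index0 --list-formats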
Here is the command to build the GStreamer VPE plugin:
MACHINE=am57xx-evm bitbake gstreamer1.0-plugins-vpe
If you make changes in the code, you must rebuild and copy the result to the board. Here is the command for this:
MACHINE=am57xx-evm bitbake <recipe> --force -c compile
Hope this helps.
BR
Margarita
Hello,
As I said, the GStreamer VPE element does not support RGB on its output, but you could add this support.
Best Regards,
Margarita
Hello,
kishan patel14 said: So, is that why you are saying that BGR is not supported?
You must add this support in the gstvpe code yourself.
Look into the gstvpe.c file; there you will see this:
static GstStaticPadTemplate src_factory = GST_STATIC_PAD_TEMPLATE ("src",
GST_PAD_SRC,
GST_PAD_ALWAYS,
GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("NV12")
";" GST_VIDEO_CAPS_MAKE ("YUYV")
";" GST_VIDEO_CAPS_MAKE ("YUY2")));
There is no RGB.
But you could make changes in the gstvpe.c file to support it.
At first glance, you would have to add RGB to this static GstStaticPadTemplate src_factory.
The same goes for gst_vpe_fourcc_to_pixelformat.
Also check the gst_vpe_set_output_caps function.
In gstvpebuffer.c, check the gst_vpe_buffer_new and gst_vpe_buffer_priv functions.
I would recommend checking how the other formats are added and digging deeper into the GStreamer VPE element.
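For illustration only, here is a minimal sketch of the first two changes (not a complete patch; it assumes you expose the 24-bit "BGR" caps string and map it to V4L2_PIX_FMT_BGR24, and the real change also has to touch gst_vpe_set_output_caps and the buffer-size handling in gstvpebuffer.c):
static GstStaticPadTemplate src_factory = GST_STATIC_PAD_TEMPLATE ("src",
    GST_PAD_SRC,
    GST_PAD_ALWAYS,
    GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("NV12")
        ";" GST_VIDEO_CAPS_MAKE ("YUYV")
        ";" GST_VIDEO_CAPS_MAKE ("YUY2")
        ";" GST_VIDEO_CAPS_MAKE ("BGR")));   /* new: 24-bit BGR output */

static int
gst_vpe_fourcc_to_pixelformat (guint32 fourcc)
{
  switch (fourcc) {
    case GST_MAKE_FOURCC ('Y', 'U', 'Y', '2'):
    case GST_MAKE_FOURCC ('Y', 'U', 'Y', 'V'):
      return V4L2_PIX_FMT_YUYV;
    case GST_MAKE_FOURCC ('N', 'V', '1', '2'):
      return V4L2_PIX_FMT_NV12;
    case GST_MAKE_FOURCC ('B', 'G', 'R', '\0'):  /* fourcc of the "BGR" caps string */
      return V4L2_PIX_FMT_BGR24;                 /* new: map to the V4L2 24-bit BGR format */
  }
  return -1;
}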
Hope this helps.
BR
Margarita
Hello,
This is the path:
/tisdk/build/arago-tmp-external-linaro-toolchain/work/am57xx_evm-linux-gnueabi/gstreamer1.0-plugins-vpe/git-rx.xx/git/src
Please note that I already linked this in one of my previous posts.
BR
Margarita
Hello,
The source code is on your PC.
You do not need to install the gstreamer-vpe plugin separately, since it is already included in the PSDK.
You can find the source code in /tisdk/build/arago-tmp-external-linaro-toolchain/work/am57xx_evm-linux-gnueabi/gstreamer1.0-plugins-vpe/git-rx.xx/git/src
When you make your changes, you must rebuild the plugin using this command:
MACHINE=am57xx-evm bitbake gstreamer1.0-plugins-vpe --force -c compile
You can find the new gstvpe library here:
/tisdk/build/arago-tmp-external-linaro-toolchain/work/am57xx_evm-linux-gnueabi/gstreamer1.0-plugins-vpe/git-r2.19/git/src/.libs
The name is libgstvpe.so.
If you cannot find the .libs folder, press Ctrl+H to show hidden files.
After that, you must copy the new libgstvpe.so library onto the board under /usr/lib/gstreamer-1.0.
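For example (just an illustration; <board-ip> is a placeholder for your board's address, and the host-side path depends on your SDK version):
scp .libs/libgstvpe.so root@<board-ip>:/usr/lib/gstreamer-1.0/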
You could test this with a simple pipeline: gst-launch-1.0 videotestsrc ! <capsfilter> ! vpe ! <capsfilter> ! fakesink silent=false -e -v
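For example, once RGB support is added, the two capsfilters could look like this (format and size are arbitrary, chosen just for illustration):
gst-launch-1.0 videotestsrc ! 'video/x-raw,format=NV12,width=640,height=480' ! vpe ! 'video/x-raw,format=BGR,width=320,height=240' ! fakesink silent=false -e -v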
BR
Margarita
/* * GStreamer * Copyright (c) 2014, Texas Instruments Incorporated * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation * version 2.1 of the License. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details. * * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA */ #ifdef HAVE_CONFIG_H #include <config.h> #endif #include "gstvpe.h" #include <stdio.h> #include <stdlib.h> #include <fcntl.h> #include <unistd.h> #include <stdint.h> #include <string.h> #include <errno.h> #include <pthread.h> #include <sys/mman.h> #include <sys/ioctl.h> #include <libdce.h> #include <sched.h> #include <math.h> #ifndef MIN #define MIN(a,b) (((a) < (b)) ? (a) : (b)) #endif #define ADD_GST_VPE_RGB_SUPPORT //Kishan static void gst_vpe_class_init (GstVpeClass * klass); static void gst_vpe_init (GstVpe * self, gpointer klass); static void gst_vpe_base_init (gpointer gclass); static GstElementClass *parent_class = NULL; static gboolean gst_vpe_set_output_caps (GstVpe * self); /*------------------------Kishan-----------------------*/ #ifdef ADD_GST_VPE_RGB_SUPPORT static int gst_vpe_video_format_from_fourcc (guint32 fourcc) { switch (fourcc) { case GST_MAKE_FOURCC ('R', 'G', 'B','\0'): return GST_VIDEO_FORMAT_RGB; default: return gst_video_format_from_fourcc(fourcc); } return GST_VIDEO_FORMAT_UNKNOWN; } #endif /*-----------------------------------------------------*/ GType gst_vpe_get_type (void) { static GType vpe_type = 0; if (!vpe_type) { static const GTypeInfo vpe_info = { sizeof (GstVpeClass), (GBaseInitFunc) gst_vpe_base_init, NULL, (GClassInitFunc) gst_vpe_class_init, NULL, NULL, sizeof (GstVpe), 0, (GInstanceInitFunc) gst_vpe_init, }; vpe_type = g_type_register_static (GST_TYPE_ELEMENT, "GstVpe", &vpe_info, 0); } return vpe_type; } static GstStaticPadTemplate src_factory = GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS, // GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("{ RGB, NV12, YUYV, YUY2}"))); GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("NV12") ";" GST_VIDEO_CAPS_MAKE ("YUYV") #ifdef ADD_GST_VPE_RGB_SUPPORT ";" GST_VIDEO_CAPS_MAKE ("BGR3") ";" GST_VIDEO_CAPS_MAKE ("RGB3") ";" GST_VIDEO_CAPS_MAKE ("BGR") ";" GST_VIDEO_CAPS_MAKE ("RGB") #endif ";" GST_VIDEO_CAPS_MAKE ("YUY2"))); static GstStaticPadTemplate sink_factory = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, // GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("{ RGB, NV12, YUYV, YUY2}"))); GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE ("NV12") ";" GST_VIDEO_CAPS_MAKE ("YUYV") #ifdef ADD_GST_VPE_RGB_SUPPORT ";" GST_VIDEO_CAPS_MAKE ("BGR3") ";" GST_VIDEO_CAPS_MAKE ("RGB3") ";" GST_VIDEO_CAPS_MAKE ("BGR") ";" GST_VIDEO_CAPS_MAKE ("RGB") #endif ";" GST_VIDEO_CAPS_MAKE ("YUY2"))); enum { PROP_0, PROP_NUM_INPUT_BUFFERS, PROP_NUM_OUTPUT_BUFFERS, PROP_DEVICE }; #define MAX_NUM_OUTBUFS 16 #define MAX_NUM_INBUFS 128 #define DEFAULT_NUM_OUTBUFS 6 #define DEFAULT_NUM_INBUFS 12 #define DEFAULT_DEVICE "/dev/v4l/by-path/platform-489d0000.vpe-video-index0" static gboolean gst_vpe_parse_input_caps (GstVpe * self, GstCaps * input_caps) { gboolean match; GstStructure *s; gint w, h; guint32 fourcc = 
0; const gchar *fmt = NULL; GST_DEBUG_OBJECT (self, "Input caps: %s", gst_caps_to_string (input_caps)); if (self->input_caps) { match = gst_caps_is_strictly_equal (self->input_caps, input_caps); GST_DEBUG_OBJECT (self, "Already set caps comapred with the new caps, returned %s", (match == TRUE) ? "TRUE" : "FALSE"); if (match == TRUE) return TRUE; } s = gst_caps_get_structure (input_caps, 0); fmt = gst_structure_get_string (s, "format"); if (!fmt) { return FALSE; } fourcc = GST_STR_FOURCC (fmt); /* For interlaced streams, ducati decoder sets caps without the interlaced * at first, and then changes it to set it as true or false, so if interlaced * is false, we cannot assume that the stream is pass-through */ self->interlaced = FALSE; gst_structure_get_boolean (s, "interlaced", &self->interlaced); gst_structure_get_int (s, "max-ref-frames", &self->input_max_ref_frames); if (!(gst_structure_get_int (s, "width", &w) && gst_structure_get_int (s, "height", &h))) { return FALSE; } if (self->input_width != 0 && (self->input_width != w || self->input_height != h)) { GST_DEBUG_OBJECT (self, "dynamic changes in height and width are not supported"); return FALSE; } self->input_height = h; self->input_width = w; self->input_fourcc = fourcc; /* Keep a copy of input caps */ if (self->input_caps) gst_caps_unref (self->input_caps); self->input_caps = gst_caps_copy (input_caps); return TRUE; } static gboolean gst_vpe_set_output_caps (GstVpe * self) { GstCaps *outcaps; GstStructure *s, *out_s; gint fps_n, fps_d; gint par_width, par_height; const gchar *fmt = NULL; if (!self->input_caps) return FALSE; if (self->fixed_caps) return TRUE; s = gst_caps_get_structure (self->input_caps, 0); outcaps = gst_pad_get_allowed_caps (self->srcpad); if (outcaps && !(self->output_caps && gst_caps_is_strictly_equal (outcaps, self->output_caps))) { GST_DEBUG_OBJECT (self, "Downstream allowed caps: %s", gst_caps_to_string (outcaps)); out_s = gst_caps_get_structure (outcaps, 0); fmt = gst_structure_get_string (out_s, "format"); if (out_s && gst_structure_get_int (out_s, "width", &self->output_width) && gst_structure_get_int (out_s, "height", &self->output_height) && fmt) { self->output_fourcc = GST_STR_FOURCC (fmt); GST_DEBUG_OBJECT (self, "Using downstream caps, fixed_caps = TRUE"); self->fixed_caps = TRUE; self->output_framerate_d = self->output_framerate_n = 0; if (gst_structure_get_fraction (out_s, "framerate", &fps_n, &fps_d)) { self->output_framerate_d = fps_d; self->output_framerate_n = fps_n; } } } /*------------------------Kishan------------------------*/ #ifdef ADD_GST_VPE_RGB_SUPPORT else { if (self->output_caps) { out_s = gst_caps_get_structure (self->output_caps, 0); fmt = gst_structure_get_string (out_s, "format"); } } #endif /*------------------------------------------------------*/ if (!self->fixed_caps) { if (self->input_crop.c.width && self->interlaced) { /* Ducati decoder had the habit of setting height as half frame hight for * interlaced streams */ self->output_height = (self->interlaced) ? self->input_crop.c.height * 2 : self->input_crop.c.height; self->output_width = self->input_crop.c.width; } else if(self->input_crop.c.width) { /* This is added to make sure that input and output width/height won't match and vpe is not is in passthrough mode. 
This code is added as a workaround for playback with sub-title usecase */ self->output_height = ALIGN2(self->input_crop.c.height, 1); self->output_width = ALIGN2(self->input_crop.c.width, 4); } else { self->output_height = self->input_height; self->output_width = self->input_width; } /*------------------------Kishan------------------------*/ #ifndef ADD_GST_VPE_RGB_SUPPORT self->output_fourcc = GST_MAKE_FOURCC ('N', 'V', '1', '2'); #else if (fmt) self->output_fourcc = GST_STR_FOURCC (fmt); else self->output_fourcc = GST_MAKE_FOURCC ('N', 'V', '1', '2'); #endif /*------------------------------------------------------*/ } self->passthrough = !(self->interlaced || self->output_width != self->input_width || self->output_height != self->input_height || self->output_fourcc != self->input_fourcc); GST_DEBUG_OBJECT (self, "Passthrough = %s", self->passthrough ? "TRUE" : "FALSE"); gst_caps_unref (outcaps); outcaps = gst_caps_new_simple ("video/x-raw", "format", G_TYPE_STRING, gst_video_format_to_string #ifndef ADD_GST_VPE_RGB_SUPPORT (gst_video_format_from_fourcc (self->output_fourcc)), NULL); #else (gst_vpe_video_format_from_fourcc (self->output_fourcc)), NULL); //Kishan #endif out_s = gst_caps_get_structure (outcaps, 0); gst_structure_set (out_s, "width", G_TYPE_INT, self->output_width, "height", G_TYPE_INT, self->output_height, NULL); /* Set interlace-mode in the outcaps to match the capabilities with videoscale element when playbin is used.if not set, buffer copy happens due to mismatch */ gst_structure_set (out_s, "interlace-mode", G_TYPE_STRING, "progressive", NULL); if (gst_structure_get_fraction (s, "pixel-aspect-ratio", &par_width, &par_height)) gst_structure_set (out_s, "pixel-aspect-ratio", GST_TYPE_FRACTION, par_width, par_height, NULL); self->input_framerate_d = self->input_framerate_n = 0; if (gst_structure_get_fraction (s, "framerate", &fps_n, &fps_d)) { self->input_framerate_d = fps_d; self->input_framerate_n = fps_n; } if (self->output_framerate_d == 0) { self->output_framerate_d = self->input_framerate_d; self->output_framerate_n = self->input_framerate_n; } if (self->output_framerate_d != 0) { gst_structure_set (out_s, "framerate", GST_TYPE_FRACTION, self->output_framerate_n, self->output_framerate_d, NULL); } if (self->output_caps) gst_caps_unref (self->output_caps); self->output_caps = outcaps; self->output_repeat_rate = 1; if (self->output_framerate_n != 0 && self->output_framerate_d != 0 && self->input_framerate_n != 0 && self->input_framerate_d != 0) { self->output_repeat_rate = (gint) round (((double) self->output_framerate_n * (double) self->input_framerate_d) / ((double) self->input_framerate_n * (double) self->output_framerate_d)); } GST_DEBUG_OBJECT (self, "framerate conversion: from %d/%d to %d/%d, repeat_factor: %d\n", self->input_framerate_n, self->input_framerate_d, self->output_framerate_n, self->output_framerate_d, self->output_repeat_rate); return TRUE; } static gboolean gst_vpe_init_output_buffers (GstVpe * self) { int i; GstBuffer *buf; if (!self->output_caps) { GST_DEBUG_OBJECT (self, "Output caps should be set before init output buffer"); return FALSE; } self->output_pool = gst_vpe_buffer_pool_new (TRUE, self->num_output_buffers, self->num_output_buffers, V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE, self->output_caps, NULL, NULL); if (!self->output_pool) { return FALSE; } self->output_pool->format = &self->output_format; for (i = 0; i < self->num_output_buffers; i++) { buf = gst_vpe_buffer_new (self->output_pool, self->dev, self->output_fourcc, self->output_width, 
self->output_height, i, V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE); if (!buf) { return FALSE; } gst_vpe_buffer_pool_put (self->output_pool, buf); /* gst_vpe_buffer_pool_put keeps a reference of the buffer, * so, unref ours gst_buffer_unref (GST_BUFFER (buf)); */ } return TRUE; } static int gst_vpe_fourcc_to_pixelformat (guint32 fourcc) { switch (fourcc) { case GST_MAKE_FOURCC ('Y', 'U', 'Y', '2'): case GST_MAKE_FOURCC ('Y', 'U', 'Y', 'V'): return V4L2_PIX_FMT_YUYV; case GST_MAKE_FOURCC ('N', 'V', '1', '2'): return V4L2_PIX_FMT_NV12; #ifdef ADD_GST_VPE_RGB_SUPPORT case GST_MAKE_FOURCC ('B', 'G', 'R', '3'): return V4L2_PIX_FMT_BGR24; case GST_MAKE_FOURCC ('R', 'G', 'B', '3'): case GST_MAKE_FOURCC ('R', 'G', 'B', '\0'): return V4L2_PIX_FMT_RGB24; #endif } return -1; } static gboolean gst_vpe_output_set_fmt (GstVpe * self) { struct v4l2_format fmt; int ret; // V4L2 Stuff bzero (&fmt, sizeof (fmt)); fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE; fmt.fmt.pix_mp.width = self->output_width; fmt.fmt.pix_mp.height = self->output_height; fmt.fmt.pix_mp.pixelformat = gst_vpe_fourcc_to_pixelformat (self->output_fourcc); fmt.fmt.pix_mp.field = V4L2_FIELD_ANY; fmt.fmt.pix_mp.num_planes = 1; GST_DEBUG_OBJECT (self, "vpe: output S_FMT image: %dx%d", fmt.fmt.pix_mp.width, fmt.fmt.pix_mp.height); ret = ioctl (self->video_fd, VIDIOC_S_FMT, &fmt); if (ret < 0) { GST_ERROR_OBJECT (self, "VIDIOC_S_FMT failed"); return FALSE; } else { GST_DEBUG_OBJECT (self, "sizeimage[0] = %d, sizeimage[1] = %d", fmt.fmt.pix_mp.plane_fmt[0].sizeimage, fmt.fmt.pix_mp.plane_fmt[1].sizeimage); /* Save a copy of the current format for this pool */ self->output_format = fmt; } return TRUE; } static GstBuffer * gst_vpe_alloc_inputbuffer (void *ctx, int index) { GstVpe *self = (GstVpe *) ctx; return gst_vpe_buffer_new (self->input_pool, self->dev, self->input_fourcc, self->input_width, self->input_height, index, V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE); } static gboolean gst_vpe_init_input_buffers (GstVpe * self, gint min_num_input_buffers) { int i; GstBuffer *buf; if (!self->input_caps) { GST_DEBUG_OBJECT (self, "Input caps should be set before init input buffer"); return FALSE; } self->input_pool = gst_vpe_buffer_pool_new (FALSE, MAX_NUM_INBUFS, min_num_input_buffers, V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE, self->input_caps, gst_vpe_alloc_inputbuffer, self); if (!self->input_pool) { return FALSE; } self->input_pool->format = &self->input_format; return TRUE; } static gboolean gst_vpe_input_set_fmt (GstVpe * self) { struct v4l2_format fmt; int ret; struct v4l2_selection sel = { .type = V4L2_BUF_TYPE_VIDEO_OUTPUT, .target = V4L2_SEL_TGT_CROP_DEFAULT, }; struct v4l2_rect r; // V4L2 Stuff bzero (&fmt, sizeof (fmt)); fmt.type = V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE; fmt.fmt.pix_mp.width = self->input_width; fmt.fmt.pix_mp.pixelformat = gst_vpe_fourcc_to_pixelformat (self->input_fourcc); if (self->interlaced) { fmt.fmt.pix_mp.height = (self->input_height); fmt.fmt.pix_mp.field = V4L2_FIELD_SEQ_TB; } else { fmt.fmt.pix_mp.height = self->input_height; fmt.fmt.pix_mp.field = V4L2_FIELD_ANY; } fmt.fmt.pix_mp.num_planes = 1; GST_DEBUG_OBJECT (self, "input S_FMT field: %d, image: %dx%d, numbufs: %d", fmt.fmt.pix_mp.field, fmt.fmt.pix_mp.width, fmt.fmt.pix_mp.height, self->num_input_buffers); ret = ioctl (self->video_fd, VIDIOC_S_FMT, &fmt); if (ret < 0) { GST_ERROR_OBJECT (self, "VIDIOC_S_FMT failed"); return FALSE; } else { GST_DEBUG_OBJECT (self, "sizeimage[0] = %d, sizeimage[1] = %d", fmt.fmt.pix_mp.plane_fmt[0].sizeimage, fmt.fmt.pix_mp.plane_fmt[1].sizeimage); 
/* Save a copy of the current format for this pool */ self->input_format = fmt; } if (self->input_crop.c.width != 0) { GST_DEBUG_OBJECT (self, "Crop values: top: %d, left: %d, width: %d, height: %d", self->input_crop.c.top, self->input_crop.c.left, self->input_crop.c.width, self->input_crop.c.height); ret = ioctl (self->video_fd, VIDIOC_G_SELECTION, &sel); if (ret < 0) { GST_ERROR_OBJECT (self, "VIDIOC_G_SELECTION for crop failed"); return FALSE; } sel.target = V4L2_SEL_TGT_CROP; r.width = self->input_crop.c.width; r.height = self->input_crop.c.height; r.left = self->input_crop.c.left; r.top = self->input_crop.c.top; sel.r = r; ret = ioctl (self->video_fd, VIDIOC_S_SELECTION, &sel); if (ret < 0) { GST_ERROR_OBJECT (self, "VIDIOC_S_SELECTION for crop failed"); return FALSE; } } return TRUE; } static void gst_vpe_dequeue_loop (gpointer data) { GstVpe *self = (GstVpe *) data; GstBuffer *buf, *b; gint q_cnt; while (1) { buf = NULL; GST_OBJECT_LOCK (self); if (self->video_fd >= 0 && self->output_pool) (void) gst_buffer_pool_acquire_buffer (GST_BUFFER_POOL (self->output_pool), &buf, NULL); if (buf) { self->output_q_processing--; g_assert (self->output_q_processing >= 0); } GST_OBJECT_UNLOCK (self); if (buf) { for (q_cnt = 1; q_cnt < self->output_repeat_rate; q_cnt++) { b = gst_vpe_buffer_ref (self->output_pool, buf); if (b) gst_pad_push (self->srcpad, GST_BUFFER (b)); } GST_DEBUG_OBJECT (self, "push: %" GST_TIME_FORMAT " (ptr %p)", GST_TIME_ARGS (GST_BUFFER_PTS (buf)), buf); gst_pad_push (self->srcpad, GST_BUFFER (buf)); } else break; } GST_OBJECT_LOCK (self); if (self->video_fd >= 0 && self->input_pool) { while (NULL != (buf = gst_vpe_buffer_pool_dequeue (self->input_pool))) { self->input_q_depth--; g_assert (self->input_q_depth >= 0); gst_buffer_unref (buf); } while (((MAX_INPUT_Q_DEPTH - self->input_q_depth) >= 1) && (NULL != (buf = (GstBuffer *) g_queue_pop_head (&self->input_q))) && (TRUE == gst_vpe_buffer_pool_queue (self->input_pool, buf, &q_cnt))) { self->input_q_depth += q_cnt; if (self->interlaced) { self->output_q_processing += q_cnt * 2; } else { self->output_q_processing += q_cnt; } } } GST_OBJECT_UNLOCK (self); usleep (10000); } static void gst_vpe_print_driver_capabilities (GstVpe * self) { struct v4l2_capability cap; if (0 == ioctl (self->video_fd, VIDIOC_QUERYCAP, &cap)) { GST_DEBUG_OBJECT (self, "driver: '%s'", cap.driver); GST_DEBUG_OBJECT (self, "card: '%s'", cap.card); GST_DEBUG_OBJECT (self, "bus_info: '%s'", cap.bus_info); GST_DEBUG_OBJECT (self, "version: %08x", cap.version); GST_DEBUG_OBJECT (self, "capabilites: %08x", cap.capabilities); } else { GST_WARNING_OBJECT (self, "Cannot get V4L2 driver capabilites!"); } } static gboolean gst_vpe_create (GstVpe * self) { if (self->dev == NULL) { self->dev = dce_init (); if (self->dev == NULL) { GST_ERROR_OBJECT (self, "dce_init() failed"); return FALSE; } GST_DEBUG_OBJECT (self, "dce_init() done"); } return TRUE; } static gboolean gst_vpe_init_input_bufs (GstVpe * self, GstCaps * input_caps) { gint min_num_input_buffers; if (!gst_vpe_create (self)) { return FALSE; } if (input_caps && !gst_vpe_parse_input_caps (self, input_caps)) { GST_ERROR_OBJECT (self, "Could not parse/set caps"); return FALSE; } if (self->num_input_buffers) { min_num_input_buffers = self->num_input_buffers; } else if (self->input_max_ref_frames) { min_num_input_buffers = MAX (6, self->input_max_ref_frames); } else { min_num_input_buffers = DEFAULT_NUM_INBUFS; } if (min_num_input_buffers > MAX_NUM_INBUFS) min_num_input_buffers = MAX_NUM_INBUFS; 
GST_DEBUG_OBJECT (self, "Using min input buffers: %d", min_num_input_buffers); GST_DEBUG_OBJECT (self, "parse/set caps done"); if (self->input_pool == NULL) { if (!gst_vpe_init_input_buffers (self, min_num_input_buffers)) { GST_ERROR_OBJECT (self, "gst_vpe_init_input_buffers failed"); return FALSE; } GST_DEBUG_OBJECT (self, "gst_vpe_init_input_buffers done"); } else { gst_vpe_buffer_pool_set_min_buffer_count (self->input_pool, min_num_input_buffers); } return TRUE; } static void gst_vpe_set_streaming (GstVpe * self, gboolean streaming) { gboolean ret; GstBuffer *buf; if (streaming) { if (self->video_fd < 0) { GST_DEBUG_OBJECT (self, "Calling open(%s)", self->device); self->video_fd = open (self->device, O_RDWR | O_NONBLOCK); if (self->video_fd < 0) { GST_ERROR_OBJECT (self, "Cant open %s", self->device); return; } GST_DEBUG_OBJECT (self, "Opened %s", self->device); gst_vpe_print_driver_capabilities (self); /* Call V4L2 S_FMT for input and output */ gst_vpe_input_set_fmt (self); gst_vpe_output_set_fmt (self); if (!gst_vpe_init_input_bufs (self, NULL)) { GST_ERROR_OBJECT (self, "gst_vpe_init_input_bufs failed"); } if (self->input_pool) gst_vpe_buffer_pool_set_streaming (self->input_pool, self->video_fd, streaming, self->interlaced); self->output_q_processing = 0; if (!self->output_pool) { if (!gst_vpe_init_output_buffers (self)) { GST_ERROR_OBJECT (self, "gst_vpe_init_output_buffers failed"); } GST_DEBUG_OBJECT (self, "gst_vpe_init_output_buffers done"); } if (self->output_pool) gst_vpe_buffer_pool_set_streaming (self->output_pool, self->video_fd, streaming, FALSE); self->input_q_depth = 0; } else { GST_DEBUG_OBJECT (self, "streaming already on"); } } else { if (self->video_fd >= 0) { while (NULL != (buf = (GstBuffer *) g_queue_pop_head (&self->input_q))) { gst_buffer_unref (buf); } if (self->input_pool) gst_vpe_buffer_pool_set_streaming (self->input_pool, self->video_fd, streaming, self->interlaced); if (self->output_pool) gst_vpe_buffer_pool_set_streaming (self->output_pool, self->video_fd, streaming, FALSE); close (self->video_fd); self->video_fd = -1; } else { GST_DEBUG_OBJECT (self, "streaming already off"); } } } static gboolean gst_vpe_start (GstVpe * self, GstCaps * input_caps) { if (!gst_vpe_init_input_bufs (self, input_caps)) { GST_ERROR_OBJECT (self, "gst_vpe_init_input_bufs failed"); return FALSE; } if (!self->output_pool) { if (!gst_vpe_set_output_caps (self)) { GST_ERROR_OBJECT (self, "gst_vpe_set_output_caps failed"); return FALSE; } } self->state = GST_VPE_ST_ACTIVE; return TRUE; } static void gst_vpe_destroy (GstVpe * self) { gst_vpe_set_streaming (self, FALSE); if (self->input_caps) gst_caps_unref (self->input_caps); self->input_caps = NULL; if (self->output_caps) gst_caps_unref (self->output_caps); self->output_caps = NULL; self->fixed_caps = FALSE; if (self->input_pool) { gst_vpe_buffer_pool_destroy (self->input_pool); GST_DEBUG_OBJECT (self, "gst_vpe_buffer_pool_destroy(input) done"); } self->input_pool = NULL; if (self->output_pool) { gst_vpe_buffer_pool_destroy (self->output_pool); GST_DEBUG_OBJECT (self, "gst_vpe_buffer_pool_destroy(output) done"); } self->output_pool = NULL; if (self->video_fd >= 0) close (self->video_fd); self->video_fd = -1; if (self->dev) dce_deinit (self->dev); GST_DEBUG_OBJECT (self, "dce_deinit done"); gst_segment_init (&self->segment, GST_FORMAT_UNDEFINED); self->dev = NULL; self->input_width = 0; self->input_height = 0; self->input_max_ref_frames = 0; self->output_width = 0; self->output_height = 0; self->input_crop.c.top = 0; 
self->input_crop.c.left = 0; self->input_crop.c.width = 0; self->input_crop.c.height = 0; self->output_framerate_d = 0; self->output_repeat_rate = 1; if (self->device) g_free (self->device); self->device = NULL; } static gboolean gst_vpe_activate_mode (GstPad * pad, GstObject * parent, GstPadMode mode, gboolean active) { if (mode == GST_PAD_MODE_PUSH) { gboolean result = TRUE; GstVpe *self; self = GST_VPE (parent); GST_DEBUG_OBJECT (self, "gst_vpe_activate_mode (active = %d)", active); if (!active) { result = gst_pad_stop_task (self->srcpad); GST_DEBUG_OBJECT (self, "task gst_vpe_dequeue_loop stopped"); } else { result = gst_pad_start_task (self->srcpad, gst_vpe_dequeue_loop, self, NULL); GST_DEBUG_OBJECT (self, "gst_pad_start_task returned %d", result); } return result; } return FALSE; } static gboolean gst_vpe_sink_setcaps (GstPad * pad, GstCaps * caps) { gboolean ret = TRUE; GstStructure *s; GstVpe *self = GST_VPE (gst_pad_get_parent (pad)); if (caps) { GST_OBJECT_LOCK (self); if (TRUE == (ret = gst_vpe_parse_input_caps (self, caps))) { ret = gst_vpe_set_output_caps (self); } GST_OBJECT_UNLOCK (self); if (TRUE == ret) { gst_pad_set_caps (self->srcpad, self->output_caps); } GST_INFO_OBJECT (self, "set caps done %d", ret); } gst_object_unref (self); return ret; } static GstCaps * gst_vpe_getcaps (GstPad * pad) { GstCaps *caps = NULL; caps = gst_pad_get_current_caps (pad); if (caps == NULL) { GstCaps *fil = gst_pad_get_pad_template_caps (pad); return fil; } else { return caps; } } static gboolean gst_vpe_query (GstPad * pad, GstObject * parent, GstQuery * query) { GstVpe *self = GST_VPE (parent); switch (GST_QUERY_TYPE (query)) { case GST_QUERY_CAPS: { GstCaps *caps; caps = gst_vpe_getcaps (pad); gst_query_set_caps_result (query, caps); return TRUE; break; } case GST_QUERY_ALLOCATION: { GstCaps *caps; gst_query_parse_allocation (query, &caps, NULL); if (caps == NULL) return FALSE; GST_OBJECT_LOCK (self); if (G_UNLIKELY (self->state == GST_VPE_ST_DEINIT)) { GST_OBJECT_UNLOCK (self); GST_WARNING_OBJECT (self, "Plugin is shutting down, returning FALSE"); return FALSE; } if (!gst_vpe_init_input_bufs (self, caps)) { GST_OBJECT_UNLOCK (self); return FALSE; } gst_query_add_allocation_pool (query, GST_BUFFER_POOL (self->input_pool), 1, 0, self->num_input_buffers); gst_query_add_allocation_param (query, gst_drm_allocator_get (), NULL); GST_OBJECT_UNLOCK (self); gst_caps_unref (caps); return TRUE; break; } case GST_QUERY_LATENCY: /* TODO: */ break; default: break; } return gst_pad_query_default (pad, parent, query); } static GstFlowReturn gst_vpe_chain (GstPad * pad, GstObject * parent, GstBuffer * buf) { GstVpe *self = GST_VPE (parent); gint q_cnt; GstVPEBufferPriv *vpe_buf; GST_DEBUG_OBJECT (self, "chain: %" GST_TIME_FORMAT " ( ptr %p)", GST_TIME_ARGS (GST_BUFFER_PTS (buf)), buf); GST_OBJECT_LOCK (self); if (G_UNLIKELY (self->state != GST_VPE_ST_ACTIVE && self->state != GST_VPE_ST_STREAMING)) { if (self->state == GST_VPE_ST_DEINIT) { GST_OBJECT_UNLOCK (self); GST_WARNING_OBJECT (self, "Plugin is shutting down, freeing buffer: %p", buf); gst_buffer_unref (buf); return GST_FLOW_OK; } else { if (self->input_crop.c.width == 0) { GstVideoCropMeta *crop = gst_buffer_get_video_crop_meta (buf); if (crop) { self->input_crop.c.left = crop->x; self->input_crop.c.top = crop->y; self->input_crop.c.width = crop->width; self->input_crop.c.height = crop->height; } } if (gst_vpe_start (self, gst_pad_get_current_caps (pad))) { GST_OBJECT_UNLOCK (self); /* Set output caps, this should be done outside the lock */ 
gst_pad_set_caps (self->srcpad, self->output_caps); GST_OBJECT_LOCK (self); } else { GST_OBJECT_UNLOCK (self); return GST_FLOW_ERROR; } } } if (self->passthrough) { GST_OBJECT_UNLOCK (self); GST_DEBUG_OBJECT (self, "Passthrough for VPE"); return gst_pad_push (self->srcpad, buf); } vpe_buf = gst_buffer_get_vpe_buffer_priv (self->input_pool, buf); if (!vpe_buf) { GST_DEBUG_OBJECT (self, "Importing buffer not allocated by self %p", buf); GstBuffer *in = gst_vpe_buffer_pool_import (self->input_pool, buf); if (in) { vpe_buf = gst_buffer_get_vpe_buffer_priv (self->input_pool, buf); } } if (vpe_buf) { if (G_UNLIKELY (self->state != GST_VPE_ST_STREAMING)) { gst_vpe_set_streaming (self, TRUE); self->state = GST_VPE_ST_STREAMING; } if ((MAX_INPUT_Q_DEPTH - self->input_q_depth) >= 1) { GST_DEBUG_OBJECT (self, "Push the buffer into the V4L2 driver %d", self->input_q_depth); if (TRUE != gst_vpe_buffer_pool_queue (self->input_pool, buf, &q_cnt)) { GST_OBJECT_UNLOCK (self); return GST_FLOW_ERROR; } self->input_q_depth += q_cnt; if (self->interlaced) { self->output_q_processing += q_cnt * 2; } else { self->output_q_processing += q_cnt; } } else { g_queue_push_tail (&self->input_q, (gpointer) buf); } } else { GST_DEBUG_OBJECT (self, "Unref the buffer if not pushed to driver %p", buf); gst_buffer_unref(buf); } GST_OBJECT_UNLOCK (self); /* Allow dequeue thread to run */ sched_yield (); return GST_FLOW_OK; } static gboolean gst_vpe_event (GstPad * pad, GstObject * parent, GstEvent * event) { GstVpe *self = GST_VPE (parent); gboolean ret = TRUE; GST_DEBUG_OBJECT (self, "begin: event=%s", GST_EVENT_TYPE_NAME (event)); switch (GST_EVENT_TYPE (event)) { case GST_EVENT_CAPS: { GstCaps *caps; gst_event_parse_caps (event, &caps); gst_event_unref (event); return gst_vpe_sink_setcaps (pad, caps); break; } case GST_EVENT_SEGMENT: { gst_event_copy_segment (event, &self->segment); if (self->segment.format == GST_FORMAT_TIME && self->segment.rate < (gdouble) 0.0) { GST_OBJECT_LOCK (self); /* In case of reverse playback, more input buffers are required */ if (!gst_vpe_init_input_bufs (self, NULL)) { GST_ERROR_OBJECT (self, "gst_vpe_init_input_bufs failed"); } GST_OBJECT_UNLOCK (self); } } break; case GST_EVENT_EOS: while (1) { GST_OBJECT_LOCK (self); if (TRUE != g_queue_is_empty (&self->input_q)) { GST_DEBUG_OBJECT (self, "Buffers to be pushed into the V4L2 driver %d", self->input_q_depth); } else if (0 != self->output_q_processing) { GST_DEBUG_OBJECT (self, "Buffers to be processed by the V4L2 driver %d", self->output_q_processing); } else { GST_OBJECT_UNLOCK (self); GST_DEBUG_OBJECT (self, "VPE ready for EOS"); break; } GST_OBJECT_UNLOCK (self); usleep (10000); } break; case GST_EVENT_FLUSH_STOP: GST_OBJECT_LOCK (self); self->state = GST_VPE_ST_INIT; GST_OBJECT_UNLOCK (self); break; case GST_EVENT_FLUSH_START: GST_OBJECT_LOCK (self); gst_vpe_set_streaming (self, FALSE); self->state = GST_VPE_ST_DEINIT; GST_OBJECT_UNLOCK (self); break; default: break; } ret = gst_pad_push_event (self->srcpad, event); GST_DEBUG_OBJECT (self, "end ret=%d", ret); return ret; } static gboolean gst_vpe_src_event (GstPad * pad, GstObject * parent, GstEvent * event) { GstVpe *self = GST_VPE (parent); gboolean ret = TRUE; GST_DEBUG_OBJECT (self, "begin: event=%s", GST_EVENT_TYPE_NAME (event)); switch (GST_EVENT_TYPE (event)) { case GST_EVENT_QOS: // TODO or not!! 
ret = gst_pad_push_event (self->sinkpad, event); break; default: ret = gst_pad_push_event (self->sinkpad, event); break; } GST_DEBUG_OBJECT (self, "end"); return ret; } static GstStateChangeReturn gst_vpe_change_state (GstElement * element, GstStateChange transition) { GstStateChangeReturn ret = GST_STATE_CHANGE_SUCCESS; GstVpe *self = GST_VPE (element); gboolean supported; GST_DEBUG_OBJECT (self, "begin: changing state %s -> %s", gst_element_state_get_name (GST_STATE_TRANSITION_CURRENT (transition)), gst_element_state_get_name (GST_STATE_TRANSITION_NEXT (transition))); switch (transition) { case GST_STATE_CHANGE_READY_TO_PAUSED: GST_OBJECT_LOCK (self); self->state = GST_VPE_ST_INIT; GST_OBJECT_UNLOCK (self); break; default: break; } ret = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); GST_DEBUG_OBJECT (self, "parent state change returned: %d", ret); if (ret == GST_STATE_CHANGE_FAILURE) goto leave; switch (transition) { case GST_STATE_CHANGE_PAUSED_TO_READY: GST_OBJECT_LOCK (self); gst_vpe_set_streaming (self, FALSE); self->state = GST_VPE_ST_DEINIT; gst_vpe_destroy (self); GST_OBJECT_UNLOCK (self); break; default: break; } leave: GST_DEBUG_OBJECT (self, "end"); return ret; } /* GObject vmethod implementations */ static void gst_vpe_get_property (GObject * obj, guint prop_id, GValue * value, GParamSpec * pspec) { GstVpe *self = GST_VPE (obj); switch (prop_id) { case PROP_NUM_INPUT_BUFFERS: g_value_set_int (value, self->num_input_buffers); break; case PROP_NUM_OUTPUT_BUFFERS: g_value_set_int (value, self->num_output_buffers); break; case PROP_DEVICE: g_value_set_string (value, self->device); break; default: { G_OBJECT_WARN_INVALID_PROPERTY_ID (obj, prop_id, pspec); break; } } } static void gst_vpe_set_property (GObject * obj, guint prop_id, const GValue * value, GParamSpec * pspec) { GstVpe *self = GST_VPE (obj); switch (prop_id) { case PROP_NUM_INPUT_BUFFERS: self->num_input_buffers = g_value_get_int (value); break; case PROP_NUM_OUTPUT_BUFFERS: self->num_output_buffers = g_value_get_int (value); break; case PROP_DEVICE: g_free (self->device); self->device = g_value_dup_string (value); break; default: { G_OBJECT_WARN_INVALID_PROPERTY_ID (obj, prop_id, pspec); break; } } } static void gst_vpe_finalize (GObject * obj) { GstVpe *self = GST_VPE (obj); GST_OBJECT_LOCK (self); gst_vpe_destroy (self); GST_OBJECT_UNLOCK (self); G_OBJECT_CLASS (parent_class)->finalize (obj); } static void gst_vpe_base_init (gpointer gclass) { GstElementClass *element_class = GST_ELEMENT_CLASS (gclass); gst_element_class_set_static_metadata (element_class, "vpe", "Filter/Converter/Video", "Video processing adapter", "Harinarayan Bhatta <harinarayan@ti.com>"); gst_element_class_add_pad_template (element_class, gst_static_pad_template_get (&src_factory)); gst_element_class_add_pad_template (element_class, gst_static_pad_template_get (&sink_factory)); } static void gst_vpe_class_init (GstVpeClass * klass) { GObjectClass *gobject_class = G_OBJECT_CLASS (klass); GstElementClass *gstelement_class = GST_ELEMENT_CLASS (klass); parent_class = g_type_class_peek_parent (klass); gobject_class->get_property = GST_DEBUG_FUNCPTR (gst_vpe_get_property); gobject_class->set_property = GST_DEBUG_FUNCPTR (gst_vpe_set_property); gobject_class->finalize = GST_DEBUG_FUNCPTR (gst_vpe_finalize); gstelement_class->change_state = GST_DEBUG_FUNCPTR (gst_vpe_change_state); g_object_class_install_property (gobject_class, PROP_NUM_INPUT_BUFFERS, g_param_spec_int ("num-input-buffers", "Number of input buffers that are 
allocated and used by this plugin.", "The number if input buffers allocated should be specified based on " "the upstream element's requirement. For example, if gst-ducati-plugin " "is the upstream element, this value should be based on max-reorder-frames " "property of that element. 0 => decide automatically", 0, MAX_NUM_INBUFS, DEFAULT_NUM_INBUFS, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_NUM_OUTPUT_BUFFERS, g_param_spec_int ("num-output-buffers", "Number of output buffers that are allocated and used by this plugin.", "The number if output buffers allocated should be specified based on " "the downstream element's requirement. It is generally set to the minimum " "value acceptable to the downstream element to reduce memory usage.", 3, MAX_NUM_OUTBUFS, DEFAULT_NUM_OUTBUFS, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_DEVICE, g_param_spec_string ("device", "Device", "Device location", DEFAULT_DEVICE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); } static void gst_vpe_init (GstVpe * self, gpointer klass) { GstElementClass *gstelement_class = GST_ELEMENT_CLASS (klass); self->sinkpad = gst_pad_new_from_static_template (&sink_factory, "sink"); gst_pad_set_chain_function (self->sinkpad, GST_DEBUG_FUNCPTR (gst_vpe_chain)); gst_pad_set_event_function (self->sinkpad, GST_DEBUG_FUNCPTR (gst_vpe_event)); self->srcpad = gst_pad_new_from_static_template (&src_factory, "src"); gst_pad_set_event_function (self->srcpad, GST_DEBUG_FUNCPTR (gst_vpe_src_event)); gst_pad_set_query_function (self->srcpad, GST_DEBUG_FUNCPTR (gst_vpe_query)); gst_pad_set_query_function (self->sinkpad, GST_DEBUG_FUNCPTR (gst_vpe_query)); gst_pad_set_activatemode_function (self->srcpad, gst_vpe_activate_mode); gst_element_add_pad (GST_ELEMENT (self), self->sinkpad); gst_element_add_pad (GST_ELEMENT (self), self->srcpad); self->input_width = 0; self->input_height = 0; self->input_max_ref_frames = 0; self->input_crop.c.top = 0; self->input_crop.c.left = 0; self->input_crop.c.width = 0; self->input_crop.c.height = 0; self->input_crop.type = V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE; self->interlaced = FALSE; self->state = GST_VPE_ST_INIT; self->passthrough = TRUE; self->input_pool = NULL; self->output_pool = NULL; self->dev = NULL; self->video_fd = -1; self->input_caps = NULL; self->output_caps = NULL; self->fixed_caps = FALSE; self->num_input_buffers = DEFAULT_NUM_INBUFS; self->num_output_buffers = DEFAULT_NUM_OUTBUFS; self->output_framerate_d = 0; self->output_repeat_rate = 1; self->device = g_strdup (DEFAULT_DEVICE); g_queue_init (&self->input_q); self->input_q_depth = 0; self->output_q_processing = 0; gst_segment_init (&self->segment, GST_FORMAT_UNDEFINED); } GST_DEBUG_CATEGORY (gst_vpe_debug); #include "gstvpebins.h" static gboolean plugin_init (GstPlugin * plugin) { GST_DEBUG_CATEGORY_INIT (gst_vpe_debug, "vpe", 0, "vpe"); return (gst_element_register (plugin, "vpe", GST_RANK_NONE, GST_TYPE_VPE)) && gst_element_register (plugin, "ducatih264decvpe", GST_RANK_PRIMARY + 2, gst_vpe_ducatih264dec_get_type ()) && gst_element_register (plugin, "ducatimpeg2decvpe", GST_RANK_PRIMARY + 2, gst_vpe_ducatimpeg2dec_get_type ()) && gst_element_register (plugin, "ducatimpeg4decvpe", GST_RANK_PRIMARY + 2, gst_vpe_ducatimpeg4dec_get_type ()) && gst_element_register (plugin, "ducatijpegdecvpe", GST_RANK_PRIMARY + 2, gst_vpe_ducatijpegdec_get_type ()) && gst_element_register (plugin, "ducativc1decvpe", GST_RANK_PRIMARY + 2, 
gst_vpe_ducativc1dec_get_type ()); } /* PACKAGE: this is usually set by autotools depending on some _INIT macro * in configure.ac and then written into and defined in config.h, but we can * just set it ourselves here in case someone doesn't use autotools to * compile this code. GST_PLUGIN_DEFINE needs PACKAGE to be defined. */ #ifndef PACKAGE #define PACKAGE "vpeplugin" #endif GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, vpeplugin, "Hardware accelerated video porst-processing using TI VPE (V4L2-M2M) driver on DRA7x SoC", plugin_init, VERSION, "LGPL", "GStreamer", "http://gstreamer.net/")
/* * GStreamer * Copyright (c) 2014, Texas Instruments Incorporated * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation * version 2.1 of the License. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details. * * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA */ #ifdef HAVE_CONFIG_H #include <config.h> #endif #include "gstvpe.h" #include <unistd.h> #define AD_GST_VPE_RGB_FORMAT //Kishan GstVPEBufferPriv * gst_buffer_get_vpe_buffer_priv (GstVpeBufferPool * pool, GstBuffer * buf) { int fd_copy; GstMemory *mem; GstVPEBufferPriv *vpemeta; mem = gst_buffer_peek_memory (buf, 0); fd_copy = gst_fd_memory_get_fd (mem); vpemeta = g_hash_table_lookup (pool->vpebufferpriv, (gpointer) fd_copy); return vpemeta; } GstBuffer * gst_vpe_buffer_new (GstVpeBufferPool * pool, struct omap_device * dev, guint32 fourcc, gint width, gint height, int index, guint32 v4l2_type) { GstVPEBufferPriv *vpemeta; GstVideoCropMeta *crop; int size = 0; GstBuffer *buf; GstAllocator *allocator = gst_drm_allocator_get (); buf = gst_buffer_new (); if (!buf) return NULL; switch (fourcc) { case GST_MAKE_FOURCC ('A', 'R', '2', '4'): size = width * height * 4; break; case GST_MAKE_FOURCC ('Y', 'U', 'Y', '2'): case GST_MAKE_FOURCC ('Y', 'U', 'Y', 'V'): size = width * height * 2; break; #ifdef AD_GST_VPE_RGB_FORMAT case GST_MAKE_FOURCC ('B', 'G', 'R', '3'): case GST_MAKE_FOURCC ('R', 'G', 'B', '3'): case GST_MAKE_FOURCC ('R', 'G', 'B', '\0'): size = width * height * 3; break; #endif case GST_MAKE_FOURCC ('N', 'V', '1', '2'): size = (width * height * 3) / 2; break; } gst_buffer_append_memory (buf, gst_allocator_alloc (allocator, size, NULL)); vpemeta = gst_vpe_buffer_priv (pool, dev, fourcc, width, height, index, v4l2_type, buf); if (!vpemeta) { VPE_ERROR ("Failed to add vpe metadata"); gst_buffer_unref (buf); return NULL; } /* attach dmabuf handle to buffer so that elements from other * plugins can access for zero copy hw accel: */ crop = gst_buffer_add_video_crop_meta (buf); if (!crop) { VPE_DEBUG ("Failed to add crop meta to buffer"); } else { crop->x = 0; crop->y = 0; crop->height = height; crop->width = width; } VPE_DEBUG ("Allocated a new VPE buffer, %dx%d, index: %d, type: %d", width, height, index, v4l2_type); return buf; } GstBuffer * gst_vpe_buffer_ref (GstVpeBufferPool * pool, GstBuffer * in) { GstVPEBufferPriv *vpemeta; GstVideoCropMeta *crop, *incrop; int size; GstBuffer *buf; GstAllocator *allocator = gst_drm_allocator_get (); int fd_copy; GstMemory *mem; buf = gst_buffer_new (); if (!buf) return NULL; mem = gst_buffer_peek_memory (in, 0); fd_copy = gst_fd_memory_get_fd (mem); vpemeta = g_hash_table_lookup (pool->vpebufferpriv, (gpointer) fd_copy); if (!vpemeta) { VPE_ERROR ("Failed to get vpe metadata"); gst_buffer_unref (buf); return NULL; } gst_buffer_append_memory (buf, gst_buffer_get_memory (buf, 0)); /* attach dmabuf handle to buffer so that elements from other * plugins can access for zero copy hw accel: */ incrop = gst_buffer_get_video_crop_meta (in); if (incrop) { crop = gst_buffer_add_video_crop_meta (buf); if (!crop) { VPE_DEBUG 
("Failed to add crop meta to buffer"); } else { crop->x = incrop->x; crop->y = incrop->y; crop->height = incrop->height; crop->width = incrop->width; } } return buf; } GstBuffer * gst_vpe_buffer_import (GstVpeBufferPool * pool, struct omap_device * dev, guint32 fourcc, gint width, gint height, int index, guint32 v4l2_type, GstBuffer * buf) { GstVPEBufferPriv *vpemeta; VPE_DEBUG ("Importing buffer"); vpemeta = gst_vpe_buffer_priv (pool, dev, fourcc, width, height, index, v4l2_type, buf); if (!vpemeta) { VPE_ERROR ("Failed to add vpe metadata"); gst_buffer_unref (buf); return NULL; } return buf; } GstVPEBufferPriv * gst_vpe_buffer_priv (GstVpeBufferPool * pool, struct omap_device * dev, guint32 fourcc, gint width, gint height, int index, guint32 v4l2_type, GstBuffer * buf) { GstVPEBufferPriv *vpebuf = g_malloc0 (sizeof (GstVPEBufferPriv)); int fd_copy; GstMemory *mem; if (!vpebuf) goto fail; mem = gst_buffer_peek_memory (buf, 0); fd_copy = gst_fd_memory_get_fd (mem); vpebuf->size = 0; vpebuf->bo = NULL; memset (&vpebuf->v4l2_buf, 0, sizeof (vpebuf->v4l2_buf)); memset (&vpebuf->v4l2_planes, 0, sizeof (vpebuf->v4l2_planes)); vpebuf->v4l2_buf.type = v4l2_type; vpebuf->v4l2_buf.index = index; vpebuf->v4l2_buf.m.planes = vpebuf->v4l2_planes; vpebuf->v4l2_buf.memory = V4L2_MEMORY_DMABUF; switch (fourcc) { case GST_MAKE_FOURCC ('A', 'R', '2', '4'): vpebuf->size = width * height * 4; vpebuf->bo = omap_bo_from_dmabuf (dev, fd_copy); vpebuf->v4l2_buf.length = 1; vpebuf->v4l2_buf.m.planes[0].m.fd = fd_copy; break; case GST_MAKE_FOURCC ('Y', 'U', 'Y', '2'): case GST_MAKE_FOURCC ('Y', 'U', 'Y', 'V'): vpebuf->size = width * height * 2; vpebuf->bo = omap_bo_from_dmabuf (dev, fd_copy); vpebuf->v4l2_buf.length = 1; vpebuf->v4l2_buf.m.planes[0].m.fd = fd_copy; break; #ifdef AD_GST_VPE_RGB_FORMAT case GST_MAKE_FOURCC ('B', 'G', 'R', '3'): case GST_MAKE_FOURCC ('R', 'G', 'B', '3'): case GST_MAKE_FOURCC ('R', 'G', 'B', '\0'): vpebuf->size = width * height * 3; vpebuf->bo = omap_bo_from_dmabuf (dev, fd_copy); vpebuf->v4l2_buf.length = 1; vpebuf->v4l2_buf.m.planes[0].m.fd = fd_copy; break; #endif case GST_MAKE_FOURCC ('N', 'V', '1', '2'): vpebuf->size = (width * height * 3) / 2; vpebuf->bo = omap_bo_from_dmabuf (dev, fd_copy); vpebuf->v4l2_buf.length = 1; vpebuf->v4l2_buf.m.planes[0].m.fd = fd_copy; break; default: VPE_ERROR ("invalid format: 0x%08x", fourcc); goto fail; } g_hash_table_insert (pool->vpebufferpriv, (gpointer) fd_copy, vpebuf); return vpebuf; fail: gst_buffer_unref (buf); return NULL; }
I have modified the source files, and when I execute the "" command I get the log below, which shows that the capabilities have been added. But I still cannot get output in RGB/BGR/RGB3/BGR3 format.
Logs:
========
gst-inspect-1.0 vpe
Factory Details:
Rank none (0)
Long-name vpe
Klass Filter/Converter/Video
Description Video processing adapter
Author Harinarayan Bhatta <harinarayan@ti.com>
Plugin Details:
Name vpeplugin
Description Hardware accelerated video porst-processing using TI VPE (V4L2-M2M) driver on DRA7x SoC
Filename /usr/lib/gstreamer-1.0/libgstvpe.so
Version 1.0.0
License LGPL
Source module gst-vpe
Binary package GStreamer
GObject
+----GInitiallyUnowned
+----GstObject
+----GstElement
+----GstVpe
Pad Templates:
SINK template: 'sink'
Availability: Always
Capabilities:
video/x-raw
format: NV12
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw
format: YUYV
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw
format: BGR3
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw
format: RGB3
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw
format: BGR
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw
format: RGB
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw
format: YUY2
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
SRC template: 'src'
Availability: Always
Capabilities:
video/x-raw
format: NV12
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw
format: YUYV
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw
format: BGR3
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw
format: RGB3
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw
format: BGR
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw
format: RGB
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw
format: YUY2
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
Element Flags:
no flags set
Element Implementation:
Has change_state() function: gst_vpe_change_state
Element has no clocking capabilities.
Element has no URI handling capabilities.
Pads:
SINK: 'sink'
Pad Template: 'sink'
SRC: 'src'
Pad Template: 'src'
Element Properties:
name : The name of the object
flags: readable, writable
String. Default: "vpe0"
parent : The parent of the object
flags: readable, writable
Object of type "GstObject"
num-input-buffers : The number if input buffers allocated should be specified based on the upstream element's requirement. For example, if gst-ducati-plugin is the upstream element, this value should be based on max-reorder-frames property of that element. 0 => decide automatically
flags: readable, writable
Integer. Range: 0 - 128 Default: 12
num-output-buffers : The number if output buffers allocated should be specified based on the downstream element's requirement. It is generally set to the minimum value acceptable to the downstream element to reduce memory usage.
flags: readable, writable
Integer. Range: 3 - 16 Default: 6
device : Device location
flags: readable, writable
String. Default: "/dev/v4l/by-path/platform-489d0000.vpe-video-index0"
root@am57xx-evm:~#
I am also attaching the modified source files and the debug logs from executing the command on the machine for "NV12" and "RGB".
Setting pipeline to PAUSED ... Pipeline is live and does not need PREROLL ... Progress: (open) Opening Stream Progress: (connect) Connecting to rtsp://888888:888888@192.168.1.10:554/cam/realmonitor?channel=1&subtype=1 --live --fps 25 Progress: (open) Retrieving server options Progress: (open) Retrieving media info Progress: (request) SETUP stream 0 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: latency = 2000 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: ntp-sync = false /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: rfc7273-sync = false /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: ntp-time-source = NTP time based on realtime clock /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: drop-on-latency = false /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: max-rtcp-rtp-time-diff = 1000 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: buffer-mode = Slave receiver to sender clock /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0: timeout = 5000000000 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1: caps = application/x-rtcp Progress: (open) Opened Stream Setting pipeline to PLAYING ... New clock: GstSystemClock /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: buffer-mode = Slave receiver to sender clock Progress: (request) Sending PLAY request Progress: (request) Sending PLAY request /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706155622, seqnum-base=(uint)5194, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 Progress: (request) Sent PLAY request /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1.GstPad:src: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager.GstGhostPad:recv_rtcp_sink_0.GstProxyPad:proxypad3: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstFunnel:funnel1.GstFunnelPad:funnelpad1: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager.GstGhostPad:recv_rtcp_sink_0: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0.GstPad:src: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706155622, seqnum-base=(uint)5194, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager.GstGhostPad:recv_rtp_sink_0.GstProxyPad:proxypad2: caps = application/x-rtp, 
media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706155622, seqnum-base=(uint)5194, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstFunnel:funnel0.GstFunnelPad:funnelpad0: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706155622, seqnum-base=(uint)5194, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager.GstGhostPad:recv_rtp_sink_0: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706155622, seqnum-base=(uint)5194, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0: timeout = 0 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstFunnel:funnel0.GstPad:src: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706155622, seqnum-base=(uint)5194, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0.GstPad:recv_rtp_src: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706155622, seqnum-base=(uint)5194, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706155622, seqnum-base=(uint)5194, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0.GstPad:recv_rtp_sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, 
sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706155622, seqnum-base=(uint)5194, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706155622, seqnum-base=(uint)5194, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706155622, seqnum-base=(uint)5194, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0164001effe100146764001eace8161fb011000003001e000007080401000468ee3cb0, level=(string)3, profile=(string)high /GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0164001effe100146764001eace8161fb011000003001e000007080401000468ee3cb0, level=(string)3, profile=(string)high /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706155622, seqnum-base=(uint)5194, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0.GstGhostPad:recv_rtp_src_0_4294945980_96.GstProxyPad:proxypad6: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706155622, seqnum-base=(uint)5194, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager.GstGhostPad:recv_rtp_src_0_4294945980_96.GstProxyPad:proxypad5: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706155622, seqnum-base=(uint)5194, npt-start=(guint64)0, 
play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)high, pixel-aspect-ratio=(fraction)1/1, width=(int)352, height=(int)240, framerate=(fraction)0/1, interlace-mode=(string)progressive, parsed=(boolean)true /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)high, pixel-aspect-ratio=(fraction)1/1, width=(int)352, height=(int)240, framerate=(fraction)0/1, interlace-mode=(string)progressive, parsed=(boolean)true /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstDucatiH264Dec:decoder.GstPad:src: caps = video/x-raw, format=(string)NV12, width=(int)512, height=(int)336, framerate=(fraction)0/1, pixel-aspect-ratio=(fraction)1/1, drm_mem=(boolean)true /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstVpe:vpe.GstPad:src: caps = video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0.GstGhostPad:src: caps = video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstWaylandSink:waylandsink0.GstPad:sink: caps = video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0.GstGhostPad:src.GstProxyPad:proxypad1: caps = video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstVpe:vpe.GstPad:sink: caps = video/x-raw, format=(string)NV12, width=(int)512, height=(int)336, framerate=(fraction)0/1, pixel-aspect-ratio=(fraction)1/1, drm_mem=(boolean)true /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstDucatiH264Dec:decoder.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)high, pixel-aspect-ratio=(fraction)1/1, width=(int)352, height=(int)240, framerate=(fraction)0/1, interlace-mode=(string)progressive, parsed=(boolean)true /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0.GstGhostPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)high, pixel-aspect-ratio=(fraction)1/1, width=(int)352, height=(int)240, framerate=(fraction)0/1, interlace-mode=(string)progressive, parsed=(boolean)true /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstDucatiH264Dec:decoder.GstPad:src: caps = video/x-raw, format=(string)NV12, width=(int)512, 
height=(int)336, framerate=(fraction)0/1, pixel-aspect-ratio=(fraction)1/1, drm_mem=(boolean)true, max-ref-frames=(int)18 /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstVpe:vpe.GstPad:src: caps = video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0.GstGhostPad:src: caps = video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0.GstGhostPad:src.GstProxyPad:proxypad1: caps = video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstVpe:vpe.GstPad:sink: caps = video/x-raw, format=(string)NV12, width=(int)512, height=(int)336, framerate=(fraction)0/1, pixel-aspect-ratio=(fraction)1/1, drm_mem=(boolean)true, max-ref-frames=(int)18 /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstVpe:vpe.GstPad:src: caps = video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0.GstGhostPad:src: caps = video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 0:00:01.352406180 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 0/128, 0 /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0.GstGhostPad:src.GstProxyPad:proxypad1: caps = video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 0:00:01.363504811 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 1/128, 1 0:00:01.421508537 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 2/128, 2 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0: stats = application/x-rtp-session-stats, rtx-drop-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, source-stats=(GValueArray)< "application/x-rtp-source-stats\,\ ssrc\=\(uint\)1270896741\,\ internal\=\(boolean\)true\,\ validated\=\(boolean\)true\,\ received-bye\=\(boolean\)false\,\ is-csrc\=\(boolean\)false\,\ is-sender\=\(boolean\)false\,\ seqnum-base\=\(int\)-1\,\ clock-rate\=\(int\)-1\,\ octets-sent\=\(guint64\)0\,\ packets-sent\=\(guint64\)0\,\ octets-received\=\(guint64\)0\,\ packets-received\=\(guint64\)0\,\ bitrate\=\(guint64\)0\,\ packets-lost\=\(int\)0\,\ jitter\=\(uint\)0\,\ sent-pli-count\=\(uint\)0\,\ recv-pli-count\=\(uint\)0\,\ 
sent-fir-count\=\(uint\)0\,\ recv-fir-count\=\(uint\)0\,\ sent-nack-count\=\(uint\)0\,\ recv-nack-count\=\(uint\)0\,\ have-sr\=\(boolean\)false\,\ sr-ntptime\=\(guint64\)0\,\ sr-rtptime\=\(uint\)0\,\ sr-octet-count\=\(uint\)0\,\ sr-packet-count\=\(uint\)0\;", "application/x-rtp-source-stats\,\ ssrc\=\(uint\)4294945980\,\ internal\=\(boolean\)false\,\ validated\=\(boolean\)true\,\ received-bye\=\(boolean\)false\,\ is-csrc\=\(boolean\)false\,\ is-sender\=\(boolean\)true\,\ seqnum-base\=\(int\)-1\,\ clock-rate\=\(int\)90000\,\ rtp-from\=\(string\)192.168.1.10:20000\,\ octets-sent\=\(guint64\)0\,\ packets-sent\=\(guint64\)0\,\ octets-received\=\(guint64\)79680\,\ packets-received\=\(guint64\)73\,\ bitrate\=\(guint64\)0\,\ packets-lost\=\(int\)0\,\ jitter\=\(uint\)18\,\ sent-pli-count\=\(uint\)0\,\ recv-pli-count\=\(uint\)0\,\ sent-fir-count\=\(uint\)0\,\ recv-fir-count\=\(uint\)0\,\ sent-nack-count\=\(uint\)0\,\ recv-nack-count\=\(uint\)0\,\ have-sr\=\(boolean\)false\,\ sr-ntptime\=\(guint64\)0\,\ sr-rtptime\=\(uint\)0\,\ sr-octet-count\=\(uint\)0\,\ sr-packet-count\=\(uint\)0\,\ sent-rb\=\(boolean\)true\,\ sent-rb-fractionlost\=\(uint\)0\,\ sent-rb-packetslost\=\(int\)0\,\ sent-rb-exthighestseq\=\(uint\)5267\,\ sent-rb-jitter\=\(uint\)18\,\ sent-rb-lsr\=\(uint\)0\,\ sent-rb-dlsr\=\(uint\)0\,\ have-rb\=\(boolean\)false\,\ rb-fractionlost\=\(uint\)0\,\ rb-packetslost\=\(int\)0\,\ rb-exthighestseq\=\(uint\)0\,\ rb-jitter\=\(uint\)0\,\ rb-lsr\=\(uint\)0\,\ rb-dlsr\=\(uint\)0\,\ rb-round-trip\=\(uint\)0\;" >, rtx-count=(uint)0; /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager.GstGhostPad:send_rtcp_src_0: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSink:udpsink1.GstPad:sink: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager.GstGhostPad:send_rtcp_src_0.GstProxyPad:proxypad4: caps = application/x-rtcp 0:00:02.881509792 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 3/128, 0 0:00:02.883508294 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 4/128, 1 0:00:02.951488101 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 5/128, 0 0:00:02.953499290 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 6/128, 1 0:00:03.020488630 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 7/128, 0 0:00:03.022497543 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 8/128, 1 0:00:03.081519390 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 9/128, 0 0:00:03.083506830 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 10/128, 1 0:00:03.151484848 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 11/128, 0 0:00:03.153497664 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 12/128, 1 0:00:03.221484141 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 
13/128, 0 0:00:03.224473435 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 14/128, 1 0:00:03.269532283 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 15/128, 0 0:00:03.271498903 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 16/128, 1 0:00:03.351488590 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 17/128, 0 0:00:03.353489044 2360 0x21fdb0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 18/128, 1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0: stats = application/x-rtp-session-stats, rtx-drop-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, source-stats=(GValueArray)< "application/x-rtp-source-stats\,\ ssrc\=\(uint\)1270896741\,\ internal\=\(boolean\)true\,\ validated\=\(boolean\)true\,\ received-bye\=\(boolean\)false\,\ is-csrc\=\(boolean\)false\,\ is-sender\=\(boolean\)false\,\ seqnum-base\=\(int\)-1\,\ clock-rate\=\(int\)-1\,\ octets-sent\=\(guint64\)0\,\ packets-sent\=\(guint64\)0\,\ octets-received\=\(guint64\)0\,\ packets-received\=\(guint64\)0\,\ bitrate\=\(guint64\)0\,\ packets-lost\=\(int\)0\,\ jitter\=\(uint\)0\,\ sent-pli-count\=\(uint\)0\,\ recv-pli-count\=\(uint\)0\,\ sent-fir-count\=\(uint\)0\,\ recv-fir-count\=\(uint\)0\,\ sent-nack-count\=\(uint\)0\,\ recv-nack-count\=\(uint\)0\,\ have-sr\=\(boolean\)false\,\ sr-ntptime\=\(guint64\)0\,\ sr-rtptime\=\(uint\)0\,\ sr-octet-count\=\(uint\)0\,\ sr-packet-count\=\(uint\)0\;", "application/x-rtp-source-stats\,\ ssrc\=\(uint\)4294945980\,\ internal\=\(boolean\)false\,\ validated\=\(boolean\)true\,\ received-bye\=\(boolean\)false\,\ is-csrc\=\(boolean\)false\,\ is-sender\=\(boolean\)true\,\ seqnum-base\=\(int\)-1\,\ clock-rate\=\(int\)90000\,\ rtp-from\=\(string\)192.168.1.10:20000\,\ octets-sent\=\(guint64\)0\,\ packets-sent\=\(guint64\)0\,\ octets-received\=\(guint64\)312247\,\ packets-received\=\(guint64\)283\,\ bitrate\=\(guint64\)761454\,\ packets-lost\=\(int\)0\,\ jitter\=\(uint\)17\,\ sent-pli-count\=\(uint\)0\,\ recv-pli-count\=\(uint\)0\,\ sent-fir-count\=\(uint\)0\,\ recv-fir-count\=\(uint\)0\,\ sent-nack-count\=\(uint\)0\,\ recv-nack-count\=\(uint\)0\,\ have-sr\=\(boolean\)false\,\ sr-ntptime\=\(guint64\)0\,\ sr-rtptime\=\(uint\)0\,\ sr-octet-count\=\(uint\)0\,\ sr-packet-count\=\(uint\)0\,\ sent-rb\=\(boolean\)true\,\ sent-rb-fractionlost\=\(uint\)0\,\ sent-rb-packetslost\=\(int\)0\,\ sent-rb-exthighestseq\=\(uint\)5267\,\ sent-rb-jitter\=\(uint\)18\,\ sent-rb-lsr\=\(uint\)0\,\ sent-rb-dlsr\=\(uint\)0\,\ have-rb\=\(boolean\)false\,\ rb-fractionlost\=\(uint\)0\,\ rb-packetslost\=\(int\)0\,\ rb-exthighestseq\=\(uint\)0\,\ rb-jitter\=\(uint\)0\,\ rb-lsr\=\(uint\)0\,\ rb-dlsr\=\(uint\)0\,\ rb-round-trip\=\(uint\)0\;" >, rtx-count=(uint)0; /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstFunnel:funnel1.GstPad:src: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0.GstPad:sync_src: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:rtcp_sink: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0.GstPad:recv_rtcp_sink: caps = application/x-rtcp 
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:rtcp_src_4294945980: caps = application/x-rtcp, ssrc=(uint)4294945980 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink_rtcp: caps = application/x-rtcp, ssrc=(uint)4294945980
Setting pipeline to PAUSED ... Pipeline is live and does not need PREROLL ... Progress: (open) Opening Stream Progress: (connect) Connecting to rtsp://888888:888888@192.168.1.10:554/cam/realmonitor?channel=1&subtype=1 --live --fps 25 Progress: (open) Retrieving server options Progress: (open) Retrieving media info Progress: (request) SETUP stream 0 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: latency = 2000 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: ntp-sync = false /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: rfc7273-sync = false /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: ntp-time-source = NTP time based on realtime clock /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: drop-on-latency = false /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: max-rtcp-rtp-time-diff = 1000 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: buffer-mode = Slave receiver to sender clock /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1: timeout = 5000000000 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc2: caps = application/x-rtcp Progress: (open) Opened Stream Setting pipeline to PLAYING ... New clock: GstSystemClock /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager: buffer-mode = Slave receiver to sender clock Progress: (request) Sending PLAY request Progress: (request) Sending PLAY request /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 Progress: (request) Sent PLAY request /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1.GstPad:src: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager.GstGhostPad:recv_rtp_sink_0.GstProxyPad:proxypad2: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 
/GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstFunnel:funnel0.GstFunnelPad:funnelpad0: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager.GstGhostPad:recv_rtp_sink_0: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc2.GstPad:src: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager.GstGhostPad:recv_rtcp_sink_0.GstProxyPad:proxypad3: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstFunnel:funnel1.GstFunnelPad:funnelpad1: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager.GstGhostPad:recv_rtcp_sink_0: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1: timeout = 0 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstFunnel:funnel0.GstPad:src: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0.GstPad:recv_rtp_src: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0.GstPad:recv_rtp_sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, 
packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpPtDemux:rtpptdemux0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0164001effe100146764001eace8161fb011000003001e000007080401000468ee3cb0, level=(string)3, profile=(string)high /GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0164001effe100146764001eace8161fb011000003001e000007080401000468ee3cb0, level=(string)3, profile=(string)high /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0.GstGhostPad:recv_rtp_src_0_4294945980_96.GstProxyPad:proxypad6: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, 
npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager.GstGhostPad:recv_rtp_src_0_4294945980_96.GstProxyPad:proxypad5: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)64001E, sprop-parameter-sets=(string)"Z2QAHqzoFh+wEQAAAwAeAAAHCAQ\=\,aO48sA\=\=", a-packetization-supported=(string)DH, a-recvonly=(string)"", ssrc=(uint)4294945980, clock-base=(uint)706601032, seqnum-base=(uint)5607, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1 /GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)high, pixel-aspect-ratio=(fraction)1/1, width=(int)352, height=(int)240, framerate=(fraction)0/1, interlace-mode=(string)progressive, parsed=(boolean)true /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)high, pixel-aspect-ratio=(fraction)1/1, width=(int)352, height=(int)240, framerate=(fraction)0/1, interlace-mode=(string)progressive, parsed=(boolean)true /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstDucatiH264Dec:decoder.GstPad:src: caps = video/x-raw, format=(string)NV12, width=(int)512, height=(int)336, framerate=(fraction)0/1, pixel-aspect-ratio=(fraction)1/1, drm_mem=(boolean)true (gst-launch-1.0:2374): GStreamer-CRITICAL **: gst_caps_get_structure: assertion 'index < GST_CAPS_LEN (caps)' failed (gst-launch-1.0:2374): GStreamer-CRITICAL **: gst_structure_get_string: assertion 'structure != NULL' failed /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstVpe:vpe.GstPad:src: caps = video/x-raw, format=(string)NV12, width=(int)512, height=(int)336, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstVpe:vpe.GstPad:sink: caps = video/x-raw, format=(string)NV12, width=(int)512, height=(int)336, framerate=(fraction)0/1, pixel-aspect-ratio=(fraction)1/1, drm_mem=(boolean)true /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstDucatiH264Dec:decoder.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)high, pixel-aspect-ratio=(fraction)1/1, width=(int)352, height=(int)240, framerate=(fraction)0/1, interlace-mode=(string)progressive, parsed=(boolean)true /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0.GstGhostPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3, profile=(string)high, pixel-aspect-ratio=(fraction)1/1, width=(int)352, height=(int)240, framerate=(fraction)0/1, interlace-mode=(string)progressive, parsed=(boolean)true /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstDucatiH264Dec:decoder.GstPad:src: caps = video/x-raw, format=(string)NV12, width=(int)512, height=(int)336, framerate=(fraction)0/1, pixel-aspect-ratio=(fraction)1/1, drm_mem=(boolean)true, max-ref-frames=(int)18 (gst-launch-1.0:2374): GStreamer-CRITICAL **: gst_caps_get_structure: assertion 'index < GST_CAPS_LEN (caps)' failed (gst-launch-1.0:2374): GStreamer-CRITICAL **: gst_structure_get_string: assertion 'structure != NULL' failed 
/GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstVpe:vpe.GstPad:src: caps = video/x-raw, format=(string)NV12, width=(int)512, height=(int)336, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstVpe:vpe.GstPad:sink: caps = video/x-raw, format=(string)NV12, width=(int)512, height=(int)336, framerate=(fraction)0/1, pixel-aspect-ratio=(fraction)1/1, drm_mem=(boolean)true, max-ref-frames=(int)18 (gst-launch-1.0:2374): GStreamer-CRITICAL **: gst_caps_get_structure: assertion 'index < GST_CAPS_LEN (caps)' failed (gst-launch-1.0:2374): GStreamer-CRITICAL **: gst_structure_get_string: assertion 'structure != NULL' failed 0:00:00.913829547 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 0/128, 0 /GstPipeline:pipeline0/GstDucatiH264decVpe:ducatih264decvpe0/GstVpe:vpe.GstPad:src: caps = video/x-raw, format=(string)NV12, width=(int)352, height=(int)240, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1 0:00:00.933971700 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 1/128, 0 0:00:00.984985389 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 2/128, 0 0:00:01.004976101 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 3/128, 0 0:00:01.042979844 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 4/128, 0 0:00:01.074989058 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 5/128, 0 0:00:01.113972533 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 6/128, 0 0:00:01.143975437 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 7/128, 0 0:00:01.185971630 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 8/128, 0 0:00:01.204978544 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 9/128, 0 0:00:01.243969827 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 10/128, 0 0:00:01.274980115 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 11/128, 0 0:00:01.312968243 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 12/128, 0 0:00:01.334975513 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 13/128, 0 0:00:01.385974561 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 14/128, 0 0:00:01.404975294 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 15/128, 0 0:00:01.442979038 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 16/128, 0 0:00:01.467025964 2374 0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 17/128, 0 0:00:01.512976054 2374 
0xb59016f0 WARN vpe gstvpebufferpool.c:471:gst_vpe_buffer_pool_import: Allocating a new input buffer index: 18/128, 0 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0: stats = application/x-rtp-session-stats, rtx-drop-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, source-stats=(GValueArray)< "application/x-rtp-source-stats\,\ ssrc\=\(uint\)1567166861\,\ internal\=\(boolean\)true\,\ validated\=\(boolean\)true\,\ received-bye\=\(boolean\)false\,\ is-csrc\=\(boolean\)false\,\ is-sender\=\(boolean\)false\,\ seqnum-base\=\(int\)-1\,\ clock-rate\=\(int\)-1\,\ octets-sent\=\(guint64\)0\,\ packets-sent\=\(guint64\)0\,\ octets-received\=\(guint64\)0\,\ packets-received\=\(guint64\)0\,\ bitrate\=\(guint64\)0\,\ packets-lost\=\(int\)0\,\ jitter\=\(uint\)0\,\ sent-pli-count\=\(uint\)0\,\ recv-pli-count\=\(uint\)0\,\ sent-fir-count\=\(uint\)0\,\ recv-fir-count\=\(uint\)0\,\ sent-nack-count\=\(uint\)0\,\ recv-nack-count\=\(uint\)0\,\ have-sr\=\(boolean\)false\,\ sr-ntptime\=\(guint64\)0\,\ sr-rtptime\=\(uint\)0\,\ sr-octet-count\=\(uint\)0\,\ sr-packet-count\=\(uint\)0\;", "application/x-rtp-source-stats\,\ ssrc\=\(uint\)4294945980\,\ internal\=\(boolean\)false\,\ validated\=\(boolean\)true\,\ received-bye\=\(boolean\)false\,\ is-csrc\=\(boolean\)false\,\ is-sender\=\(boolean\)true\,\ seqnum-base\=\(int\)-1\,\ clock-rate\=\(int\)90000\,\ rtp-from\=\(string\)192.168.1.10:20000\,\ octets-sent\=\(guint64\)0\,\ packets-sent\=\(guint64\)0\,\ octets-received\=\(guint64\)206610\,\ packets-received\=\(guint64\)179\,\ bitrate\=\(guint64\)742435\,\ packets-lost\=\(int\)0\,\ jitter\=\(uint\)19\,\ sent-pli-count\=\(uint\)0\,\ recv-pli-count\=\(uint\)0\,\ sent-fir-count\=\(uint\)0\,\ recv-fir-count\=\(uint\)0\,\ sent-nack-count\=\(uint\)0\,\ recv-nack-count\=\(uint\)0\,\ have-sr\=\(boolean\)false\,\ sr-ntptime\=\(guint64\)0\,\ sr-rtptime\=\(uint\)0\,\ sr-octet-count\=\(uint\)0\,\ sr-packet-count\=\(uint\)0\,\ sent-rb\=\(boolean\)true\,\ sent-rb-fractionlost\=\(uint\)0\,\ sent-rb-packetslost\=\(int\)0\,\ sent-rb-exthighestseq\=\(uint\)5786\,\ sent-rb-jitter\=\(uint\)19\,\ sent-rb-lsr\=\(uint\)0\,\ sent-rb-dlsr\=\(uint\)0\,\ have-rb\=\(boolean\)false\,\ rb-fractionlost\=\(uint\)0\,\ rb-packetslost\=\(int\)0\,\ rb-exthighestseq\=\(uint\)0\,\ rb-jitter\=\(uint\)0\,\ rb-lsr\=\(uint\)0\,\ rb-dlsr\=\(uint\)0\,\ rb-round-trip\=\(uint\)0\;" >, rtx-count=(uint)0; /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager.GstGhostPad:send_rtcp_src_0: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSink:udpsink1.GstPad:sink: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager.GstGhostPad:send_rtcp_src_0.GstProxyPad:proxypad4: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0: stats = application/x-rtp-session-stats, rtx-drop-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, source-stats=(GValueArray)< "application/x-rtp-source-stats\,\ ssrc\=\(uint\)1567166861\,\ internal\=\(boolean\)true\,\ validated\=\(boolean\)true\,\ received-bye\=\(boolean\)false\,\ is-csrc\=\(boolean\)false\,\ is-sender\=\(boolean\)false\,\ seqnum-base\=\(int\)-1\,\ clock-rate\=\(int\)-1\,\ octets-sent\=\(guint64\)0\,\ packets-sent\=\(guint64\)0\,\ octets-received\=\(guint64\)0\,\ packets-received\=\(guint64\)0\,\ 
bitrate\=\(guint64\)0\,\ packets-lost\=\(int\)0\,\ jitter\=\(uint\)0\,\ sent-pli-count\=\(uint\)0\,\ recv-pli-count\=\(uint\)0\,\ sent-fir-count\=\(uint\)0\,\ recv-fir-count\=\(uint\)0\,\ sent-nack-count\=\(uint\)0\,\ recv-nack-count\=\(uint\)0\,\ have-sr\=\(boolean\)false\,\ sr-ntptime\=\(guint64\)0\,\ sr-rtptime\=\(uint\)0\,\ sr-octet-count\=\(uint\)0\,\ sr-packet-count\=\(uint\)0\;", "application/x-rtp-source-stats\,\ ssrc\=\(uint\)4294945980\,\ internal\=\(boolean\)false\,\ validated\=\(boolean\)true\,\ received-bye\=\(boolean\)false\,\ is-csrc\=\(boolean\)false\,\ is-sender\=\(boolean\)true\,\ seqnum-base\=\(int\)-1\,\ clock-rate\=\(int\)90000\,\ rtp-from\=\(string\)192.168.1.10:20000\,\ octets-sent\=\(guint64\)0\,\ packets-sent\=\(guint64\)0\,\ octets-received\=\(guint64\)443190\,\ packets-received\=\(guint64\)383\,\ bitrate\=\(guint64\)750584\,\ packets-lost\=\(int\)0\,\ jitter\=\(uint\)21\,\ sent-pli-count\=\(uint\)0\,\ recv-pli-count\=\(uint\)0\,\ sent-fir-count\=\(uint\)0\,\ recv-fir-count\=\(uint\)0\,\ sent-nack-count\=\(uint\)0\,\ recv-nack-count\=\(uint\)0\,\ have-sr\=\(boolean\)false\,\ sr-ntptime\=\(guint64\)0\,\ sr-rtptime\=\(uint\)0\,\ sr-octet-count\=\(uint\)0\,\ sr-packet-count\=\(uint\)0\,\ sent-rb\=\(boolean\)true\,\ sent-rb-fractionlost\=\(uint\)0\,\ sent-rb-packetslost\=\(int\)0\,\ sent-rb-exthighestseq\=\(uint\)5786\,\ sent-rb-jitter\=\(uint\)19\,\ sent-rb-lsr\=\(uint\)0\,\ sent-rb-dlsr\=\(uint\)0\,\ have-rb\=\(boolean\)false\,\ rb-fractionlost\=\(uint\)0\,\ rb-packetslost\=\(int\)0\,\ rb-exthighestseq\=\(uint\)0\,\ rb-jitter\=\(uint\)0\,\ rb-lsr\=\(uint\)0\,\ rb-dlsr\=\(uint\)0\,\ rb-round-trip\=\(uint\)0\;" >, rtx-count=(uint)0; /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstFunnel:funnel1.GstPad:src: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0.GstPad:sync_src: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:rtcp_sink: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0.GstPad:recv_rtcp_sink: caps = application/x-rtcp /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:rtcp_src_4294945980: caps = application/x-rtcp, ssrc=(uint)4294945980 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink_rtcp: caps = application/x-rtcp, ssrc=(uint)4294945980 /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0: stats = application/x-rtp-session-stats, rtx-drop-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, source-stats=(GValueArray)< "application/x-rtp-source-stats\,\ ssrc\=\(uint\)1567166861\,\ internal\=\(boolean\)true\,\ validated\=\(boolean\)true\,\ received-bye\=\(boolean\)false\,\ is-csrc\=\(boolean\)false\,\ is-sender\=\(boolean\)false\,\ seqnum-base\=\(int\)-1\,\ clock-rate\=\(int\)-1\,\ octets-sent\=\(guint64\)0\,\ packets-sent\=\(guint64\)0\,\ octets-received\=\(guint64\)0\,\ packets-received\=\(guint64\)0\,\ bitrate\=\(guint64\)0\,\ packets-lost\=\(int\)0\,\ jitter\=\(uint\)0\,\ sent-pli-count\=\(uint\)0\,\ recv-pli-count\=\(uint\)0\,\ sent-fir-count\=\(uint\)0\,\ recv-fir-count\=\(uint\)0\,\ sent-nack-count\=\(uint\)0\,\ recv-nack-count\=\(uint\)0\,\ have-sr\=\(boolean\)false\,\ sr-ntptime\=\(guint64\)0\,\ sr-rtptime\=\(uint\)0\,\ sr-octet-count\=\(uint\)0\,\ sr-packet-count\=\(uint\)0\;", 
"application/x-rtp-source-stats\,\ ssrc\=\(uint\)4294945980\,\ internal\=\(boolean\)false\,\ validated\=\(boolean\)true\,\ received-bye\=\(boolean\)false\,\ is-csrc\=\(boolean\)false\,\ is-sender\=\(boolean\)true\,\ seqnum-base\=\(int\)-1\,\ clock-rate\=\(int\)90000\,\ rtp-from\=\(string\)192.168.1.10:20000\,\ rtcp-from\=\(string\)192.168.1.10:20001\,\ octets-sent\=\(guint64\)0\,\ packets-sent\=\(guint64\)0\,\ octets-received\=\(guint64\)732566\,\ packets-received\=\(guint64\)632\,\ bitrate\=\(guint64\)759751\,\ packets-lost\=\(int\)0\,\ jitter\=\(uint\)24\,\ sent-pli-count\=\(uint\)0\,\ recv-pli-count\=\(uint\)0\,\ sent-fir-count\=\(uint\)0\,\ recv-fir-count\=\(uint\)0\,\ sent-nack-count\=\(uint\)0\,\ recv-nack-count\=\(uint\)0\,\ have-sr\=\(boolean\)true\,\ sr-ntptime\=\(guint64\)16086021561525469184\,\ sr-rtptime\=\(uint\)707029252\,\ sr-octet-count\=\(uint\)9509424\,\ sr-packet-count\=\(uint\)8100\,\ sent-rb\=\(boolean\)true\,\ sent-rb-fractionlost\=\(uint\)0\,\ sent-rb-packetslost\=\(int\)0\,\ sent-rb-exthighestseq\=\(uint\)6239\,\ sent-rb-jitter\=\(uint\)24\,\ sent-rb-lsr\=\(uint\)123863040\,\ sent-rb-dlsr\=\(uint\)197675\,\ have-rb\=\(boolean\)false\,\ rb-fractionlost\=\(uint\)0\,\ rb-packetslost\=\(int\)0\,\ rb-exthighestseq\=\(uint\)0\,\ rb-jitter\=\(uint\)0\,\ rb-lsr\=\(uint\)0\,\ rb-dlsr\=\(uint\)0\,\ rb-round-trip\=\(
Regards,
Kishan Patel.
Hello Margarita,
I have modified "grammar.y" file at path "tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0/1.12.2-r0/package/usr/src/debug/gstreamer1.0/1.12.2-r0/gstreamer-1.12.2/gst/parse/grammar.y" in function "static gint
gst_parse_perform_link (link_t *link, graph_t *graph)".
I am attaching that modified file.
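For readability, the relevant additions inside gst_parse_perform_link() boil down to roughly the following (a trimmed sketch of the printf instrumentation from the attached file, not the complete diff):

  if (!srcs || !sinks) {
#ifdef DEBUG_ENABLE
    /* which elements does this link refer to, and were explicit pad names parsed? */
    printf ("\nSRC=%s\nSINK=%s", GST_ELEMENT_NAME (src), GST_ELEMENT_NAME (sink));
    srcs ? printf ("\nsrcs:1\n") : printf ("\nsrcs:2\n");
    sinks ? printf ("\nsinks:1\n") : printf ("\nsinks:2\n");
#endif
    gboolean found_one = gst_element_link_pads_filtered (src,
        srcs ? (const gchar *) srcs->data : NULL,
        sink, sinks ? (const gchar *) sinks->data : NULL, link->caps);
#ifdef DEBUG_ENABLE
    /* did the caps-filtered link succeed, and was a caps filter attached to this link? */
    printf ("\nFound_one=%d\n", found_one);
    if (found_one && !link->all_pads)
      printf ("\nSuccess1\nCaps=%p\n", (void *) link->caps);
    /* similar prints mark the failure paths, e.g. "goto error3" */
#endif
    ...
  }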
Then execute "MACHINE=am57xx-evm bitbake -f -c compile gstreamer1.0".
Then I copied the three libraries (libgstreamer-1.0.so, libgstreamer-1.0.so.0, libgstreamer-1.0.so.0.1202.0) to the AM5728 board at "/usr/lib" (for example over the network, as sketched below).
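For example, a copy over the network could look like this (the board address is only a placeholder):
scp libgstreamer-1.0.so libgstreamer-1.0.so.0 libgstreamer-1.0.so.0.1202.0 root@<board-ip>:/usr/lib/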
Finally, I executed the commands below. Please check the corresponding logs:
Command:
--------------
gst-launch-1.0 --gst-debug=vpe:3 -v rtspsrc location=rtsp://888888:888888@192.168.1.4:554 ! rtph264depay ! h264parse ! ducatih264dec ! vpe ! 'video/x-raw, format=(string)BGR3, width=(int)800, height=(int)600' ! waylandsink
Log:
------
Please check the attached log file.
Command:
--------------
gst-launch-1.0 --gst-debug=vpe:3 -v rtspsrc location=rtsp://888888:888888@192.168.1.4:554 ! rtph264depay ! h264parse ! ducatih264dec ! vpe ! 'video/x-raw, format=(string)NV12, width=(int)800, height=(int)600' ! waylandsink
Log:
------
Please check the attached log file.
I have modified many things. Can you guide me on what might be going wrong when linking "vpe" and "waylandsink"?
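(For reference, the raw-video formats an element actually advertises on its pads can be listed on the target with gst-inspect-1.0, for example:
gst-inspect-1.0 waylandsink
which makes it easy to compare the caps filter in the pipeline against what the sink accepts.)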
Regards,
Kishan Patel.
SRC=rtspsrc0 SINK=rtph264depay0 SRCS=(null) srcs:2 sinks:2 Found_one=0 Success2
SRC=rtph264depay0 SINK=h264parse0 SRCS=(null) srcs:2 sinks:2 Found_one=1 Success1 Caps=0
SRC=h264parse0 SINK=ducatih264dec0 SRCS=(null) srcs:2 sinks:2 Found_one=1 Success1 Caps=0
SRC=ducatih264dec0 SINK=vpe0 SRCS=(null) srcs:2 sinks:2 Found_one=1 Success1 Caps=0
SRC=vpe0 SINK=waylandsink0 SRCS=(null) srcs:2 sinks:2 Found_one=1 Success1 Caps=155968
Setting pipeline to PAUSED ...
SRC=rtspsrc0 SINK=rtph264depay0 SRCS=(null) srcs:2 sinks:2 Found_one=0 Success2
SRC=rtph264depay0 SINK=h264parse0 SRCS=(null) srcs:2 sinks:2 Found_one=1 Success1 Caps=0
SRC=h264parse0 SINK=ducatih264dec0 SRCS=(null) srcs:2 sinks:2 Found_one=1 Success1 Caps=0
SRC=ducatih264dec0 SINK=vpe0 SRCS=(null) srcs:2 sinks:2 Found_one=1 Success1 Caps=0
SRC=vpe0 SINK=waylandsink0 SRCS=(null) srcs:2 sinks:2 Found_one=0 goto error3
WARNING: erroneous pipeline: 2_could not link vpe0 to waylandsink0, waylandsink0 can't handle caps video/x-raw, format=(string)BGR3, width=(int)800, height=(int)600
%{ #include "../gst_private.h" #include <glib-object.h> #include <glib.h> #include <stdio.h> #include <string.h> #include <stdlib.h> #include "../gst-i18n-lib.h" #include "../gstconfig.h" #include "../gstparse.h" #include "../gstinfo.h" #include "../gsterror.h" #include "../gststructure.h" #include "../gsturi.h" #include "../gstutils.h" #include "../gstvalue.h" #include "../gstchildproxy.h" #include "types.h" /* All error messages in this file are user-visible and need to be translated. * Don't start the message with a capital, and don't end them with a period, * as they will be presented inside a sentence/error. */ #define DEBUG_ENABLE #define YYERROR_VERBOSE 1 #define YYENABLE_NLS 0 #ifndef YYLTYPE_IS_TRIVIAL #define YYLTYPE_IS_TRIVIAL 0 #endif /******************************************************************************************* *** Tracing memory leaks *******************************************************************************************/ #ifdef __GST_PARSE_TRACE static guint __strings; static guint __links; static guint __chains; gchar * __gst_parse_strdup (gchar *org) { gchar *ret; __strings++; ret = g_strdup (org); /* g_print ("ALLOCATED STR (%3u): %p %s\n", __strings, ret, ret); */ return ret; } void __gst_parse_strfree (gchar *str) { if (str) { /* g_print ("FREEING STR (%3u): %p %s\n", __strings - 1, str, str); */ g_free (str); g_return_if_fail (__strings > 0); __strings--; } } link_t *__gst_parse_link_new (void) { link_t *ret; __links++; ret = g_slice_new0 (link_t); /* g_print ("ALLOCATED LINK (%3u): %p\n", __links, ret); */ return ret; } void __gst_parse_link_free (link_t *data) { if (data) { /* g_print ("FREEING LINK (%3u): %p\n", __links - 1, data); */ g_slice_free (link_t, data); g_return_if_fail (__links > 0); __links--; } } chain_t * __gst_parse_chain_new (void) { chain_t *ret; __chains++; ret = g_slice_new0 (chain_t); /* g_print ("@%p: ALLOCATED CHAIN (%3u):\n", ret, __chains); */ return ret; } void __gst_parse_chain_free (chain_t *data) { /* g_print ("@%p: FREEING CHAIN (%3u):\n", data, __chains - 1); */ g_slice_free (chain_t, data); g_return_if_fail (__chains > 0); __chains--; } #endif /* __GST_PARSE_TRACE */ /******************************************************************************************* *** define SET_ERROR macro/function *******************************************************************************************/ #ifdef G_HAVE_ISO_VARARGS # define SET_ERROR(error, type, ...) \ G_STMT_START { \ GST_CAT_ERROR (GST_CAT_PIPELINE, __VA_ARGS__); \ if ((error) && !*(error)) { \ g_set_error ((error), GST_PARSE_ERROR, (type), __VA_ARGS__); \ } \ } G_STMT_END #elif defined(G_HAVE_GNUC_VARARGS) # define SET_ERROR(error, type, args...) \ G_STMT_START { \ GST_CAT_ERROR (GST_CAT_PIPELINE, args ); \ if ((error) && !*(error)) { \ g_set_error ((error), GST_PARSE_ERROR, (type), args ); \ } \ } G_STMT_END #else static inline void SET_ERROR (GError **error, gint type, const char *format, ...) { if (error) { if (*error) { g_warning ("error while parsing"); } else { va_list varargs; char *string; va_start (varargs, format); string = g_strdup_vprintf (format, varargs); va_end (varargs); g_set_error (error, GST_PARSE_ERROR, type, string); g_free (string); } } } #endif /* G_HAVE_ISO_VARARGS */ /*** define YYPRINTF macro/function if we're debugging */ /* bison 1.35 calls this macro with side effects, we need to make sure the side effects work - crappy bison */ #ifndef GST_DISABLE_GST_DEBUG # define YYDEBUG 1 # ifdef G_HAVE_ISO_VARARGS /* # define YYFPRINTF(a, ...) 
GST_CAT_DEBUG (GST_CAT_PIPELINE, __VA_ARGS__) */ # define YYFPRINTF(a, ...) \ G_STMT_START { \ GST_CAT_LOG (GST_CAT_PIPELINE, __VA_ARGS__); \ } G_STMT_END # elif defined(G_HAVE_GNUC_VARARGS) # define YYFPRINTF(a, args...) \ G_STMT_START { \ GST_CAT_LOG (GST_CAT_PIPELINE, args); \ } G_STMT_END # else static inline void YYPRINTF(const char *format, ...) { va_list varargs; gchar *temp; va_start (varargs, format); temp = g_strdup_vprintf (format, varargs); GST_CAT_LOG (GST_CAT_PIPELINE, "%s", temp); g_free (temp); va_end (varargs); } # endif /* G_HAVE_ISO_VARARGS */ #endif /* GST_DISABLE_GST_DEBUG */ /* * include headers generated by bison & flex, after defining (or not defining) YYDEBUG */ #include "grammar.tab.h" #include "parse_lex.h" /******************************************************************************************* *** report missing elements/bins/.. *******************************************************************************************/ static void add_missing_element(graph_t *graph,gchar *name){ if ((graph)->ctx){ (graph)->ctx->missing_elements = g_list_append ((graph)->ctx->missing_elements, g_strdup (name)); } } /******************************************************************************************* *** helpers for pipeline-setup *******************************************************************************************/ #define TRY_SETUP_LINK(l) G_STMT_START { \ if( (!(l)->src.element) && (!(l)->src.name) ){ \ SET_ERROR (graph->error, GST_PARSE_ERROR_LINK, _("link has no source [sink=%s@%p]"), \ (l)->sink.name ? (l)->sink.name : "", \ (l)->sink.element); \ gst_parse_free_link (l); \ }else if( (!(l)->sink.element) && (!(l)->sink.name) ){ \ SET_ERROR (graph->error, GST_PARSE_ERROR_LINK, _("link has no sink [source=%s@%p]"), \ (l)->src.name ? (l)->src.name : "", \ (l)->src.element); \ gst_parse_free_link (l); \ }else{ \ graph->links = g_slist_append (graph->links, l ); \ } \ } G_STMT_END typedef struct { gchar *src_pad; gchar *sink_pad; GstElement *sink; GstCaps *caps; gulong pad_added_signal_id, no_more_pads_signal_id; gboolean all_pads; } DelayedLink; typedef struct { gchar *name; gchar *value_str; gulong signal_id; } DelayedSet; static int gst_resolve_reference(reference_t *rr, GstElement *pipeline){ GstBin *bin; if(rr->element) return 0; /* already resolved! */ if(!rr->name) return -2; /* no chance! */ if (GST_IS_BIN (pipeline)){ bin = GST_BIN (pipeline); rr->element = gst_bin_get_by_name_recurse_up (bin, rr->name); } else { rr->element = strcmp (GST_ELEMENT_NAME (pipeline), rr->name) == 0 ? 
gst_object_ref(pipeline) : NULL; } if(rr->element) return 0; /* resolved */ else return -1; /* not found */ } static void gst_parse_free_delayed_set (DelayedSet *set) { g_free(set->name); g_free(set->value_str); g_slice_free(DelayedSet, set); } static void gst_parse_new_child(GstChildProxy *child_proxy, GObject *object, const gchar * name, gpointer data); static void gst_parse_add_delayed_set (GstElement *element, gchar *name, gchar *value_str) { DelayedSet *data = g_slice_new0 (DelayedSet); GST_CAT_LOG_OBJECT (GST_CAT_PIPELINE, element, "delaying property set %s to %s", name, value_str); data->name = g_strdup(name); data->value_str = g_strdup(value_str); data->signal_id = g_signal_connect_data(element, "child-added", G_CALLBACK (gst_parse_new_child), data, (GClosureNotify) gst_parse_free_delayed_set, (GConnectFlags) 0); /* FIXME: we would need to listen on all intermediate bins too */ if (GST_IS_BIN (element)) { gchar **names, **current; GstElement *parent, *child; current = names = g_strsplit (name, "::", -1); parent = gst_bin_get_by_name (GST_BIN_CAST (element), current[0]); current++; while (parent && current[0]) { child = gst_bin_get_by_name (GST_BIN (parent), current[0]); if (!child && current[1]) { char *sub_name = g_strjoinv ("::", ¤t[0]); gst_parse_add_delayed_set(parent, sub_name, value_str); g_free (sub_name); } gst_object_unref (parent); parent = child; current++; } if (parent) gst_object_unref (parent); g_strfreev (names); } } static void gst_parse_new_child(GstChildProxy *child_proxy, GObject *object, const gchar * name, gpointer data) { DelayedSet *set = (DelayedSet *) data; GParamSpec *pspec; GValue v = { 0, }; GObject *target = NULL; GType value_type; GST_CAT_LOG_OBJECT (GST_CAT_PIPELINE, child_proxy, "new child %s, checking property %s", name, set->name); if (gst_child_proxy_lookup (child_proxy, set->name, &target, &pspec)) { gboolean got_value = FALSE; value_type = pspec->value_type; GST_CAT_LOG_OBJECT (GST_CAT_PIPELINE, child_proxy, "parsing delayed property %s as a %s from %s", pspec->name, g_type_name (value_type), set->value_str); g_value_init (&v, value_type); if (gst_value_deserialize (&v, set->value_str)) got_value = TRUE; else if (g_type_is_a (value_type, GST_TYPE_ELEMENT)) { GstElement *bin; bin = gst_parse_bin_from_description_full (set->value_str, TRUE, NULL, GST_PARSE_FLAG_NO_SINGLE_ELEMENT_BINS | GST_PARSE_FLAG_PLACE_IN_BIN, NULL); if (bin) { g_value_set_object (&v, bin); got_value = TRUE; } } g_signal_handler_disconnect (child_proxy, set->signal_id); if (!got_value) { #ifdef DEBUG_ENABLE printf("\ngoto error1\n"); #endif goto error; } g_object_set_property (target, pspec->name, &v); } else { const gchar *obj_name = GST_OBJECT_NAME(object); gint len = strlen (obj_name); /* do a delayed set */ if ((strlen (set->name) > (len + 2)) && !strncmp (set->name, obj_name, len) && !strncmp (&set->name[len], "::", 2)) { gst_parse_add_delayed_set (GST_ELEMENT(child_proxy), set->name, set->value_str); } } out: if (G_IS_VALUE (&v)) g_value_unset (&v); if (target) g_object_unref (target); return; error: GST_CAT_ERROR (GST_CAT_PIPELINE, "could not set property \"%s\" in %" GST_PTR_FORMAT, pspec->name, target); goto out; } static void gst_parse_element_set (gchar *value, GstElement *element, graph_t *graph) { GParamSpec *pspec = NULL; gchar *pos = value; GValue v = { 0, }; GObject *target = NULL; GType value_type; /* do nothing if assignment is for missing element */ if (element == NULL) goto out; /* parse the string, so the property name is null-terminated and pos points to 
the beginning of the value */ while (!g_ascii_isspace (*pos) && (*pos != '=')) pos++; if (*pos == '=') { *pos = '\0'; } else { *pos = '\0'; pos++; while (g_ascii_isspace (*pos)) pos++; } pos++; while (g_ascii_isspace (*pos)) pos++; /* truncate a string if it is delimited with double quotes */ if (*pos == '"' && pos[strlen (pos) - 1] == '"') { pos++; pos[strlen (pos) - 1] = '\0'; } gst_parse_unescape (pos); if (GST_IS_CHILD_PROXY (element)) { if (!gst_child_proxy_lookup (GST_CHILD_PROXY (element), value, &target, &pspec)) { /* do a delayed set */ gst_parse_add_delayed_set (element, value, pos); } } else { pspec = g_object_class_find_property (G_OBJECT_GET_CLASS (element), value); if (pspec != NULL) { target = g_object_ref (element); GST_CAT_LOG_OBJECT (GST_CAT_PIPELINE, target, "found %s property", value); } else { SET_ERROR (graph->error, GST_PARSE_ERROR_NO_SUCH_PROPERTY, \ _("no property \"%s\" in element \"%s\""), value, \ GST_ELEMENT_NAME (element)); } } if (pspec != NULL && target != NULL) { gboolean got_value = FALSE; value_type = pspec->value_type; GST_CAT_LOG_OBJECT (GST_CAT_PIPELINE, element, "parsing property %s as a %s", pspec->name, g_type_name (value_type)); g_value_init (&v, value_type); if (gst_value_deserialize (&v, pos)) got_value = TRUE; else if (g_type_is_a (value_type, GST_TYPE_ELEMENT)) { GstElement *bin; bin = gst_parse_bin_from_description_full (pos, TRUE, NULL, GST_PARSE_FLAG_NO_SINGLE_ELEMENT_BINS | GST_PARSE_FLAG_PLACE_IN_BIN, NULL); if (bin) { g_value_set_object (&v, bin); got_value = TRUE; } } if (!got_value) { #ifdef DEBUG_ENABLE printf("\ngoto error2\n"); #endif goto error; } g_object_set_property (target, pspec->name, &v); } out: gst_parse_strfree (value); if (G_IS_VALUE (&v)) g_value_unset (&v); if (target) g_object_unref (target); return; error: SET_ERROR (graph->error, GST_PARSE_ERROR_COULD_NOT_SET_PROPERTY, _("could not set property \"%s\" in element \"%s\" to \"%s\""), value, GST_ELEMENT_NAME (element), pos); goto out; } static void gst_parse_free_reference (reference_t *rr) { if(rr->element) gst_object_unref(rr->element); gst_parse_strfree (rr->name); g_slist_foreach (rr->pads, (GFunc) gst_parse_strfree, NULL); g_slist_free (rr->pads); } static void gst_parse_free_link (link_t *link) { #ifdef DEBUG_ENABLE //printf("\nsrc=%s\nsink=%s\n",GST_ELEMENT_NAME (src), GST_ELEMENT_NAME (sink)); #endif gst_parse_free_reference (&(link->src)); gst_parse_free_reference (&(link->sink)); if (link->caps) gst_caps_unref (link->caps); gst_parse_link_free (link); } static void gst_parse_free_chain (chain_t *ch) { GSList *walk; gst_parse_free_reference (&(ch->first)); gst_parse_free_reference (&(ch->last)); for(walk=ch->elements;walk;walk=walk->next) gst_object_unref (walk->data); g_slist_free (ch->elements); gst_parse_chain_free (ch); } static void gst_parse_free_delayed_link (DelayedLink *link) { g_free (link->src_pad); g_free (link->sink_pad); if (link->caps) gst_caps_unref (link->caps); g_slice_free (DelayedLink, link); } #define PRETTY_PAD_NAME_FMT "%s %s of %s named %s" #define PRETTY_PAD_NAME_ARGS(elem, pad_name) \ (pad_name ? "pad " : "some"), (pad_name ? 
pad_name : "pad"), \ G_OBJECT_TYPE_NAME(elem), GST_STR_NULL (GST_ELEMENT_NAME (elem)) static void gst_parse_no_more_pads (GstElement *src, gpointer data) { DelayedLink *link = data; /* Don't warn for all-pads links, as we expect those to * still be active at no-more-pads */ if (!link->all_pads) { GST_ELEMENT_WARNING(src, PARSE, DELAYED_LINK, (_("Delayed linking failed.")), ("failed delayed linking " PRETTY_PAD_NAME_FMT " to " PRETTY_PAD_NAME_FMT, PRETTY_PAD_NAME_ARGS (src, link->src_pad), PRETTY_PAD_NAME_ARGS (link->sink, link->sink_pad))); } /* we keep the handlers connected, so that in case an element still adds a pad * despite no-more-pads, we will consider it for pending delayed links */ } static void gst_parse_found_pad (GstElement *src, GstPad *pad, gpointer data) { DelayedLink *link = data; GST_CAT_INFO (GST_CAT_PIPELINE, "trying delayed linking %s " PRETTY_PAD_NAME_FMT " to " PRETTY_PAD_NAME_FMT, link->all_pads ? "all pads" : "one pad", PRETTY_PAD_NAME_ARGS (src, link->src_pad), PRETTY_PAD_NAME_ARGS (link->sink, link->sink_pad)); if (gst_element_link_pads_filtered (src, link->src_pad, link->sink, link->sink_pad, link->caps)) { /* do this here, we don't want to get any problems later on when * unlocking states */ GST_CAT_DEBUG (GST_CAT_PIPELINE, "delayed linking %s " PRETTY_PAD_NAME_FMT " to " PRETTY_PAD_NAME_FMT " worked", link->all_pads ? "all pads" : "one pad", PRETTY_PAD_NAME_ARGS (src, link->src_pad), PRETTY_PAD_NAME_ARGS (link->sink, link->sink_pad)); g_signal_handler_disconnect (src, link->no_more_pads_signal_id); /* releases 'link' */ if (!link->all_pads) g_signal_handler_disconnect (src, link->pad_added_signal_id); } } /* both padnames and the caps may be NULL */ static gboolean gst_parse_perform_delayed_link (GstElement *src, const gchar *src_pad, GstElement *sink, const gchar *sink_pad, GstCaps *caps, gboolean all_pads) { GList *templs = gst_element_class_get_pad_template_list ( GST_ELEMENT_GET_CLASS (src)); for (; templs; templs = templs->next) { GstPadTemplate *templ = (GstPadTemplate *) templs->data; if ((GST_PAD_TEMPLATE_DIRECTION (templ) == GST_PAD_SRC) && (GST_PAD_TEMPLATE_PRESENCE(templ) == GST_PAD_SOMETIMES)) { DelayedLink *data = g_slice_new (DelayedLink); data->all_pads = all_pads; /* TODO: maybe we should check if src_pad matches this template's names */ GST_CAT_DEBUG (GST_CAT_PIPELINE, "trying delayed link " PRETTY_PAD_NAME_FMT " to " PRETTY_PAD_NAME_FMT, PRETTY_PAD_NAME_ARGS (src, src_pad), PRETTY_PAD_NAME_ARGS (sink, sink_pad)); data->src_pad = g_strdup (src_pad); data->sink = sink; data->sink_pad = g_strdup (sink_pad); if (caps) { data->caps = gst_caps_copy (caps); } else { data->caps = NULL; } data->pad_added_signal_id = g_signal_connect_data (src, "pad-added", G_CALLBACK (gst_parse_found_pad), data, (GClosureNotify) gst_parse_free_delayed_link, (GConnectFlags) 0); data->no_more_pads_signal_id = g_signal_connect (src, "no-more-pads", G_CALLBACK (gst_parse_no_more_pads), data); return TRUE; } } return FALSE; } static gboolean gst_parse_element_can_do_caps (GstElement * e, GstPadDirection dir, GstCaps * link_caps) { gboolean can_do = FALSE, done = FALSE; GstIterator *it; it = (dir == GST_PAD_SRC) ? 
gst_element_iterate_src_pads (e) : gst_element_iterate_sink_pads (e); while (!done && !can_do) { GValue v = G_VALUE_INIT; GstPad *pad; GstCaps *caps; switch (gst_iterator_next (it, &v)) { case GST_ITERATOR_OK: pad = g_value_get_object (&v); caps = gst_pad_get_current_caps (pad); if (caps == NULL) caps = gst_pad_query_caps (pad, NULL); can_do = gst_caps_can_intersect (caps, link_caps); GST_TRACE ("can_do: %d for %" GST_PTR_FORMAT " and %" GST_PTR_FORMAT, can_do, caps, link_caps); gst_caps_unref (caps); g_value_unset (&v); break; case GST_ITERATOR_DONE: case GST_ITERATOR_ERROR: done = TRUE; break; case GST_ITERATOR_RESYNC: gst_iterator_resync (it); break; } } gst_iterator_free (it); return can_do; } /* * performs a link and frees the struct. src and sink elements must be given * return values 0 - link performed * 1 - link delayed * <0 - error */ static gint gst_parse_perform_link (link_t *link, graph_t *graph) { GstElement *src = link->src.element; GstElement *sink = link->sink.element; GSList *srcs = link->src.pads; GSList *sinks = link->sink.pads; g_assert (GST_IS_ELEMENT (src)); g_assert (GST_IS_ELEMENT (sink)); GST_CAT_INFO (GST_CAT_PIPELINE, "linking " PRETTY_PAD_NAME_FMT " to " PRETTY_PAD_NAME_FMT " (%u/%u) with caps \"%" GST_PTR_FORMAT "\"", PRETTY_PAD_NAME_ARGS (src, link->src.name), PRETTY_PAD_NAME_ARGS (sink, link->sink.name), g_slist_length (srcs), g_slist_length (sinks), link->caps); if (!srcs || !sinks) { #ifdef DEBUG_ENABLE //printf("\nHere\n"); //printf("\nSRC=%d\n",src); //printf("\nSINK=%d\n",sink); printf("\nSRC=%s\nSINK=%s",GST_ELEMENT_NAME (src), GST_ELEMENT_NAME (sink)); printf("\nSRCS=%s\n",srcs); //->data); #endif srcs ? printf("\nsrcs:1\n") : printf("\nsrcs:2\n"); sinks ? printf("\nsinks:1\n") : printf("\nsinks:2\n"); gboolean found_one = gst_element_link_pads_filtered (src, srcs ? (const gchar *) srcs->data : NULL, sink, sinks ? (const gchar *) sinks->data : NULL, link->caps); #ifdef DEBUG_ENABLE printf("\nFound_one=%d\n",found_one); #endif if (found_one) { if (!link->all_pads) { #ifdef DEBUG_ENABLE printf("\nSuccess1\n"); printf("\nCaps=%x\n",link->caps); #endif goto success; /* Linked one, and not an all-pads link = we're done */ } /* Try and link more available pads */ while (gst_element_link_pads_filtered (src, srcs ? (const gchar *) srcs->data : NULL, sink, sinks ? (const gchar *) sinks->data : NULL, link->caps)); } /* We either didn't find any static pads, or this is a all-pads link, * in which case watch for future pads and link those. Not a failure * in the all-pads case if there's no sometimes pads to watch */ if (gst_parse_perform_delayed_link (src, srcs ? (const gchar *) srcs->data : NULL, sink, sinks ? 
(const gchar *) sinks->data : NULL, link->caps, link->all_pads) || link->all_pads) { #ifdef DEBUG_ENABLE printf("\nSuccess2\n"); #endif goto success; } else { #ifdef DEBUG_ENABLE printf("\ngoto error3\n"); #endif goto error; } } if (g_slist_length (link->src.pads) != g_slist_length (link->sink.pads)) { #ifdef DEBUG_ENABLE printf("\ngoto error4\n"); #endif goto error; } while (srcs && sinks) { const gchar *src_pad = (const gchar *) srcs->data; const gchar *sink_pad = (const gchar *) sinks->data; srcs = g_slist_next (srcs); sinks = g_slist_next (sinks); if (gst_element_link_pads_filtered (src, src_pad, sink, sink_pad, link->caps)) { continue; } else { if (gst_parse_perform_delayed_link (src, src_pad, sink, sink_pad, link->caps, link->all_pads)) { #ifdef DEBUG_ENABLE printf("\nSuccess\n"); #endif continue; } else { #ifdef DEBUG_ENABLE printf("\nNot Success\n"); #endif goto error; } } } success: gst_parse_free_link (link); return 0; error: if (link->caps != NULL) { gboolean src_can_do_caps, sink_can_do_caps; gchar *caps_str = gst_caps_to_string (link->caps); src_can_do_caps = gst_parse_element_can_do_caps (src, GST_PAD_SRC, link->caps); sink_can_do_caps = gst_parse_element_can_do_caps (sink, GST_PAD_SINK, link->caps); if (!src_can_do_caps && sink_can_do_caps) { SET_ERROR (graph->error, GST_PARSE_ERROR_LINK, _("1_could not link %s to %s, %s can't handle caps %s"), GST_ELEMENT_NAME (src), GST_ELEMENT_NAME (sink), GST_ELEMENT_NAME (src), caps_str); } else if (src_can_do_caps && !sink_can_do_caps) { SET_ERROR (graph->error, GST_PARSE_ERROR_LINK, _("2_could not link %s to %s, %s can't handle caps %s"), GST_ELEMENT_NAME (src), GST_ELEMENT_NAME (sink), GST_ELEMENT_NAME (sink), caps_str); } else if (!src_can_do_caps && !sink_can_do_caps) { SET_ERROR (graph->error, GST_PARSE_ERROR_LINK, _("could not link %s to %s, neither element can handle caps %s"), GST_ELEMENT_NAME (src), GST_ELEMENT_NAME (sink), caps_str); } else { SET_ERROR (graph->error, GST_PARSE_ERROR_LINK, _("could not link %s to %s with caps %s"), GST_ELEMENT_NAME (src), GST_ELEMENT_NAME (sink), caps_str); } g_free (caps_str); } else { SET_ERROR (graph->error, GST_PARSE_ERROR_LINK, _("could not link %s to %s"), GST_ELEMENT_NAME (src), GST_ELEMENT_NAME (sink)); } gst_parse_free_link (link); return -1; } static int yyerror (void *scanner, graph_t *graph, const char *s); %} %union { gchar *ss; chain_t *cc; link_t *ll; reference_t rr; GstElement *ee; GSList *pp; graph_t *gg; } /* No grammar ambiguities expected, FAIL otherwise */ %expect 0 %token <ss> PARSE_URL %token <ss> IDENTIFIER %left <ss> REF PADREF BINREF %token <ss> ASSIGNMENT %token <ss> LINK %token <ss> LINK_ALL %type <ss> binopener %type <gg> graph %type <cc> chain bin chainlist openchain elementary %type <rr> reference %type <ll> link %type <ee> element %type <pp> morepads pads assignments %destructor { gst_parse_strfree ($$); } <ss> %destructor { if($$) gst_parse_free_chain($$); } <cc> %destructor { gst_parse_free_link ($$); } <ll> %destructor { gst_parse_free_reference(&($$));} <rr> %destructor { gst_object_unref ($$); } <ee> %destructor { GSList *walk; for(walk=$$;walk;walk=walk->next) gst_parse_strfree (walk->data); g_slist_free ($$); } <pp> %left '(' ')' %left ',' %right '.' %left '!' '=' ':' %lex-param { void *scanner } %parse-param { void *scanner } %parse-param { graph_t *graph } %pure-parser %start graph %% /************************************************************* * Grammar explanation: * _element_s are specified by an identifier of their type. 
* a name can be give in the optional property-assignments * coffeeelement * fakesrc name=john * identity silence=false name=frodo * (cont'd) **************************************************************/ element: IDENTIFIER { $$ = gst_element_factory_make ($1, NULL); if ($$ == NULL) { add_missing_element(graph, $1); SET_ERROR (graph->error, GST_PARSE_ERROR_NO_SUCH_ELEMENT, _("no element \"%s\""), $1); } gst_parse_strfree ($1); } | element ASSIGNMENT { gst_parse_element_set ($2, $1, graph); $$ = $1; } ; /************************************************************* * Grammar explanation: (cont'd) * a graph has (pure) _element_s, _bin_s and _link_s. * since bins are special elements, bins and elements can * be generalized as _elementary_. * The construction of _bin_s will be discussed later. * (cont'd) * **************************************************************/ elementary: element { $$ = gst_parse_chain_new (); /* g_print ("@%p: CHAINing elementary\n", $$); */ $$->first.element = $1? gst_object_ref($1) : NULL; $$->last.element = $1? gst_object_ref($1) : NULL; $$->first.name = $$->last.name = NULL; $$->first.pads = $$->last.pads = NULL; $$->elements = $1 ? g_slist_prepend (NULL, $1) : NULL; } | bin { $$=$1; } ; /************************************************************* * Grammar explanation: (cont'd) * a _chain_ is a list of _elementary_s that have _link_s inbetween * which are represented through infix-notation. * * fakesrc ! sometransformation ! fakesink * * every _link_ can be augmented with _pads_. * * coffeesrc .sound ! speakersink * multisrc .movie,ads ! .projector,smallscreen multisink * * and every _link_ can be setup to filter media-types * mediasrc ! audio/x-raw, signed=TRUE ! stereosink * * User HINT: * if the lexer does not recognize your media-type it * will make it an element name. that results in errors * like * NO SUCH ELEMENT: no element audio7x-raw * '7' vs. '/' in https://en.wikipedia.org/wiki/QWERTZ * * Parsing HINT: * in the parser we need to differ between chains that can * be extended by more elementaries (_openchain_) and others * that are syntactically closed (handled later in this file). * (e.g. fakesrc ! 
sinkreferencename.padname) **************************************************************/ chain: openchain { $$=$1; if($$->last.name){ SET_ERROR (graph->error, GST_PARSE_ERROR_SYNTAX, _("unexpected reference \"%s\" - ignoring"), $$->last.name); gst_parse_strfree($$->last.name); $$->last.name=NULL; } if($$->last.pads){ SET_ERROR (graph->error, GST_PARSE_ERROR_SYNTAX, _("unexpected pad-reference \"%s\" - ignoring"), (gchar*)$$->last.pads->data); g_slist_foreach ($$->last.pads, (GFunc) gst_parse_strfree, NULL); g_slist_free ($$->last.pads); $$->last.pads=NULL; } } ; openchain: elementary pads { $$=$1; $$->last.pads = g_slist_concat ($$->last.pads, $2); /* g_print ("@%p@%p: FKI elementary pads\n", $1, $$->last.pads); */ } | openchain link pads elementary pads { $2->src = $1->last; $2->sink = $4->first; $2->sink.pads = g_slist_concat ($3, $2->sink.pads); TRY_SETUP_LINK($2); $4->first = $1->first; $4->elements = g_slist_concat ($1->elements, $4->elements); gst_parse_chain_free($1); $$ = $4; $$->last.pads = g_slist_concat ($$->last.pads, $5); } ; link: LINK { $$ = gst_parse_link_new (); $$->all_pads = FALSE; if ($1) { $$->caps = gst_caps_from_string ($1); if ($$->caps == NULL) SET_ERROR (graph->error, GST_PARSE_ERROR_LINK, _("could not parse caps \"%s\""), $1); gst_parse_strfree ($1); } } | LINK_ALL { $$ = gst_parse_link_new (); $$->all_pads = TRUE; if ($1) { $$->caps = gst_caps_from_string ($1); if ($$->caps == NULL) SET_ERROR (graph->error, GST_PARSE_ERROR_LINK, _("could not parse caps \"%s\""), $1); gst_parse_strfree ($1); } } ; pads: /* NOP */ { $$ = NULL; } | PADREF morepads { $$ = $2; $$ = g_slist_prepend ($$, $1); } ; morepads: /* NOP */ { $$ = NULL; } | ',' IDENTIFIER morepads { $$ = g_slist_prepend ($3, $2); } ; /************************************************************* * Grammar explanation: (cont'd) * the first and last elements of a _chain_ can be give * as URL. This creates special elements that fit the URL. * * fakesrc ! http://fake-sink.org * http://somesource.org ! fakesink **************************************************************/ chain: openchain link PARSE_URL { GstElement *element = gst_element_make_from_uri (GST_URI_SINK, $3, NULL, NULL); /* FIXME: get and parse error properly */ if (!element) { SET_ERROR (graph->error, GST_PARSE_ERROR_NO_SUCH_ELEMENT, _("no sink element for URI \"%s\""), $3); } $$ = $1; $2->sink.element = element?gst_object_ref(element):NULL; $2->src = $1->last; TRY_SETUP_LINK($2); $$->last.element = NULL; $$->last.name = NULL; $$->last.pads = NULL; if(element) $$->elements = g_slist_append ($$->elements, element); g_free ($3); } ; openchain: PARSE_URL { GstElement *element = gst_element_make_from_uri (GST_URI_SRC, $1, NULL, NULL); /* FIXME: get and parse error properly */ if (!element) { SET_ERROR (graph->error, GST_PARSE_ERROR_NO_SUCH_ELEMENT, _("no source element for URI \"%s\""), $1); } $$ = gst_parse_chain_new (); /* g_print ("@%p: CHAINing srcURL\n", $$); */ $$->first.element = NULL; $$->first.name = NULL; $$->first.pads = NULL; $$->last.element = element ? gst_object_ref(element):NULL; $$->last.name = NULL; $$->last.pads = NULL; $$->elements = element ? g_slist_prepend (NULL, element) : NULL; g_free($1); } ; /************************************************************* * Grammar explanation: (cont'd) * the first and last elements of a _chain_ can be linked * to a named _reference_ (with optional pads). * * fakesrc ! nameOfSinkElement. * fakesrc ! nameOfSinkElement.Padname * fakesrc ! 
nameOfSinkElement.Padname, anotherPad * nameOfSource.Padname ! fakesink **************************************************************/ chain: openchain link reference { $$ = $1; $2->sink= $3; $2->src = $1->last; TRY_SETUP_LINK($2); $$->last.element = NULL; $$->last.name = NULL; $$->last.pads = NULL; } ; openchain: reference { $$ = gst_parse_chain_new (); $$->last=$1; $$->first.element = NULL; $$->first.name = NULL; $$->first.pads = NULL; $$->elements = NULL; } ; reference: REF morepads { gchar *padname = $1; GSList *pads = $2; if (padname) { while (*padname != '.') padname++; *padname = '\0'; padname++; if (*padname != '\0') pads = g_slist_prepend (pads, gst_parse_strdup (padname)); } $$.element=NULL; $$.name=$1; $$.pads=pads; } ; /************************************************************* * Grammar explanation: (cont'd) * a _chainlist_ is just a list of _chain_s. * * You can specify _link_s with named * _reference_ on each side. That * works already after the explanations above. * someSourceName.Pad ! someSinkName. * someSourceName.Pad,anotherPad ! someSinkName.Apad,Bpad * * If a syntax error occurs, the already finished _chain_s * and _links_ are kept intact. *************************************************************/ chainlist: /* NOP */ { $$ = NULL; } | chainlist chain { if ($1){ gst_parse_free_reference(&($1->last)); gst_parse_free_reference(&($2->first)); $2->first = $1->first; $2->elements = g_slist_concat ($1->elements, $2->elements); gst_parse_chain_free ($1); } $$ = $2; } | chainlist error { $$=$1; GST_CAT_DEBUG (GST_CAT_PIPELINE,"trying to recover from syntax error"); SET_ERROR (graph->error, GST_PARSE_ERROR_SYNTAX, _("syntax error")); } ; /************************************************************* * Grammar explanation: (cont'd) * _bins_ *************************************************************/ assignments: /* NOP */ { $$ = NULL; } | ASSIGNMENT assignments { $$ = g_slist_prepend ($2, $1); } ; binopener: '(' { $$ = gst_parse_strdup("bin"); } | BINREF { $$ = $1; } ; bin: binopener assignments chainlist ')' { chain_t *chain = $3; GSList *walk; GstBin *bin = (GstBin *) gst_element_factory_make ($1, NULL); if (!chain) { SET_ERROR (graph->error, GST_PARSE_ERROR_EMPTY_BIN, _("specified empty bin \"%s\", not allowed"), $1); chain = gst_parse_chain_new (); chain->first.element = chain->last.element = NULL; chain->first.name = chain->last.name = NULL; chain->first.pads = chain->last.pads = NULL; chain->elements = NULL; } if (!bin) { add_missing_element(graph, $1); SET_ERROR (graph->error, GST_PARSE_ERROR_NO_SUCH_ELEMENT, _("no bin \"%s\", unpacking elements"), $1); /* clear property-list */ g_slist_foreach ($2, (GFunc) gst_parse_strfree, NULL); g_slist_free ($2); $2 = NULL; } else { for (walk = chain->elements; walk; walk = walk->next ) gst_bin_add (bin, GST_ELEMENT (walk->data)); g_slist_free (chain->elements); chain->elements = g_slist_prepend (NULL, bin); } $$ = chain; /* set the properties now * HINT: property-list cleared above, if bin==NULL */ for (walk = $2; walk; walk = walk->next) gst_parse_element_set ((gchar *) walk->data, GST_ELEMENT (bin), graph); g_slist_free ($2); gst_parse_strfree ($1); } ; /************************************************************* * Grammar explanation: (cont'd) * _graph_ *************************************************************/ graph: chainlist { $$ = graph; $$->chain = $1; if(!$1) { SET_ERROR (graph->error, GST_PARSE_ERROR_EMPTY, _("empty pipeline not allowed")); } } ; %% static int yyerror (void *scanner, graph_t *graph, const 
char *s) { /* FIXME: This should go into the GError somehow, but how? */ GST_WARNING ("Error during parsing: %s", s); return -1; } GstElement * priv_gst_parse_launch (const gchar *str, GError **error, GstParseContext *ctx, GstParseFlags flags) { graph_t g; gchar *dstr; GSList *walk; GstBin *bin = NULL; GstElement *ret; yyscan_t scanner; g_return_val_if_fail (str != NULL, NULL); g_return_val_if_fail (error == NULL || *error == NULL, NULL); g.chain = NULL; g.links = NULL; g.error = error; g.ctx = ctx; g.flags = flags; #ifdef __GST_PARSE_TRACE GST_CAT_DEBUG (GST_CAT_PIPELINE, "TRACE: tracing enabled"); __strings = __chains = __links = 0; #endif /* __GST_PARSE_TRACE */ /* g_print("Now scanning: %s\n", str); */ dstr = g_strdup (str); priv_gst_parse_yylex_init (&scanner); priv_gst_parse_yy_scan_string (dstr, scanner); #if YYDEBUG yydebug = 1; #endif if (yyparse (scanner, &g) != 0) { SET_ERROR (error, GST_PARSE_ERROR_SYNTAX, "Unrecoverable syntax error while parsing pipeline %s", str); priv_gst_parse_yylex_destroy (scanner); g_free (dstr); goto error1; } priv_gst_parse_yylex_destroy (scanner); g_free (dstr); GST_CAT_DEBUG (GST_CAT_PIPELINE, "got %u elements and %u links", g.chain ? g_slist_length (g.chain->elements) : 0, g_slist_length (g.links)); /* ensure chain is not NULL */ if (!g.chain){ g.chain=gst_parse_chain_new (); g.chain->elements=NULL; g.chain->first.element=NULL; g.chain->first.name=NULL; g.chain->first.pads=NULL; g.chain->last.element=NULL; g.chain->last.name=NULL; g.chain->last.pads=NULL; }; /* ensure elements is not empty */ if(!g.chain->elements){ g.chain->elements= g_slist_prepend (NULL, NULL); }; /* put all elements in our bin if necessary */ if(g.chain->elements->next){ if (flags & GST_PARSE_FLAG_PLACE_IN_BIN) bin = GST_BIN (gst_element_factory_make ("bin", NULL)); else bin = GST_BIN (gst_element_factory_make ("pipeline", NULL)); g_assert (bin); for (walk = g.chain->elements; walk; walk = walk->next) { if (walk->data != NULL) gst_bin_add (bin, GST_ELEMENT (walk->data)); } g_slist_free (g.chain->elements); g.chain->elements = g_slist_prepend (NULL, bin); } ret = (GstElement *) g.chain->elements->data; g_slist_free (g.chain->elements); g.chain->elements=NULL; if (GST_IS_BIN (ret)) bin = GST_BIN (ret); gst_parse_free_chain (g.chain); g.chain = NULL; /* resolve and perform links */ for (walk = g.links; walk; walk = walk->next) { link_t *l = (link_t *) walk->data; int err; err=gst_resolve_reference( &(l->src), ret); if (err) { if(-1==err){ SET_ERROR (error, GST_PARSE_ERROR_NO_SUCH_ELEMENT, "No src-element named \"%s\" - omitting link", l->src.name); }else{ /* probably a missing element which we've handled already */ SET_ERROR (error, GST_PARSE_ERROR_NO_SUCH_ELEMENT, "No src-element found - omitting link"); } gst_parse_free_link (l); continue; } err=gst_resolve_reference( &(l->sink), ret); if (err) { if(-1==err){ SET_ERROR (error, GST_PARSE_ERROR_NO_SUCH_ELEMENT, "No sink-element named \"%s\" - omitting link", l->src.name); }else{ /* probably a missing element which we've handled already */ SET_ERROR (error, GST_PARSE_ERROR_NO_SUCH_ELEMENT, "No sink-element found - omitting link"); } gst_parse_free_link (l); continue; } gst_parse_perform_link (l, &g); } g_slist_free (g.links); out: #ifdef __GST_PARSE_TRACE GST_CAT_DEBUG (GST_CAT_PIPELINE, "TRACE: %u strings, %u chains and %u links left", __strings, __chains, __links); if (__strings || __chains || __links) { g_warning ("TRACE: %u strings, %u chains and %u links left", __strings, __chains, __links); } #endif /* __GST_PARSE_TRACE */ 
return ret; error1: if (g.chain) { gst_parse_free_chain (g.chain); g.chain=NULL; } g_slist_foreach (g.links, (GFunc)gst_parse_free_link, NULL); g_slist_free (g.links); if (error) g_assert (*error); ret = NULL; goto out; }
Hello Margarita,
Actually, I want to debug the errors, so I have modified the "grammar.y" file.
I have also already checked the "gst-inspect-1.0 waylandsink" log. Please check it below:
root@am57xx-evm:~/Testing# gst-inspect-1.0 waylandsink
Factory Details:
Rank marginal (64)
Long-name wayland video sink
Klass Sink/Video
Description Output to wayland surface
Author Sreerenj Balachandran <sreerenj.balachandran@intel.com>, George Kiagiadakis <george.kiagiadakis@collabora.com>
Plugin Details:
Name waylandsink
Description Wayland Video Sink
Filename /usr/lib/gstreamer-1.0/libgstwaylandsink.so
Version 1.12.2
License LGPL
Source module gst-plugins-bad
Source release date 2017-07-14
Binary package GStreamer Bad Plug-ins source release
Origin URL Unknown package origin
GObject
+----GInitiallyUnowned
+----GstObject
+----GstElement
+----GstBaseSink
+----GstVideoSink
+----GstWaylandSink
Implemented Interfaces:
GstVideoOverlay
GstWaylandVideo
Pad Templates:
SINK template: 'sink'
Availability: Always
Capabilities:
video/x-raw
format: { (string)BGRx, (string)BGRA, (string)RGBx, (string)xBGR, (string)xRGB, (string)RGBA, (string)ABGR, (string)ARGB, (string)RGB, (string)BGR, (string)RGB16, (string)BGR16, (string)YUY2, (string)YVYU, (string)UYVY, (string)AYUV, (string)NV12, (string)NV21, (string)NV16, (string)YUV9, (string)YVU9, (string)Y41B, (string)I420, (string)YV12, (string)Y42B, (string)v308 }
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw(memory:DMABuf)
format: { (string)BGRx, (string)BGRA, (string)RGBx, (string)xBGR, (string)xRGB, (string)RGBA, (string)ABGR, (string)ARGB, (string)RGB, (string)BGR, (string)RGB16, (string)BGR16, (string)YUY2, (string)YVYU, (string)UYVY, (string)AYUV, (string)NV12, (string)NV21, (string)NV16, (string)YUV9, (string)YVU9, (string)Y41B, (string)I420, (string)YV12, (string)Y42B, (string)v308 }
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
Element Flags:
no flags set
Element Implementation:
Has change_state() function: gst_wayland_sink_change_state
Element has no clocking capabilities.
Element has no URI handling capabilities.
Pads:
SINK: 'sink'
Pad Template: 'sink'
Element Properties:
name : The name of the object
flags: readable, writable
String. Default: "waylandsink0"
parent : The parent of the object
flags: readable, writable
Object of type "GstObject"
sync : Sync on the clock
flags: readable, writable
Boolean. Default: true
max-lateness : Maximum number of nanoseconds that a buffer can be late before it is dropped (-1 unlimited)
flags: readable, writable
Integer64. Range: -1 - 9223372036854775807 Default: 20000000
qos : Generate Quality-of-Service events upstream
flags: readable, writable
Boolean. Default: true
async : Go asynchronously to PAUSED
flags: readable, writable
Boolean. Default: true
ts-offset : Timestamp offset in nanoseconds
flags: readable, writable
Integer64. Range: -9223372036854775808 - 9223372036854775807 Default: 0
enable-last-sample : Enable the last-sample property
flags: readable, writable
Boolean. Default: true
last-sample : The last sample received in the sink
flags: readable
Boxed pointer of type "GstSample"
blocksize : Size in bytes to pull per buffer (0 = default)
flags: readable, writable
Unsigned Integer. Range: 0 - 4294967295 Default: 4096
render-delay : Additional render delay of the sink in nanoseconds
flags: readable, writable
Unsigned Integer64. Range: 0 - 18446744073709551615 Default: 0
throttle-time : The time to keep between rendered buffers (0 = disabled)
flags: readable, writable
Unsigned Integer64. Range: 0 - 18446744073709551615 Default: 0
max-bitrate : The maximum bits per second to render (0 = disabled)
flags: readable, writable
Unsigned Integer64. Range: 0 - 18446744073709551615 Default: 0
show-preroll-frame : Whether to render video frames during preroll
flags: readable, writable
Boolean. Default: true
display : Wayland display name to connect to, if not supplied via the GstContext
flags: readable, writable
String. Default: null
use-drm : Wayland Use DRM based memory for allocation
flags: writable
Boolean. Default: false Write only
window-resolution : resolution of video widthxheight
flags: writable
String. Default: "NULL" Write only
root@am57xx-evm:~/Testing#
Regards,
Kishan Patel.
src.zip
Hello Margarita,
I have made the changes in the source code as listed below, and I am now able to use the following pipelines:
Pipelines:
-------------
1).gst-launch-1.0 --gst-debug=vpe:3 -v rtspsrc location=rtsp://888888:888888@192.168.1.7:554 ! rtph264depay ! h264parse ! ducatih264dec ! vpe ! 'video/x-raw, format=(string)BGR, width=(int)640, height=(int)480' ! filesink location=test.bgr -v
2).gst-launch-1.0 --gst-debug=vpe:3 -v rtspsrc location=rtsp://888888:888888@192.168.1.7:554 ! rtph264depay ! h264parse ! ducatih264dec ! vpe ! 'video/x-raw, format=(string)RGB, width=(int)640, height=(int)480' ! filesink location=test.rgb -v
Please check the modified files for VPE, which I am attaching.
Path:(VPE)
--------------
tisdk/build/arago-tmp-external-linaro-toolchain/work/am57xx_evm-linux-gnueabi/gstreamer1.0-plugins-vpe/git-r2.19/git/src
Bitbake:
-----------
MACHINE=am57xx-evm bitbake gstreamer1.0-plugins-vpe --force -c compile
Replace Library:
---------------------
scp tisdk/build/arago-tmp-external-linaro-toolchain/work/am57xx_evm-linux-gnueabi/gstreamer1.0-plugins-vpe/git-r2.19/git/src/.libs/libgstvpe.so root@192.168.1.9:/usr/lib/gstreamer-1.0
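One way to confirm that the replaced libgstvpe.so is picked up on the target (a suggested check, assuming the RGB/BGR formats were added to the vpe pad templates) is:
gst-inspect-1.0 vpe | grep format
If the new formats do not show up, removing the GStreamer registry cache (for example rm -rf ~/.cache/gstreamer-1.0) and running gst-inspect-1.0 again forces a plugin rescan.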
Path:(Base)
---------------
tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/gst-plugins-base-1.12.2/gst-libs/gst/video/video-format.c
Add:
------
case GST_MAKE_FOURCC ('R', 'G', 'B', '\0'):
return GST_VIDEO_FORMAT_RGB;
case GST_MAKE_FOURCC ('B', 'G', 'R', '\0'):
return GST_VIDEO_FORMAT_BGR;
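For reference, a minimal sketch of where these two cases would sit inside gst_video_format_from_fourcc() in video-format.c; only the RGB/BGR cases are the actual addition, the surrounding cases and the default branch are just illustrative context:

GstVideoFormat
gst_video_format_from_fourcc (guint32 fourcc)
{
  switch (fourcc) {
    case GST_MAKE_FOURCC ('I', '4', '2', '0'):
      return GST_VIDEO_FORMAT_I420;
    case GST_MAKE_FOURCC ('N', 'V', '1', '2'):
      return GST_VIDEO_FORMAT_NV12;
    /* added: map the "RGB"/"BGR" fourccs to the packed 24-bit formats */
    case GST_MAKE_FOURCC ('R', 'G', 'B', '\0'):
      return GST_VIDEO_FORMAT_RGB;
    case GST_MAKE_FOURCC ('B', 'G', 'R', '\0'):
      return GST_VIDEO_FORMAT_BGR;
    default:
      return GST_VIDEO_FORMAT_UNKNOWN;
  }
}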
Bitbake:
-----------
MACHINE=am57xx-evm bitbake gstreamer1.0-plugins-base --force -c compile
Replace Library:
--------------------
-scp tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/build/gst/videoconvert/.libs/libgstvideoconvert.so root@192.168.1.9:/usr/lib/gstreamer-1.0/
-scp tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/build/gst/typefind/.libs/libgsttypefindfunctions.so root@192.168.1.9:/usr/lib/gstreamer-1.0/
-scp tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/build/gst/videotestsrc/.libs/libgstvideotestsrc.so root@192.168.1.9:/usr/lib/gstreamer-1.0/
-scp tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/build/gst/rawparse/.libs/libgstrawparse.so root@192.168.1.9:/usr/lib/gstreamer-1.0/
-scp tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/build/gst/pbtypes/.libs/libgstpbtypes.so root@192.168.1.9:/usr/lib/gstreamer-1.0/
-scp tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/build/gst/encoding/.libs/libgstencoding.so root@192.168.1.9:/usr/lib/gstreamer-1.0/
-scp tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/build/gst/playback/.libs/libgstplayback.so root@192.168.1.9:/usr/lib/gstreamer-1.0/
-scp tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/build/gst/videoscale/.libs/libgstvideoscale.so root@192.168.1.9:/usr/lib/gstreamer-1.0/
-scp tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/build/gst/videorate/.libs/libgstvideorate.so root@192.168.1.9:/usr/lib/gstreamer-1.0/
-scp tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/build/gst-libs/gst/pbutils/.libs/libgstpbutils-1.0.so.0.1202.0 root@192.168.1.9:/usr/lib
-scp tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/build/gst-libs/gst/video/.libs/libgstvideo-1.0.so.0.1202.0 root@192.168.1.9:/usr/lib
-scp tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/build/ext/theora/.libs/libgsttheora.so root@192.168.1.9:/usr/lib/gstreamer-1.0/
-scp tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/build/ext/ogg/.libs/libgstogg.so root@192.168.1.9:/usr/lib/gstreamer-1.0/
-scp tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0-plugins-base/1.12.2-r0/build/ext/pango/.libs/libgstpango.so root@192.168.1.9:/usr/lib/gstreamer-1.0/
Regards,
Kishan Patel.
FILE_PATH:
==========
tisdk/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/gstreamer1.0/1.12.2-r0/gstreamer-1.12.2/gst/gstquery.c

Function_name (with debug modification):
========================================
void
gst_query_parse_caps_result (GstQuery * query, GstCaps ** caps)
{
  GstStructure *structure;

  g_return_if_fail (GST_QUERY_TYPE (query) == GST_QUERY_CAPS);
  g_return_if_fail (caps != NULL);

  structure = GST_QUERY_STRUCTURE (query);
  printf ("\n\ngstquery.c:structure:\n%s\n\n", gst_structure_to_string (structure));
  *caps = g_value_get_boxed (gst_structure_id_get_value (structure, GST_QUARK (CAPS)));
}

Pipeline:
==========
gst-launch-1.0 -v rtspsrc location=rtsp://888888:888888@192.168.1.6:554 ! rtph264depay ! h264parse ! ducatih264dec ! vpe ! 'video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480' ! waylandsink

Debug_Issue:
============
gstquery.c:structure:
GstQueryCaps, filter=(GstCaps)"video/x-raw\,\ format\=\(string\)NV12\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]\;\ video/x-raw\,\ format\=\(string\)YUYV\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]\;\ video/x-raw\,\ format\=\(string\)BGR3\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]\;\ video/x-raw\,\ format\=\(string\)RGB3\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]\;\ video/x-raw\,\ format\=\(string\)BGR\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]\;\ video/x-raw\,\ format\=\(string\)RGB\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]\;\ video/x-raw\,\ format\=\(string\)YUY2\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]", caps=(GstCaps)"video/x-raw\,\ format\=\(string\)YUY2\,\ width\=\(int\)640\,\ height\=\(int\)480\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]";

Comments:
==========
- At the end of the debug print, we can see that caps=(GstCaps)"video/x-raw\,\ format\=\(string\)YUY2\,\ width\=\(int\)640\,\ height\=\(int\)480\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]";

Pipeline:
==========
gst-launch-1.0 -v rtspsrc location=rtsp://888888:888888@192.168.1.6:554 ! rtph264depay ! h264parse ! ducatih264dec ! vpe ! 'video/x-raw, format=(string)RGB, width=(int)640, height=(int)480' ! waylandsink

Debug_Issue:
============
gstquery.c:structure:
GstQueryCaps, filter=(GstCaps)"video/x-raw\,\ format\=\(string\)NV12\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]\;\ video/x-raw\,\ format\=\(string\)YUYV\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]\;\ video/x-raw\,\ format\=\(string\)BGR3\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]\;\ video/x-raw\,\ format\=\(string\)RGB3\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]\;\ video/x-raw\,\ format\=\(string\)BGR\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]\;\ video/x-raw\,\ format\=\(string\)RGB\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]\;\ video/x-raw\,\ format\=\(string\)YUY2\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]", caps=(GstCaps)EMPTY;

Comments:
==========
- At the end of the debug print, we can see that caps=(GstCaps)EMPTY;
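To experiment with how such a filter intersects with a requested caps string, a small host-side program can be used. Here is a minimal sketch; the file name caps-check.c, the build command, and the exact caps strings are only examples, not taken from the pipeline above:

/* caps-check.c: build with
 *   gcc caps-check.c -o caps-check $(pkg-config --cflags --libs gstreamer-1.0)
 */
#include <gst/gst.h>

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);

  /* one entry of the filter that vpe sends downstream */
  GstCaps *filter = gst_caps_from_string (
      "video/x-raw, format=(string)RGB, width=(int)[ 1, 2147483647 ], "
      "height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]");
  /* the caps requested by the capsfilter after vpe */
  GstCaps *wanted = gst_caps_from_string (
      "video/x-raw, format=(string)RGB, width=(int)640, height=(int)480");

  GstCaps *result = gst_caps_intersect (filter, wanted);
  gchar *s = gst_caps_to_string (result);

  g_print ("intersection: %s (empty: %d)\n", s, gst_caps_is_empty (result));

  g_free (s);
  gst_caps_unref (result);
  gst_caps_unref (wanted);
  gst_caps_unref (filter);
  return 0;
}

The same approach can be used with the caps from the waylandsink pad template shown above, to see whether the requested RGB caps are compatible with the sink.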
I am attaching one file that provides the details of the current issue, including the debug print, the file path, and the modified function's definition; I have mentioned all of it in this file.
Please check this file, and if you know what I should add to solve this issue, please tell me.
Hello,
I am sorry I was OoO.
kishan patel14 said:2).WARNING: erroneous pipeline: could not link videoconvert0 to waylandsink0, videoconvert0 can't handle caps video/x-raw, format=(string)BGR, width=(int)640, height=(int)480
I am sorry, but it is normal for this pipeline
gst-launch-1.0 -v videotestsrc ! video/x-raw, format=NV12, width=1280, height=720 ! videoconvert ! 'video/x-raw, format=(string)BGR, width=(int)640, height=(int)480' ! waylandsink
to fail with this error, since the resolution differs between the videoconvert input and output. You must add videoscale or change the resolution to be the same, like this:
gst-launch-1.0 -v videotestsrc ! video/x-raw, format=NV12, width=1280, height=720 ! videoconvert ! 'video/x-raw, format=(string)BGR, width=(int)1280, height=(int)720' ! fakesink
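If you want to keep the different input and output resolutions instead, adding videoscale should also work (not tested here), for example:
gst-launch-1.0 -v videotestsrc ! video/x-raw, format=NV12, width=1280, height=720 ! videoconvert ! videoscale ! 'video/x-raw, format=(string)BGR, width=(int)640, height=(int)480' ! fakesink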
Please try to replace waylandsink with fakesink
gst-launch-1.0 -v videotestsrc ! video/x-raw, format=NV12, width=1280, height=720 ! videoconvert ! 'video/x-raw, format=(string)BGR, width=(int)1280, height=(int)720' ! fakesink
If this is working, please recheck the changes you have made for waylandsink.
Please use the default video-format.c, not the file with your changes.
BR
Margarita
Hello,
Kishan, try to open this file in OpenCV. The resolution is 640x480.
Remove the "txt" at the end of the file name.
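Before opening it in OpenCV, a quick host-side sanity check of the dump can also help. A minimal sketch follows; it assumes the file holds one or more packed 640x480 BGR24 frames and that it is named test.bgr (adjust the name as needed):

#include <stdio.h>

int
main (void)
{
  /* assumption: packed BGR24, 640x480, no row padding */
  const long frame_size = 640L * 480L * 3L;
  FILE *f = fopen ("test.bgr", "rb");
  long size;

  if (!f) {
    perror ("test.bgr");
    return 1;
  }
  fseek (f, 0, SEEK_END);
  size = ftell (f);
  fclose (f);

  printf ("%ld bytes = %ld full frame(s) of 640x480 BGR24 (remainder %ld bytes)\n",
      size, size / frame_size, size % frame_size);
  return 0;
}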
BR
Margarita