
TDA4VH-Q1: Gstreamer Compositor Hardware Accelerator

Part Number: TDA4VH-Q1


Hi,

I am trying to mix multiple videos using the GStreamer compositor. When I combine full-HD videos, CPU utilization increases significantly. Please suggest whether the TDA4VH has any hardware accelerator for GStreamer video mixing.

I am using the "gst-launch-1.0" command with the "compositor" element to mix multiple "appsrc" elements.

  • Hi Suresh,

    You can use the tiovxmosaic plugin to combine multiple videos into one stream.

    https://github.com/TexasInstruments/edgeai-gst-plugins/wiki/tiovxmosaic 

    Best,
    Jared

  • Hi Jared McArthur,

    Thanks for replying. 

I want to combine multiple appsrc videos. I am following the example code available in vision_apps/apps/basic_demos/app_multi_cam_codec to save a video file using the H.264 encoder.

This example uses the GStreamer wrapper functions appCodecInit(), appCodecSrcInit(), appCodecStart(), appCodecDeqAppSrc(), appCodecEnqAppSrc(), and appCodecEnqEosAppSrc() to create an H.264-encoded video output from an appsrc source.

I want to use compositor/tiovxmosaic in a GStreamer pipeline together with these wrapper functions to create a combined video output from multiple video sources.

The command I used with appCodecInit() to mix one appsrc with a videotestsrc pattern is below.

    appsrc name=myAppSrc0 format=3 is-live=true do-timestamp=true  ! videoconvert ! video/x-raw, width=1920, height=1080 ! queue ! mix.sink_0  videotestsrc pattern=smpte ! video/x-raw,width=640,height=640 ! mix.sink_1 tiovxmosaic name=mix sink_0::xpos=0 sink_1::xpos=1920 ! videoconvert ! v4l2h264enc extra-controls=\"controls, frame_level_rate_control_enable=1, video_bitrate=10000000\"  !  h264parse ! mp4mux ! filesink location=output_video.mp4

This command creates an mp4 file of 0 bytes.

When I use the command below with the same flow, I get an mp4 file containing the appsrc camera images.

    appsrc name=myAppSrc0 format=3 is-live=true do-timestamp=true  ! videoconvert ! video/x-raw, width=1920, height=1080 ! queue ! videoconvert ! v4l2h264enc extra-controls=\"controls, frame_level_rate_control_enable=1, video_bitrate=10000000\"  !  h264parse ! mp4mux ! filesink location=output_video.mp4

Could you help with a command for using tiovxmosaic with appsrc source elements?

    Regards,

    Suresh K

  • Hi Suresh,

    I have used tiovxmosaic with an appsrc before without issue. It is currently supported.

What is the resolution that you are outputting?

Can you test by outputting to kmssink for now?

    Best,
    Jared

  • Hi Jared McArthur,

I am trying to create a file with width 3840 and height 2160; this is the resolution of the combined image.

The resolution of each individual input is 1920 x 1080.
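As an aside, the window positions for tiling fixed-size inputs into a larger mosaic (for example four 1920x1080 inputs into a 3840x2160 output) follow directly from the grid geometry, so they can be computed rather than hard-coded. A minimal sketch in plain C, independent of the GStreamer/tiovxmosaic API:

```c
#include <assert.h>

/* Compute the top-left corner (startx, starty) of tile `idx` in a grid
 * with `cols` columns, where every tile is tile_w x tile_h pixels. */
static void mosaic_tile_pos(int idx, int cols, int tile_w, int tile_h,
                            int *startx, int *starty)
{
    *startx = (idx % cols) * tile_w; /* column offset within the row */
    *starty = (idx / cols) * tile_h; /* row offset */
}
```

For a 2x2 grid of 1920x1080 tiles, tile 3 lands at (1920, 1080), which matches the sink_N::startx/starty values used in the pipelines in this thread.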

    Regards,

    Suresh K

  • Hi Suresh,

    video/x-raw,width=640,height=640

    It looks like your videotestsrc is the wrong size.

    Can you add a caps filter after the tiovxmosaic?

    video/x-raw, format=NV12, width=3840, height=2160

Can you also test by outputting to kmssink for now? It may help with debugging if you can see what you are actually outputting/displaying.

    Best,
    Jared

  • Thanks Jared McArthur,

I am able to mosaic the video streams using tiovxmosaic. The command is below.

     

    appsrc name=myAppSrc0 format=GST_FORMAT_TIME is-live=true do-timestamp=true  block=true ! queue! videoconvert ! video/x-raw, width=1280, height=720 ! comp.sink_0 appsrc name=myAppSrc1 format=GST_FORMAT_TIME is-live=true do-timestamp=true  block=true ! queue! videoconvert ! video/x-raw, width=1280, height=720 ! comp.sink_1 appsrc name=myAppSrc2 format=GST_FORMAT_TIME is-live=true do-timestamp=true  block=true ! queue! videoconvert ! video/x-raw, width=1280, height=720 ! comp.sink_2 appsrc name=myAppSrc3 format=GST_FORMAT_TIME is-live=true do-timestamp=true  block=true ! queue! videoconvert ! video/x-raw, width=1280, height=720 ! comp.sink_3 appsrc name=myAppSrc4 format=GST_FORMAT_TIME is-live=true do-timestamp=true  block=true ! queue! videoconvert ! video/x-raw, width=1280, height=720 ! comp.sink_4 appsrc name=myAppSrc5 format=GST_FORMAT_TIME is-live=true do-timestamp=true  block=true ! queue! videoconvert ! video/x-raw, width=1280, height=720 ! comp.sink_5 appsrc name=myAppSrc6 format=GST_FORMAT_TIME is-live=true do-timestamp=true  block=true ! queue! videoconvert ! video/x-raw, width=1280, height=720 ! comp.sink_6 tiovxmosaic name=comp sink_0::startx="<0>" sink_0::starty="<0>" sink_1::startx="<1280>" sink_1::starty="<0>" sink_2::startx="<2560>" sink_2::starty="<0>" sink_3::startx="<0>" sink_3::starty="<720>" sink_4::startx="<1280>" sink_4::starty="<720>" sink_5::startx="<2560>" sink_5::starty="<720>" ! videoconvert ! v4l2h264enc extra-controls="controls, frame_level_rate_control_enable=1, video_bitrate=10000000" ! h264parse ! mp4mux ! filesink location=output_file_1.mp4

All appsrc inputs have the same resolution of 1280x720. I want to combine videos of different resolutions; I tried changing the caps to different resolutions, but it does not generate an output file.

Please suggest whether I can use tiovxmosaic to combine videos with different resolutions.

Regards,

    Suresh K

     

  • Hi Suresh,

    You can use different resolutions, but it cannot upscale. It can only downscale.
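The downscale-only rule gives a simple validity check: each mosaic window must be no larger than the input feeding it. A small hypothetical helper (not part of the tiovxmosaic API) that encodes this constraint:

```c
#include <stdbool.h>

/* Returns true if an input of in_w x in_h can legally feed a mosaic
 * window of win_w x win_h, given that the mosaic can only downscale
 * (or pass through), never upscale. */
static bool mosaic_window_ok(int in_w, int in_h, int win_w, int win_h)
{
    return win_w <= in_w && win_h <= in_h;
}
```

For example, a 1920x1080 input into a 1280x720 window is valid (downscale), while a 640x480 input into a 1280x720 window is not (upscale).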

    Best,
    Jared

  • Hi Jared McArthur,

Thanks for the reply. I want to combine multiple videos with different resolutions as appsrc inputs, following the gst_wrapper.c file available in RTOS SDK 11_00_00_06. In that file, the input width and height are configured for only one channel. To support a different resolution per appsrc channel, I updated the function appGstSrcInit() in gst_wrapper.c as below, changing in_width and in_height into arrays that hold a value for each channel.
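The per-channel change described above amounts to computing one NV12 plane size per channel instead of a single global one. A minimal sketch of the underlying math (assuming the per-channel in_width[ch]/in_height[ch] fields described above):

```c
#include <stdint.h>

/* NV12 layout: plane 0 (Y) is width*height bytes; plane 1 (interleaved
 * UV, subsampled 2x2) is width*height/2 bytes. */
static uint32_t nv12_plane_size(uint32_t width, uint32_t height, int plane)
{
    uint32_t y_size = width * height;
    return (plane == 0) ? y_size : y_size / 2;
}
```

In the wrapper this would be evaluated once per channel, e.g. plane_size[ch] = nv12_plane_size(in_width[ch], in_height[ch], 0), so the gst_memory_new_wrapped() calls wrap the correct number of bytes for each input.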

    Application code is updated with the gst command as below.

    appsrc name=myAppSrc0 format=GST_FORMAT_TIME is-live=true do-timestamp=true  block=true ! queue! videoconvert ! video/x-raw, width=1280, height=720 ! comp.sink_0 appsrc name=myAppSrc1 format=GST_FORMAT_TIME is-live=true do-timestamp=true  block=true ! queue! videoconvert ! video/x-raw, width=640, height=720 ! comp.sink_1 appsrc name=myAppSrc2 format=GST_FORMAT_TIME is-live=true do-timestamp=true  block=true ! queue! videoconvert ! video/x-raw, width=1280, height=720 ! comp.sink_2 appsrc name=myAppSrc3 format=GST_FORMAT_TIME is-live=true do-timestamp=true  block=true ! queue! videoconvert ! video/x-raw, width=640, height=720 ! comp.sink_3 appsrc name=myAppSrc4 format=GST_FORMAT_TIME is-live=true do-timestamp=true  block=true ! queue! videoconvert ! video/x-raw, width=1280, height=720 ! comp.sink_4 appsrc name=myAppSrc5 format=GST_FORMAT_TIME is-live=true do-timestamp=true  block=true ! queue! videoconvert ! video/x-raw, width=1280, height=720 ! comp.sink_5 appsrc name=myAppSrc6 format=GST_FORMAT_TIME is-live=true do-timestamp=true  block=true ! queue! videoconvert ! video/x-raw, width=1280, height=720 ! comp.sink_6 tiovxmosaic name=comp sink_0::startx="<0>" sink_0::starty="<0>" sink_1::startx="<1280>" sink_1::starty="<0>" sink_2::startx="<2560>" sink_2::starty="<0>" sink_3::startx="<0>" sink_3::starty="<720>" sink_4::startx="<1280>" sink_4::starty="<720>" sink_5::startx="<2560>" sink_5::starty="<720>" ! videoconvert ! v4l2h264enc extra-controls="controls, frame_level_rate_control_enable=1, video_bitrate=10000000" ! h264parse ! mp4mux ! filesink location=output_file.mp4

     

The above command stops at the second iteration of enqueuing the dataptr idx values, but the same command works fine and generates an output file when all widths and heights are configured as 1280 and 720 respectively.

Please suggest whether I need to update the wrapper functions in gst_wrapper.c to accept inputs of different resolutions.

Also, please suggest how to check the H.264 encoder hardware load on the TDA4.

    Regards,

    Suresh K

    /*
     *  Copyright (C) 2021 Texas Instruments Incorporated - http://www.ti.com/
     *
     *  Redistribution and use in source and binary forms, with or without
     *  modification, are permitted provided that the following conditions
     *  are met:
     *
     *    Redistributions of source code must retain the above copyright
     *    notice, this list of conditions and the following disclaimer.
     *
     *    Redistributions in binary form must reproduce the above copyright
     *    notice, this list of conditions and the following disclaimer in the
     *    documentation and/or other materials provided with the
     *    distribution.
     *
     *    Neither the name of Texas Instruments Incorporated nor the names of
     *    its contributors may be used to endorse or promote products derived
     *    from this software without specific prior written permission.
     *
     *  THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
     *  "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
     *  LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
     *  A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
     *  OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
     *  SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
     *  LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
     *  DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
     *  THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
     *  (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
     *  OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
     */
    #define WIDTH 320
    #define HEIGHT 240
    #define FPS 30
    
    #include "gst_wrapper_priv.h"
    
    #define GST_TIMEOUT (400 * GST_MSECOND)
    
    app_gst_wrapper_obj_t g_app_gst_wrapper_obj;
    
    static GstElement *findElementByName(GstElement *pipeline, const char *name)
    {
        GstElement *elem;
    
        elem = gst_bin_get_by_name(GST_BIN(pipeline), name);
    
        if (elem == NULL)
        {
            printf("gst_wrapper: Could not find element <%s> in the pipeline.\n", name);
        }
    
        return elem;
    }
    
    static int32_t exportgsttiovxbuffer(GstBuffer *buf, void *data_ptr[CODEC_MAX_NUM_PLANES])
    {
        vx_status status = VX_SUCCESS;
        void *p_status = NULL;
        GstTIOVXImageMeta *tiovxmeta = NULL;
        vx_reference img1;
        vx_enum img1_type = VX_TYPE_INVALID;
        uint32_t img1_num_planes = 0;
        uint32_t sizes[CODEC_MAX_NUM_PLANES] = {0};
    
        tiovxmeta = (GstTIOVXImageMeta *)gst_buffer_iterate_meta(buf, &p_status);
        if (!tiovxmeta)
        {
            printf("gst_wrapper: ERROR: TIOVX meta not found in pulled buffer!\n");
            return -1;
        }
    
        img1 = vxGetObjectArrayItem(tiovxmeta->array, 0);
        status = vxGetStatus((vx_reference)img1);
        if (status != VX_SUCCESS)
        {
            printf("gst_wrapper: ERROR: Could not get vx_reference from TIOVX meta!\n");
            return status;
        }
    
        status = vxQueryReference((vx_reference)img1, (vx_enum)VX_REFERENCE_TYPE, &img1_type, sizeof(vx_enum));
        if (VX_SUCCESS != status)
        {
            printf("gst_wrapper: ERROR: Failed to verify VX_REFERENCE_TYPE!\n");
            vxReleaseReference(&img1);
            return status;
        }
        else if (VX_TYPE_IMAGE != img1_type)
        {
            printf("gst_wrapper: ERROR: vx_reference is not a vx_image!\n");
            vxReleaseReference(&img1);
            return VX_ERROR_INVALID_TYPE;
        }
    
        status = tivxReferenceExportHandle(
            (vx_reference)img1,
            data_ptr,
            sizes,
            CODEC_MAX_NUM_PLANES,
            &img1_num_planes);
        if (VX_SUCCESS != status)
        {
            printf("gst_wrapper: ERROR: Could not export handles from vx_image!\n");
            vxReleaseReference(&img1);
            return status;
        }
    
        vxReleaseReference(&img1);
        return status;
    }
    
    int32_t appGstInit(app_codec_wrapper_params_t *params)
    {
        int32_t status = 0;
        app_gst_wrapper_obj_t *p_gst_pipe_obj = &g_app_gst_wrapper_obj;
    
        if (strcmp(params->in_format, "NV12") || strcmp(params->out_format, "NV12"))
        {
            printf("gst_wrapper: Error: Only NV12 format supported!\n");
            status = -1;
        }
    
        /* strlen(), not sizeof(): sizeof an array member is always non-zero */
        if (status == 0 && (strlen(params->m_cmdString) != 0))
        {
            printf("gst_wrapper: GstCmdString:\n%s\n", params->m_cmdString);
            p_gst_pipe_obj->params = *params;
        }
        else
            status = -1;
    
        p_gst_pipe_obj->m_pipeline = NULL;
        if (status == 0)
        {
            /* GStreamer INIT  */
            gst_init(NULL, NULL);
    
            p_gst_pipe_obj->m_pipeline = gst_parse_launch(p_gst_pipe_obj->params.m_cmdString, NULL);
            if (p_gst_pipe_obj->m_pipeline == NULL)
            {
                printf("gst_wrapper: gst_parse_launch() failed:\n%s\n", p_gst_pipe_obj->params.m_cmdString);
                status = -1;
            }
        }
    
        p_gst_pipe_obj->isAppSrc = 0;
        p_gst_pipe_obj->isAppSink = 0;
        p_gst_pipe_obj->push_count = -1;
        p_gst_pipe_obj->pull_count = -1;
    
        return status;
    }
    
    int32_t appGstSrcInit(void *data_ptr[CODEC_MAX_BUFFER_DEPTH][CODEC_MAX_NUM_CHANNELS][CODEC_MAX_NUM_PLANES])
    {
        int32_t status = 0;
        GstCaps *caps = NULL;
        app_gst_wrapper_obj_t *p_gst_pipe_obj = &g_app_gst_wrapper_obj;
    
    #ifndef MOSAIC_PARAM
        uint32_t plane_size;
        plane_size = p_gst_pipe_obj->params.in_width * p_gst_pipe_obj->params.in_height;
        printf("GST Wrapper: Width and Height  is %d\t %d  \n",  p_gst_pipe_obj->params.in_width, p_gst_pipe_obj->params.in_height);
    #endif
    #ifdef MOSAIC_PARAM
        uint32_t plane_size[CODEC_MAX_NUM_CHANNELS];
    #endif
        /* Setup AppSrc Elements */
        for (uint8_t ch = 0; ch < p_gst_pipe_obj->params.in_num_channels; ch++)
        {
    #ifdef MOSAIC_PARAM
            plane_size[ch] = p_gst_pipe_obj->params.in_width[ch] * p_gst_pipe_obj->params.in_height[ch];
            printf("GST Wrapper: Width and Height of %d Channel is %d\t %d  \n", ch, p_gst_pipe_obj->params.in_width[ch], p_gst_pipe_obj->params.in_height[ch]);
    #endif
    
            p_gst_pipe_obj->m_srcElemArr[ch] = findElementByName(p_gst_pipe_obj->m_pipeline,
                                                                 p_gst_pipe_obj->params.m_AppSrcNameArr[ch]);
            if (p_gst_pipe_obj->m_srcElemArr[ch] == NULL)
            {
                printf("gst_wrapper: findElementByName() FAILED! %s not found\n", p_gst_pipe_obj->params.m_AppSrcNameArr[ch]);
                status = -1;
            }
            else
            {
                p_gst_pipe_obj->isAppSrc = GST_IS_APP_SRC(p_gst_pipe_obj->m_srcElemArr[ch]);
                if (p_gst_pipe_obj->isAppSrc)
                {
                #ifdef MOSAIC_PARAM
                    /* Per-channel caps: each appsrc must advertise its own
                     * resolution, otherwise the caps will not match the
                     * buffers pushed for that channel */
                    caps = gst_caps_new_simple("video/x-raw",
                                               "width", G_TYPE_INT, p_gst_pipe_obj->params.in_width[ch],
                                               "height", G_TYPE_INT, p_gst_pipe_obj->params.in_height[ch],
                                               "format", G_TYPE_STRING, p_gst_pipe_obj->params.in_format,
                                               NULL);
                #else
                    caps = gst_caps_new_simple("video/x-raw",
                                               "width", G_TYPE_INT, p_gst_pipe_obj->params.in_width,
                                               "height", G_TYPE_INT, p_gst_pipe_obj->params.in_height,
                                               "format", G_TYPE_STRING, p_gst_pipe_obj->params.in_format,
                                               NULL);
                #endif
                    if (caps == NULL)
                    {
                        printf("gst_wrapper: gst_caps_new_simple() FAILED!\n");
                        status = -1;
                    }
                    else
                    {
                        gst_app_src_set_caps(GST_APP_SRC(p_gst_pipe_obj->m_srcElemArr[ch]), caps);
                        gst_caps_unref(caps);
                    }
                }
                else
                {
                    printf("gst_wrapper: %s not an AppSrc element\n", p_gst_pipe_obj->params.m_AppSrcNameArr[ch]);
                    status = -1;
                }
            }
        }
    
        /* Setup GstBuffers to push to AppSrc Elements using given data_ptrs */
        if (status == 0)
        {
            for (uint8_t idx = 0; idx < p_gst_pipe_obj->params.in_buffer_depth && status == 0; idx++)
            {
                for (uint8_t ch = 0; ch < p_gst_pipe_obj->params.in_num_channels && status == 0; ch++)
                {
                    p_gst_pipe_obj->buff[idx][ch] = gst_buffer_new();
                    #ifdef MOSAIC_PARAM
                    p_gst_pipe_obj->mem[idx][ch][0] = gst_memory_new_wrapped(0, data_ptr[idx][ch][0], plane_size[ch], 0, plane_size[ch], NULL, NULL);
                    p_gst_pipe_obj->mem[idx][ch][1] = gst_memory_new_wrapped(0, data_ptr[idx][ch][1], plane_size[ch] / 2, 0, plane_size[ch] / 2, NULL, NULL);
                    #endif
                    #ifndef MOSAIC_PARAM
                    p_gst_pipe_obj->mem[idx][ch][0] = gst_memory_new_wrapped(0, data_ptr[idx][ch][0], plane_size, 0, plane_size, NULL, NULL);
                    p_gst_pipe_obj->mem[idx][ch][1] = gst_memory_new_wrapped(0, data_ptr[idx][ch][1], plane_size / 2, 0, plane_size / 2, NULL, NULL);
                    #endif                
                    gst_buffer_append_memory(p_gst_pipe_obj->buff[idx][ch], p_gst_pipe_obj->mem[idx][ch][0]);
                    gst_buffer_append_memory(p_gst_pipe_obj->buff[idx][ch], p_gst_pipe_obj->mem[idx][ch][1]);
    
                    p_gst_pipe_obj->mem[idx][ch][0] = gst_buffer_get_memory(p_gst_pipe_obj->buff[idx][ch], 0);
                    p_gst_pipe_obj->mem[idx][ch][1] = gst_buffer_get_memory(p_gst_pipe_obj->buff[idx][ch], 1);
    
                    gst_memory_map(p_gst_pipe_obj->mem[idx][ch][0], &p_gst_pipe_obj->map_info[idx][ch][0], GST_MAP_WRITE);
                    gst_memory_map(p_gst_pipe_obj->mem[idx][ch][1], &p_gst_pipe_obj->map_info[idx][ch][1], GST_MAP_WRITE);
                }
            }
        }
    
        if (status == 0)
            p_gst_pipe_obj->push_count = 0;
    
        return status;
    }
    
    int32_t appGstSinkInit(void *(*data_ptr)[CODEC_MAX_NUM_CHANNELS][CODEC_MAX_NUM_PLANES])
    {
        int32_t status = 0;
        app_gst_wrapper_obj_t *p_gst_pipe_obj = &g_app_gst_wrapper_obj;
    
        /* Setup AppSink Elements */
        for (uint8_t ch = 0; ch < p_gst_pipe_obj->params.out_num_channels; ch++)
        {
            p_gst_pipe_obj->m_sinkElemArr[ch] = findElementByName(p_gst_pipe_obj->m_pipeline,
                                                                  p_gst_pipe_obj->params.m_AppSinkNameArr[ch]);
            if (p_gst_pipe_obj->m_sinkElemArr[ch] == NULL)
            {
                printf("gst_wrapper: findElementByName() FAILED! %s not found\n", p_gst_pipe_obj->params.m_AppSinkNameArr[ch]);
                status = -1;
            }
            else
            {
                p_gst_pipe_obj->isAppSink = GST_IS_APP_SINK(p_gst_pipe_obj->m_sinkElemArr[ch]);
                if (p_gst_pipe_obj->isAppSink == 0)
                {
                    printf("gst_wrapper: %s not an AppSink element\n", p_gst_pipe_obj->params.m_AppSinkNameArr[ch]);
                    status = -1;
                }
            }
        }
    
        /* Setup internal pointers to point to given data_ptrs, where the pulled data will be made available */
        if (status == 0)
        {
            p_gst_pipe_obj->pulled_data_ptr = data_ptr;
        }
    
        if (status == 0)
            p_gst_pipe_obj->pull_count = 0;
    
        return status;
    }
    
    int32_t appGstStart()
    {
        app_gst_wrapper_obj_t *p_gst_pipe_obj = &g_app_gst_wrapper_obj;
    
        /* Set pipeline state to PLAYING */
        GstStateChangeReturn ret = gst_element_set_state(p_gst_pipe_obj->m_pipeline, GST_STATE_PLAYING);
        if (ret == GST_STATE_CHANGE_FAILURE)
        {
            printf("gst_wrapper: gst_element_set_state() FAILED! ... GST pipe not playing.\n");
            return -1;
        }
        return 0;
    }
    
    int32_t appGstEnqAppSrc(uint8_t idx)
    {
        int32_t status = 0;
        app_gst_wrapper_obj_t *p_gst_pipe_obj = &g_app_gst_wrapper_obj;
    
        if (p_gst_pipe_obj->isAppSrc == 0)
        {
            printf("Gst Pipeline not initialised correctly: isAppSrc = 0 !\n");
            return -1;
        }
        for (uint8_t ch = 0; ch < p_gst_pipe_obj->params.in_num_channels; ch++)
        {
            gst_memory_unmap(p_gst_pipe_obj->mem[idx][ch][0], &p_gst_pipe_obj->map_info[idx][ch][0]);
            gst_memory_unmap(p_gst_pipe_obj->mem[idx][ch][1], &p_gst_pipe_obj->map_info[idx][ch][1]);
    
            if (status == 0)
            {
                GstFlowReturn ret;
    
                gst_buffer_ref(p_gst_pipe_obj->buff[idx][ch]);
                ret = gst_app_src_push_buffer(GST_APP_SRC(p_gst_pipe_obj->m_srcElemArr[ch]), p_gst_pipe_obj->buff[idx][ch]);
                if (ret != GST_FLOW_OK)
                {
                    printf("gst_wrapper: Pushing buffer to AppSrc returned %d instead of GST_FLOW_OK:%d\n", ret, GST_FLOW_OK);
                    status = -1;
                }
            }
        }
    
        if (status == 0)
            p_gst_pipe_obj->push_count++;
    
        return status;
    }
    
    int32_t appGstDeqAppSrc(uint8_t idx)
    {
        int32_t status = 0;
        app_gst_wrapper_obj_t *p_gst_pipe_obj = &g_app_gst_wrapper_obj;
        uint8_t refcount;
        if (p_gst_pipe_obj->isAppSrc == 0)
        {
            printf("Gst Pipeline not initialised correctly: isAppSrc = 0 !\n");
            return -1;
        }
        for (uint8_t ch = 0; ch < p_gst_pipe_obj->params.in_num_channels; ch++)
        {
            /* Busy-wait until downstream releases its references to the buffer
             * and its memories before re-mapping them for the next frame */
            refcount = GST_MINI_OBJECT_REFCOUNT_VALUE(&p_gst_pipe_obj->buff[idx][ch]->mini_object);
            while (refcount > 1)
            {
                refcount = GST_MINI_OBJECT_REFCOUNT_VALUE(&p_gst_pipe_obj->buff[idx][ch]->mini_object);
            }
            refcount = GST_MINI_OBJECT_REFCOUNT_VALUE(&p_gst_pipe_obj->mem[idx][ch][1]->mini_object);
            while (refcount > 2)
            {
                refcount = GST_MINI_OBJECT_REFCOUNT_VALUE(&p_gst_pipe_obj->mem[idx][ch][1]->mini_object);
            }
            refcount = GST_MINI_OBJECT_REFCOUNT_VALUE(&p_gst_pipe_obj->mem[idx][ch][0]->mini_object);
            while (refcount > 2)
            {
                refcount = GST_MINI_OBJECT_REFCOUNT_VALUE(&p_gst_pipe_obj->mem[idx][ch][0]->mini_object);
            }
    
            if (!gst_memory_map(p_gst_pipe_obj->mem[idx][ch][0], &p_gst_pipe_obj->map_info[idx][ch][0], GST_MAP_WRITE))
            {
                status = -1;
                printf("Memory map error in GST Wrapper, line %d\n", __LINE__);
                break;
            }
            if (!gst_memory_map(p_gst_pipe_obj->mem[idx][ch][1], &p_gst_pipe_obj->map_info[idx][ch][1], GST_MAP_WRITE))
            {
                status = -1;
                printf("Memory map error in GST Wrapper, line %d\n", __LINE__);
                break;
            }
        }
        return status;
    }
    
    int32_t appGstEnqEosAppSrc()
    {
        GstFlowReturn ret;
        int32_t status = 0;
        app_gst_wrapper_obj_t *p_gst_pipe_obj = &g_app_gst_wrapper_obj;
        if (p_gst_pipe_obj->isAppSrc == 0)
        {
            printf("Gst Pipeline not initialised correctly: isAppSrc = 0 !\n");
            return -1;
        }
        for (uint8_t ch = 0; ch < p_gst_pipe_obj->params.in_num_channels; ch++)
        {
            printf("GST Wrapper: Pushing EOS to AppSrc %d \n", __LINE__);
            ret = gst_app_src_end_of_stream(GST_APP_SRC(p_gst_pipe_obj->m_srcElemArr[ch]));
            if (ret != GST_FLOW_OK)
            {
                printf("gst_wrapper: Pushing EOS to AppSrc returned %d instead of GST_FLOW_OK:%d\n", ret, GST_FLOW_OK);
                status = -1;
            }
        }
    
        #ifdef MOSAIC_PARAM
        {
            /* gst_element_send_event() returns a gboolean, not a GstFlowReturn */
            gboolean sent = gst_element_send_event(p_gst_pipe_obj->m_pipeline, gst_event_new_eos());
            if (!sent)
            {
                printf("gst_wrapper: Sending EOS event to the pipeline FAILED! line no %d\n", __LINE__);
                status = -1;
            }
            printf("gst_wrapper: Pushed EOS to Pipeline %d\n", __LINE__);
        }
        #endif
        return status;
    }
    
    int32_t appGstDeqAppSink(uint8_t idx)
    {
        GstSample *out_sample = NULL;
        int32_t status = 0;
        app_gst_wrapper_obj_t *p_gst_pipe_obj = &g_app_gst_wrapper_obj;
    
        if (p_gst_pipe_obj->isAppSink == 0)
        {
            printf("Gst Pipeline not initialised correctly: isAppSink = 0 !\n");
            return -1;
        }
        for (uint8_t ch = 0; ch < p_gst_pipe_obj->params.out_num_channels; ch++)
        {
            /* Pull Sample from AppSink element */
            out_sample = gst_app_sink_try_pull_sample(GST_APP_SINK(p_gst_pipe_obj->m_sinkElemArr[ch]), GST_TIMEOUT);
            if (out_sample)
            {
                p_gst_pipe_obj->pulled_buff[idx][ch] = gst_sample_get_buffer(out_sample);
    
                status = exportgsttiovxbuffer(p_gst_pipe_obj->pulled_buff[idx][ch], (p_gst_pipe_obj->pulled_data_ptr)[idx][ch]);
                if (status != 0)
                {
                    printf("gst_wrapper: exportgsttiovxbuffer FAILED!\n");
                    break;
                }
    
                gst_buffer_ref(p_gst_pipe_obj->pulled_buff[idx][ch]);
                gst_sample_unref(out_sample);
            }
            else if (gst_app_sink_is_eos(GST_APP_SINK(p_gst_pipe_obj->m_sinkElemArr[ch])))
            {
                status = 1;
                break;
            }
            else
            {
                printf("gst_wrapper: WARNING: gst_app_sink_pull_sample() FAILED!\n");
                status = -1;
                break;
            }
        }
    
        if (status == 0)
            p_gst_pipe_obj->pull_count++;
    
        return status;
    }
    
    int32_t appGstEnqAppSink(uint8_t idx)
    {
        app_gst_wrapper_obj_t *p_gst_pipe_obj = &g_app_gst_wrapper_obj;
    
        if (p_gst_pipe_obj->isAppSink == 0)
        {
            printf("Gst Pipeline not initialised correctly: isAppSink = 0 !\n");
            return -1;
        }
        for (uint8_t ch = 0; ch < p_gst_pipe_obj->params.out_num_channels; ch++)
        {
            if (p_gst_pipe_obj->pulled_buff[idx][ch] != NULL)
            {
                for (int32_t p = 0; p < p_gst_pipe_obj->params.out_num_planes; p++)
                {
                    (p_gst_pipe_obj->pulled_data_ptr)[idx][ch][p] = NULL;
                }
                gst_buffer_unref(p_gst_pipe_obj->pulled_buff[idx][ch]);
                p_gst_pipe_obj->pulled_buff[idx][ch] = NULL;
            }
        }
        return 0;
    }
    
    int32_t appGstStop()
    {
        GstStateChangeReturn ret;
        app_gst_wrapper_obj_t *p_gst_pipe_obj = &g_app_gst_wrapper_obj;
        printf("GST Wrapper: Stop Codec called %d \n", __LINE__);
        // if (p_gst_pipe_obj->isAppSink == 0)
        {
            GstBus *bus;
            GstMessage *msg;
            bus = gst_element_get_bus(p_gst_pipe_obj->m_pipeline);
            printf("GST Wrapper: Stop Codec called %d \n ", __LINE__);
    
            msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
                                             GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
            printf("GST Wrapper: Stop Codec called %d \n", __LINE__);
    
            if (GST_MESSAGE_TYPE(msg) == GST_MESSAGE_EOS)
            {
                printf("gst_wrapper: Got EOS from pipeline!\n");
            }
            else if (GST_MESSAGE_TYPE(msg) == GST_MESSAGE_ERROR)
            {
                printf("gst_wrapper: An error occurred! Re-run with the GST_DEBUG=*:WARN environment "
                       "variable set for more details.\n");
            }
            printf("GST Wrapper: Stop Codec called %d \n", __LINE__);
            /* Free resources */
            gst_message_unref(msg);
            printf("GST Wrapper: Stop Codec called %d \n", __LINE__);
            gst_object_unref(bus);
            printf("GST Wrapper: Stop Codec called %d \n", __LINE__);
        }
    
        /* Set pipeline state to NULL */
        ret = gst_element_set_state(p_gst_pipe_obj->m_pipeline, GST_STATE_NULL);
        if (ret == GST_STATE_CHANGE_FAILURE)
        {
            printf("gst_wrapper: GST pipe set state NULL failed.\n");
            return -1;
        }
        return 0;
    }
    
    void appGstDeInit()
    {
        app_gst_wrapper_obj_t *p_gst_pipe_obj = &g_app_gst_wrapper_obj;
    
        if (p_gst_pipe_obj->isAppSink)
        {
            for (uint8_t ch = 0; ch < p_gst_pipe_obj->params.out_num_channels; ch++)
            {
                gst_object_unref(p_gst_pipe_obj->m_sinkElemArr[ch]);
            }
        }
        if (p_gst_pipe_obj->isAppSrc)
        {
            for (uint8_t ch = 0; ch < p_gst_pipe_obj->params.in_num_channels; ch++)
            {
                for (uint8_t idx = 0; idx < p_gst_pipe_obj->params.in_buffer_depth; idx++)
                {
                    gst_memory_unmap(p_gst_pipe_obj->mem[idx][ch][0], &p_gst_pipe_obj->map_info[idx][ch][0]);
                    gst_memory_unmap(p_gst_pipe_obj->mem[idx][ch][1], &p_gst_pipe_obj->map_info[idx][ch][1]);
    
                    gst_memory_unref(p_gst_pipe_obj->mem[idx][ch][0]);
                    gst_memory_unref(p_gst_pipe_obj->mem[idx][ch][1]);
    
                    gst_buffer_unref(p_gst_pipe_obj->buff[idx][ch]);
                }
                gst_object_unref(p_gst_pipe_obj->m_srcElemArr[ch]);
            }
        }
        gst_object_unref(p_gst_pipe_obj->m_pipeline);
    }
    
    void appGstPrintStats()
    {
        app_gst_wrapper_obj_t *p_gst_pipe_obj = &g_app_gst_wrapper_obj;
    
        printf("GST_WRAPPER PUSH/PULL COUNTS:\n");
        printf("\n");
        printf("Push count : %d\n", p_gst_pipe_obj->push_count);
        printf("Pull count : %d\n", p_gst_pipe_obj->pull_count);
        printf("\n");
    }
    
    /* Frame generation for testing */
    
    void generate_test_pattern(guchar *data, guint size, guint frame_num, gboolean red)
    {
        memset(data, 0, size);
    
        /* Fill the RGB24 frame with a solid color: red or blue */
        for (int y = 0; y < HEIGHT; y++)
        {
            for (int x = 0; x < WIDTH; x++)
            {
                int offset = (y * WIDTH + x) * 3;
                data[offset + 0] = red ? 255 : 0; // R
                data[offset + 1] = 0;             // G
                data[offset + 2] = red ? 0 : 255; // B
            }
        }
    }
    
    /* Push video frames to appsrc */
    
    gboolean push_data(GstElement *appsrc, gboolean red, guint *frame_count)
    {
        GstBuffer *buffer;
        GstFlowReturn ret;
        GstMapInfo map;
        guint size = WIDTH * HEIGHT * 3;
    
        buffer = gst_buffer_new_allocate(NULL, size, NULL);
    
        gst_buffer_map(buffer, &map, GST_MAP_WRITE);
        generate_test_pattern(map.data, size, *frame_count, red);
        gst_buffer_unmap(buffer, &map);
    
        /* Timestamp for a 30 fps stream */
        GST_BUFFER_PTS(buffer) = gst_util_uint64_scale(*frame_count, GST_SECOND, 30);
        GST_BUFFER_DURATION(buffer) = gst_util_uint64_scale(1, GST_SECOND, 30);
    
        g_signal_emit_by_name(appsrc, "push-buffer", buffer, &ret);
        gst_buffer_unref(buffer);
    
        (*frame_count)++;
        return ret == GST_FLOW_OK;
    }
    
    gboolean feed_data(gpointer user_data)
    {
        static guint frame_count = 0;
        GstElement **sources = (GstElement **)user_data;
    
        if (!push_data(sources[0], TRUE, &frame_count))
            return G_SOURCE_REMOVE;
    #if 0
        if (!push_data(sources[1], FALSE, &frame_count))
            return G_SOURCE_REMOVE;
        if (!push_data(sources[2], TRUE, &frame_count))
            return G_SOURCE_REMOVE;
    
        if (!push_data(sources[3], FALSE, &frame_count))
            return G_SOURCE_REMOVE;
        if (!push_data(sources[4], TRUE, &frame_count))
            return G_SOURCE_REMOVE;
    #endif
    
        return G_SOURCE_CONTINUE;
    }
    
    int gst_mix_appsrc()
    {
        app_gst_wrapper_obj_t *p_gst_pipe_obj = &g_app_gst_wrapper_obj;
        printf("\n\n\n**************Inside mix appsrc wrapper function ************************\n\n\n");
        gst_init(NULL, NULL);
    
        GstElement *pipeline = gst_pipeline_new("video-mix-pipeline");
    
        GstElement *appsrc1 = gst_element_factory_make("appsrc", "appsrc1");
        p_gst_pipe_obj->m_srcElemArr[0] = appsrc1;
        p_gst_pipe_obj->m_pipeline = pipeline;
        // p_gst_pipe_obj->params.out_width;
    
    #if 0
        GstElement *appsrc2 = gst_element_factory_make("appsrc", "appsrc2");
        GstElement *appsrc3 = gst_element_factory_make("appsrc", "appsrc3");
    
        GstElement *appsrc4 = gst_element_factory_make("appsrc", "appsrc4");
        GstElement *appsrc5 = gst_element_factory_make("appsrc", "appsrc5");
    #endif
    
        GstElement *videoconvert1 = gst_element_factory_make("videoconvert", "vc1");
    #if 0
        GstElement *videoconvert2 = gst_element_factory_make("videoconvert", "vc2");
        GstElement *videoconvert3 = gst_element_factory_make("videoconvert", "vc3");
    
        GstElement *videoconvert4 = gst_element_factory_make("videoconvert", "vc4");
        GstElement *videoconvert5 = gst_element_factory_make("videoconvert", "vc5");
    
    #endif
        GstElement *queue1 = gst_element_factory_make("queue", "queue1");
    #if 0    
        GstElement *queue2   = gst_element_factory_make("queue", "queue2");
        GstElement *queue3   = gst_element_factory_make("queue", "queue3");
        GstElement *queue4   = gst_element_factory_make("queue", "queue4");
        GstElement *queue5   = gst_element_factory_make("queue", "queue5");
    #endif
    
        GstElement *compositor = gst_element_factory_make("compositor", "compositor");
    
        // Encoding and output
        GstElement *encoder = gst_element_factory_make("jpegenc", "jpeg-encoder");
        GstElement *sink = gst_element_factory_make("multifilesink", "file-output");
        /*
        GstElement *videoconvert_out = gst_element_factory_make("videoconvert", "vc_out");
        GstElement *encoder      = gst_element_factory_make("v4l2h264enc", "h264encoder");
        GstElement *muxer        = gst_element_factory_make("mp4mux", "muxer");
    
        GstElement *sink = gst_element_factory_make("filesink", "sink");
        */
        GstPad *pad1 = gst_element_request_pad_simple(compositor, "sink_0");
    #if 0     
        GstPad *pad2 = gst_element_request_pad_simple(compositor, "sink_1");
        GstPad *pad3 = gst_element_request_pad_simple(compositor, "sink_2");
        GstPad *pad4 = gst_element_request_pad_simple(compositor, "sink_3");
        GstPad *pad5 = gst_element_request_pad_simple(compositor, "sink_4");
    #endif
    
        g_object_set(pad1, "xpos", 0, "ypos", 0, NULL); // Top-left
    #if 0    
        g_object_set(pad2, "xpos", 320, "ypos", 0, NULL);    
        g_object_set(pad3, "xpos", 640, "ypos", 0, NULL);           // Shifted right
        g_object_set(pad4, "xpos", 240, "ypos", 0, NULL);  
        g_object_set(pad5, "xpos", 240, "ypos", 320, NULL);
    #endif
    
        // if (!pipeline || !appsrc1 || !appsrc2 ||!appsrc3 || !appsrc4 ||!appsrc5 ||  !videoconvert1 || !videoconvert2 ||!videoconvert3 || !videoconvert4 ||!videoconvert5 || !compositor || !queue1 ||!queue2 ||!queue3 ||!queue4 ||!queue5 ||  !compositor ||!encoder || !sink)
        if (!pipeline || !appsrc1 || !videoconvert1 || !queue1 || !compositor || !encoder || !sink)
        {
    
            g_printerr("Failed to create elements.\n");
    
            return -1;
        }
    
        g_object_set(G_OBJECT(sink), "location", "frame-%03d.jpg", NULL);
    
        printf("\n\n *******Setting caps parameter in wrapper function ******************\n\n\n\n");
        // Set caps for appsrc (RGB)
    
        GstCaps *caps = gst_caps_new_simple("video/x-raw",
    
                                            "format", G_TYPE_STRING, "RGB",
    
                                            "width", G_TYPE_INT, WIDTH,
    
                                            "height", G_TYPE_INT, HEIGHT,
    
                                            "framerate", GST_TYPE_FRACTION, 30, 1,
    
                                            NULL);
    
        // Add and link
        /*
    gst_bin_add_many(GST_BIN(pipeline),
    appsrc1, videoconvert1, queue1,
    appsrc2, videoconvert2, queue2,
    appsrc3, videoconvert3, queue3,
    appsrc4, videoconvert4, queue4,
    appsrc5, videoconvert5, queue5,
    compositor, encoder, sink, NULL); */
        gst_bin_add_many(GST_BIN(pipeline),
                         appsrc1, videoconvert1, queue1,
                         compositor, encoder, sink, NULL);
        /*
           if (!gst_element_link_many(appsrc1, videoconvert1, queue1, compositor, NULL) ||
               !gst_element_link_many(appsrc2, videoconvert2, queue2, compositor, NULL) ||
               !gst_element_link_many(appsrc3, videoconvert3, queue3, compositor, NULL) ||
               !gst_element_link_many(appsrc4, videoconvert4, queue4, compositor, NULL) ||
               !gst_element_link_many(appsrc5, videoconvert5, queue5, compositor, NULL) ||
               !gst_element_link_many(compositor, encoder, sink, NULL)) {
               g_printerr("Failed to link elements.\n");
               return -1;
           }
               */
        if (!gst_element_link_many(appsrc1, videoconvert1, queue1, compositor, NULL) ||
            !gst_element_link_many(compositor, encoder, sink, NULL))
        {
            g_printerr("Failed to link elements.\n");
            return -1;
        }
    
        g_object_set(G_OBJECT(appsrc1), "caps", caps, "is-live", TRUE, "format", GST_FORMAT_TIME, NULL);
    #if 0
        g_object_set(G_OBJECT(appsrc2), "caps", caps, "is-live", TRUE, "format", GST_FORMAT_TIME, NULL);
        g_object_set(G_OBJECT(appsrc3), "caps", caps, "is-live", TRUE, "format", GST_FORMAT_TIME, NULL);
        g_object_set(G_OBJECT(appsrc4), "caps", caps, "is-live", TRUE, "format", GST_FORMAT_TIME, NULL);
        g_object_set(G_OBJECT(appsrc5), "caps", caps, "is-live", TRUE, "format", GST_FORMAT_TIME, NULL);
    #endif
    
        gst_caps_unref(caps); /* safe to drop our ref; appsrc holds its own after g_object_set() */
    
        gst_element_set_state(pipeline, GST_STATE_PLAYING);
    
        // GstElement *sources[] = {appsrc1, appsrc2,appsrc3, appsrc4,appsrc5};
        /*
         GstElement *sources[] = {appsrc1};
    
         g_timeout_add(1000 / 30, (GSourceFunc)feed_data, sources);
    
         GMainLoop *loop = g_main_loop_new(NULL, FALSE);
    
         g_main_loop_run(loop);
    
         gst_element_set_state(pipeline, GST_STATE_NULL);
    
         gst_object_unref(pipeline);
    
         g_main_loop_unref(loop);
         */
    
        return 0;
    }
    
    #if 0
    int32_t appGstSrcInit(void *data_ptr[CODEC_MAX_BUFFER_DEPTH][CODEC_MAX_NUM_CHANNELS][CODEC_MAX_NUM_PLANES])
    {
        int32_t status = 0;
        GstCaps *caps = NULL;
        app_gst_wrapper_obj_t *p_gst_pipe_obj = &g_app_gst_wrapper_obj;
        uint32_t plane_size = p_gst_pipe_obj->params.in_width * p_gst_pipe_obj->params.in_height;
    
        /* Setup AppSrc Elements */
        for (uint8_t ch = 0; ch < p_gst_pipe_obj->params.in_num_channels; ch++)
        {
            p_gst_pipe_obj->m_srcElemArr[ch] = findElementByName(p_gst_pipe_obj->m_pipeline,
                                                                 p_gst_pipe_obj->params.m_AppSrcNameArr[ch]);
            if (p_gst_pipe_obj->m_srcElemArr[ch] == NULL)
            {
                printf("gst_wrapper: findElementByName() FAILED! %s not found\n", p_gst_pipe_obj->params.m_AppSrcNameArr[ch]);
                status = -1;
            }
            else
            {
                p_gst_pipe_obj->isAppSrc = GST_IS_APP_SRC(p_gst_pipe_obj->m_srcElemArr[ch]);
                if (p_gst_pipe_obj->isAppSrc)
                {
                    caps = gst_caps_new_simple("video/x-raw",
                                               "width", G_TYPE_INT, p_gst_pipe_obj->params.in_width,
                                               "height", G_TYPE_INT, p_gst_pipe_obj->params.in_height,
                                               "format", G_TYPE_STRING, p_gst_pipe_obj->params.in_format,
                                               NULL);
                    if (caps == NULL)
                    {
                        printf("gst_wrapper: gst_caps_new_simple() FAILED!\n");
                        status = -1;
                    }
                    gst_app_src_set_caps(GST_APP_SRC(p_gst_pipe_obj->m_srcElemArr[ch]), caps);
                    gst_caps_unref(caps);
                }
                else
                {
                    printf("gst_wrapper: %s not an AppSrc element\n", p_gst_pipe_obj->params.m_AppSrcNameArr[ch]);
                    status = -1;
                }
            }
        }
    
        /* Setup GstBuffers to push to AppSrc Elements using given data_ptrs */
        if (status == 0)
        {
            for (uint8_t idx = 0; idx < p_gst_pipe_obj->params.in_buffer_depth && status == 0; idx++)
            {
                for (uint8_t ch = 0; ch < p_gst_pipe_obj->params.in_num_channels && status == 0; ch++)
                {
                    p_gst_pipe_obj->buff[idx][ch] = gst_buffer_new();
    
                    p_gst_pipe_obj->mem[idx][ch][0] = gst_memory_new_wrapped(0, data_ptr[idx][ch][0], plane_size, 0, plane_size, NULL, NULL);
                    p_gst_pipe_obj->mem[idx][ch][1] = gst_memory_new_wrapped(0, data_ptr[idx][ch][1], plane_size / 2, 0, plane_size / 2, NULL, NULL);
    
                    gst_buffer_append_memory(p_gst_pipe_obj->buff[idx][ch], p_gst_pipe_obj->mem[idx][ch][0]);
                    gst_buffer_append_memory(p_gst_pipe_obj->buff[idx][ch], p_gst_pipe_obj->mem[idx][ch][1]);
    
                    p_gst_pipe_obj->mem[idx][ch][0] = gst_buffer_get_memory(p_gst_pipe_obj->buff[idx][ch], 0);
                    p_gst_pipe_obj->mem[idx][ch][1] = gst_buffer_get_memory(p_gst_pipe_obj->buff[idx][ch], 1);
    
                    gst_memory_map(p_gst_pipe_obj->mem[idx][ch][0], &p_gst_pipe_obj->map_info[idx][ch][0], GST_MAP_WRITE);
                    gst_memory_map(p_gst_pipe_obj->mem[idx][ch][1], &p_gst_pipe_obj->map_info[idx][ch][1], GST_MAP_WRITE);
                }
            }
        }
    
        if (status == 0)
            p_gst_pipe_obj->push_count = 0;
    
        return status;
    }
    #endif

  • Hi Suresh,

    Thanks for the reply. I want to combine multiple appsrc inputs with different resolutions, and I followed the gst_wrapper.c file available in RTOS SDK 11_00_00_06. In this file, the input width and height are configured for only one channel. To support a different resolution for each appsrc channel, I updated the appGstSrcInit() function in gst_wrapper.c as below, changing in_height and in_width to arrays that hold per-channel values.

    You shouldn't have to change any source code to use tiovxmosaic with different resolution inputs.

    The above command stops at the second iteration of enqueuing the data_ptr idx values, but the same command works fine and generates an output file when I configure all widths and heights as 1280 and 720 respectively.

    Can you add a GStreamer caps filter after the mosaic?
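
    For reference, a caps filter after the mosaic is just a `video/x-raw` element inserted into the pipeline string. The dimensions and format below are illustrative (they depend on the mosaic layout and on what the downstream encoder accepts), so treat this as a sketch rather than a verified pipeline:

    ```
    ... tiovxmosaic name=mix sink_0::xpos=0 sink_1::xpos=1920 ! \
        video/x-raw, width=2560, height=1080, format=NV12 ! \
        videoconvert ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=output_video.mp4
    ```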

    Also, please suggest how to check the H.264 encoder hardware load on the TDA4.

    Please open a different thread for this question.

    Best,
    Jared

  • Hi Jared McArthur,

    Thanks for the inputs. After updating the GStreamer caps for the tiovxmosaic sinks, I am able to get the output file generated as required.

    Regards,

    Suresh K

  • Hi Suresh,

    Closing thread.

    Best,
    Jared

  • Hi Jared McArthur,

    GStreamer and tiovxmosaic support different resolution inputs.

    When using the gst_wrapper functions, however, there is only one input resolution parameter shared by all channel inputs.

    In the gst_wrapper.c file, the plane size for all channels is configured as below.

    int32_t appGstSrcInit(void* data_ptr[CODEC_MAX_BUFFER_DEPTH][CODEC_MAX_NUM_CHANNELS][CODEC_MAX_NUM_PLANES])
    {
        int32_t status = 0;
        GstCaps* caps = NULL;
        app_gst_wrapper_obj_t* p_gst_pipe_obj = &g_app_gst_wrapper_obj;
        uint32_t plane_size = p_gst_pipe_obj->params.in_width * p_gst_pipe_obj->params.in_height;
    
        /* Setup AppSrc Elements */
        for (uint8_t ch = 0; ch < p_gst_pipe_obj->params.in_num_channels; ch++)
        {
            p_gst_pipe_obj->m_srcElemArr[ch]  = findElementByName(p_gst_pipe_obj->m_pipeline, 
                                                        p_gst_pipe_obj->params.m_AppSrcNameArr[ch]);
            if (p_gst_pipe_obj->m_srcElemArr[ch] == NULL)
            {
                printf("gst_wrapper: findElementByName() FAILED! %s not found\n", p_gst_pipe_obj->params.m_AppSrcNameArr[ch]);
                status = -1;
            }
            else
            {
                p_gst_pipe_obj->isAppSrc = GST_IS_APP_SRC(p_gst_pipe_obj->m_srcElemArr[ch]);
                if (p_gst_pipe_obj->isAppSrc)
                {
                    caps = gst_caps_new_simple("video/x-raw",
                                            "width", G_TYPE_INT, p_gst_pipe_obj->params.in_width,
                                            "height", G_TYPE_INT, p_gst_pipe_obj->params.in_height,
                                            "format", G_TYPE_STRING, p_gst_pipe_obj->params.in_format,
                                            NULL);
                    if (caps == NULL)
                    {
                        printf("gst_wrapper: gst_caps_new_simple() FAILED!\n");
                        status = -1;
                    }
                    gst_app_src_set_caps (GST_APP_SRC(p_gst_pipe_obj->m_srcElemArr[ch]), caps);
                    gst_caps_unref (caps);
                }
                else 
                {
                    printf("gst_wrapper: %s not an AppSrc element\n", p_gst_pipe_obj->params.m_AppSrcNameArr[ch]);
                    status = -1;
                }
            }
        }
    
        /* Setup GstBuffers to push to AppSrc Elements using given data_ptrs */
        if (status==0)
        {
            for (uint8_t idx = 0; idx < p_gst_pipe_obj->params.in_buffer_depth && status==0; idx++)
            {
                for (uint8_t ch = 0; ch < p_gst_pipe_obj->params.in_num_channels && status==0; ch++)
                {
                    p_gst_pipe_obj->buff[idx][ch] = gst_buffer_new();
    
                    p_gst_pipe_obj->mem[idx][ch][0] = gst_memory_new_wrapped (0, data_ptr[idx][ch][0], plane_size, 0, plane_size, NULL, NULL);
                    p_gst_pipe_obj->mem[idx][ch][1] = gst_memory_new_wrapped (0, data_ptr[idx][ch][1], plane_size/2, 0, plane_size/2, NULL, NULL);
    
                    gst_buffer_append_memory (p_gst_pipe_obj->buff[idx][ch], p_gst_pipe_obj->mem[idx][ch][0]);
                    gst_buffer_append_memory (p_gst_pipe_obj->buff[idx][ch], p_gst_pipe_obj->mem[idx][ch][1]);
    
                    p_gst_pipe_obj->mem[idx][ch][0] = gst_buffer_get_memory (p_gst_pipe_obj->buff[idx][ch], 0);
                    p_gst_pipe_obj->mem[idx][ch][1] = gst_buffer_get_memory (p_gst_pipe_obj->buff[idx][ch], 1);
    
                    gst_memory_map(p_gst_pipe_obj->mem[idx][ch][0],&p_gst_pipe_obj->map_info[idx][ch][0], GST_MAP_WRITE);
                    gst_memory_map(p_gst_pipe_obj->mem[idx][ch][1],&p_gst_pipe_obj->map_info[idx][ch][1], GST_MAP_WRITE);
                }
            }
        }
    
        if (status==0) p_gst_pipe_obj->push_count = 0;
    
        return status;
    }
    
    

    As the above function supports only a single input resolution across all appsrc channels, please clarify how to proceed when the input resolution differs for each appsrc channel.

    I want to combine appsrc inputs having different resolutions using the gst_wrapper functions.
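
    A minimal sketch of the per-channel change being asked about: instead of deriving one global `plane_size` from `params.in_width`/`params.in_height`, hypothetical `in_width_arr[]`/`in_height_arr[]` arrays (illustrative names, not part of the SDK's params struct) let each channel compute its own NV12 plane sizes before the buffers are wrapped:

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* NV12 layout: plane 0 = Y (w*h bytes), plane 1 = interleaved UV (w*h/2 bytes). */
    static uint32_t nv12_plane_size(uint32_t w, uint32_t h, int plane)
    {
        uint32_t luma = w * h;
        return (plane == 0) ? luma : luma / 2;
    }

    #define NUM_CH 3

    int main(void)
    {
        /* Hypothetical per-channel resolution arrays replacing the single
         * in_width/in_height fields (names are illustrative only). */
        const uint32_t in_width_arr[NUM_CH]  = {1920, 1280, 640};
        const uint32_t in_height_arr[NUM_CH] = {1080,  720, 640};

        for (int ch = 0; ch < NUM_CH; ch++)
        {
            /* Each channel now gets its own plane sizes instead of one
             * global plane_size shared across all channels. */
            uint32_t p0 = nv12_plane_size(in_width_arr[ch], in_height_arr[ch], 0);
            uint32_t p1 = nv12_plane_size(in_width_arr[ch], in_height_arr[ch], 1);
            printf("ch %d: plane0=%u plane1=%u\n", ch, p0, p1);
        }
        return 0;
    }
    ```

    In appGstSrcInit() the per-channel sizes would then feed gst_memory_new_wrapped() for that channel, and the per-channel width/height would go into gst_caps_new_simple() for the matching appsrc. A size mismatch here is one plausible explanation for the enqueue stalling at the second iteration when the channels are not all the same resolution.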

  • Hi Suresh,

    This question has moved away from the original topic of the thread. Please open a new thread.

    Best,
    Jared