TDA4AH-Q1: H264 Encoder for Mosaic output - some frames are flickering

Part Number: TDA4AH-Q1

We are using the TDA4AH with the RTOS SDK with Linux (j784s4-evm-09_02_00_05) for software development. We need an H264-encoded video output of a mosaic that combines five videos. The H264-encoded video flickers because a few frames are corrupted.

Configuration: the frame rate is 30 fps and we capture 500 continuous frames. When all frames are saved as YUV images there is no corruption, but when they are saved as an H264-encoded video file, some frames are corrupted and the video flickers. Please refer to the attached video.

We have followed the example code available in "apps/vision_apps/multicam-codec" in the RTOS SDK. Do we need any specific settings for the mosaic output to be encoded with H264? Please suggest.

  • Adding output video for reference

  • Hi Suresh, 

    Will take a look at this when bandwidth frees up.

    Thank you,
    Sarabesh S.

  • Hi Suresh,

    Could you please provide me with any patches you've made to the kernel or the application?

    Thanks,
    Sarabesh S.

  • Hi Sarabesh, 

    I'll be replying on this thread from now on.

    Let me explain the current scenario.

    I have a TIOVX graph with only one node, the mosaic node. The mosaic node has 5 inputs (vx_object_arrays). For now, I have replaced these input arrays with static arrays holding hardcoded pixel values (pitch-black images). I have made the mosaic output a graph parameter and map it to the encoder input buffer. Attaching code snippets and other details below:

    mosaic node input: 

     vx_image sample_img = vxCreateImage(h264_graph_obj_ptr->context, h264_graph_obj_ptr->enc_pool.width,
                                            h264_graph_obj_ptr->enc_pool.height, VX_DF_IMAGE_NV12);
        status = vxGetStatus((vx_reference)sample_img);
        for (int idx2 = 0; idx2 < num_mosaic_inputs; idx2++)
        {
            for (int idx = 0; idx < h264_graph_obj_ptr->enc_pool.bufq_depth; idx++)
            {
                h264_graph_obj_ptr->test_arr[idx] = vxCreateObjectArray(h264_graph_obj_ptr->context, (vx_reference)sample_img,
                                                                              h264_graph_obj_ptr->num_ch);
                test_img = (vx_image)vxGetObjectArrayItem(h264_graph_obj_ptr->test_arr[idx], 0);
                for (plane_id = 0; plane_id < 2; plane_id++)
                {
                    status = vxMapImagePatch(test_img,
                                             &img_rect,
                                             plane_id,
                                             &map_id,
                                             &nv12_img_addr,
                                             &nv12_img_data_ptr,
                                             VX_READ_ONLY,
                                             VX_MEMORY_TYPE_HOST,
                                             VX_NOGAP_X);
    
                    if (status != VX_SUCCESS)
                    {
                        printf("Error: Map Image patch failed \n");
                        return status;
                    }
                    else
                    {
                        /* set black level as background color of display channels */
                        for (uint32_t y = 0; y < MOSAIC_OUT_HEIGHT; y++)
                        {
                            for (uint32_t x = 0; x < MOSAIC_OUT_WIDTH; x++)
                            {
                                pixel_val_u8 = vxFormatImagePatchAddress2d(nv12_img_data_ptr, x, y, &nv12_img_addr);
                                if (plane_id == 0) // y
                                {
                                    /*create a testing checker image*/
                                    pixel_val_u8[0] = 10; // black value
                                    pixel_val_u8[1] = 10;
                                }
                                else // uv
                                {
                                    /*create a testing checker image*/
                                    pixel_val_u8[0] = 128;
                                    pixel_val_u8[1] = 128;
                                }
                            }
                        }
                        status = vxUnmapImagePatch(test_img, map_id);
                        if (status != VX_SUCCESS)
                        {
                            printf("Error: Unmap Image patch failed \n");
                            return status;
                        }
                    }
                }
                h264_graph_obj_ptr->h264_imgMosaicObj.input_arr[idx] = h264_graph_obj_ptr->test_arr[idx];
            }
        }
        vxReleaseImage(&sample_img);

    graph parameter config:

    idx = 0;
        status = add_graph_parameter_by_node_index(h264_graph_obj_ptr->graph, h264_graph_obj_ptr->h264_imgMosaicObj.node, 1); // lm: idx 0 -> 1 using output image not config
        /* Set graph schedule config such that graph parameter @index is enqueuable */
        h264_graph_obj_ptr->h264_imgMosaicObj.graph_parameter_index = idx;
        graph_parameters_queue_params_list[idx].graph_parameter_index = idx;
        graph_parameters_queue_params_list[idx].refs_list_size = h264_graph_obj_ptr->enc_pool.bufq_depth; // lm: changed from GRAPH_PRM_REF_LIST_SIZE(=4) to enc_pool.bufq_depth(=6)
        graph_parameters_queue_params_list[idx].refs_list = (vx_reference *)&h264_graph_obj_ptr->h264_imgMosaicObj.output_image[0];
        idx++;

    codec init & start:

            sprintf(NewFileName, "output_video_%d.mp4", outer_loop);
            set_codec_pipe_params(h264_graph_obj_ptr, NewFileName);
            printf("\n\nCommand string is ******%s ******\n\n", h264_graph_obj_ptr->codec_pipe_params.m_cmdString);
            status = appCodecInit(&h264_graph_obj_ptr->codec_pipe_params);
            if (status == VX_SUCCESS)
            {
                printf("Codec initialization is success!! ");
            }
            else
            {
                printf("Codec initialization is failed!\n %d", status);
            }
            for (vx_int8 buf_id = 0; buf_id < h264_graph_obj_ptr->enc_pool.bufq_depth; buf_id++)
            {
                if (VX_SUCCESS == status)
                {
                    status = map_vx_object_arr(h264_graph_obj_ptr->enc_pool.arr[buf_id], h264_graph_obj_ptr->enc_pool.data_ptr[buf_id],
                                               h264_graph_obj_ptr->enc_pool.map_id[buf_id], h264_graph_obj_ptr->enc_pool.num_channels);
    #ifdef TEST_ENC_STATIC_MOSAIC_OUTPUT
                    status = map_vx_image(test_img[buf_id], h264_graph_obj_ptr->enc_pool.data_ptr[buf_id],
                                          h264_graph_obj_ptr->enc_pool.map_id[buf_id]);
    #endif
                    {
                        if (status == VX_SUCCESS)
                        {
                            printf("mapVX Object array is success! \t %d\n", __LINE__);
                        }
                        else
                        {
                            printf("mapVX Object array is FAILED! \t %d\n", __LINE__);
                        }
                    }
                }
            }
            if (VX_SUCCESS == status)
            {
                status = appCodecSrcInit(h264_graph_obj_ptr->enc_pool.data_ptr);
                if (VX_SUCCESS == status)
                {
                    printf("\nappCodecSrcInit Done!\n");
                }
                else
                {
                    printf("\nappCodecSrcInit Failed!\n");
                }
            }
            for (vx_int8 buf_id = 0; buf_id < h264_graph_obj_ptr->enc_pool.bufq_depth; buf_id++)
            {
                if (VX_SUCCESS == status)
                {
                    status = unmap_vx_object_arr((vx_object_array)h264_graph_obj_ptr->enc_pool.arr[buf_id], h264_graph_obj_ptr->enc_pool.map_id[buf_id], h264_graph_obj_ptr->enc_pool.num_channels);
    #ifdef TEST_ENC_STATIC_MOSAIC_OUTPUT
                    status = unmap_vx_image(test_img[buf_id], h264_graph_obj_ptr->enc_pool.map_id[buf_id]);
    #endif
                    if (status == VX_SUCCESS)
                    {
                        printf("UnmapVX Object array is success! \t %d\n", __LINE__);
                    }
                    else
                    {
                        printf("UnmapVX Object array is FAILED! \t %d\n", __LINE__);
                    }
                }
            }
            if (status == VX_SUCCESS)
            {
                if (h264_graph_obj_ptr->encode)
                {
                    status = appCodecStart();
                    printf("appCodecStart Done!\n");
                }
            }

    linking mosaic output and encoder input:

     /*Linking mosaic output images with encoder pool array references*/
        for (int buf_id = 0; buf_id < h264_graph_obj_ptr->enc_pool.bufq_depth; buf_id++)
        {
            h264_graph_obj_ptr->h264_imgMosaicObj.output_image[buf_id] = (vx_image)vxGetObjectArrayItem(h264_graph_obj_ptr->enc_pool.arr[buf_id], 0);
        }

    frame loop enq deq:

    for (i = 0; i < 500; i++)
            {
                if (i >= h264_graph_obj_ptr->enc_pool.bufq_depth)
                {
    #if 1
                    appPerfPointBegin(&h264_graph_obj_ptr->graph_perf);
                    start_time = tivxPlatformGetTimeInUsecs();
                    /* Dequeue one image buffer */
                    if (status != VX_SUCCESS)
                    {
                        GEN3_DBG_PRINTF_ERR("ERROR: graph dequeue failed for capt cl2 cl4 out arr[%d], status = %d\n", buf_id, status);
                    }
    #endif
                    appPerfPointEnd(&h264_graph_obj_ptr->graph_perf);
                    if (i % 50 == 0)
                    {
                        appPerfPointPrintFPS(&h264_graph_obj_ptr->graph_perf);
                        appPerfPointReset(&h264_graph_obj_ptr->graph_perf);
                        printf("mosaic time taken %lu\n", tivxPlatformGetTimeInUsecs() - start_time);
                    }
                    if (i >= h264_graph_obj_ptr->enc_pool.bufq_depth)
                    {
                        status = appCodecDeqAppSrc(h264_graph_obj_ptr->mosaic_enq_id);
                        if (status != VX_SUCCESS)
                        {
                            printf("App Codec DEqueue App src is  FAILED!\n");
                        }
                        else
                        {
                            status = unmap_vx_object_arr((vx_object_array)h264_graph_obj_ptr->enc_pool.arr[h264_graph_obj_ptr->mosaic_enq_id],
                                                         h264_graph_obj_ptr->enc_pool.map_id[h264_graph_obj_ptr->mosaic_enq_id], h264_graph_obj_ptr->num_ch);
    #ifdef TEST_ENC_STATIC_MOSAIC_OUTPUT
                            status = unmap_vx_image(test_img[h264_graph_obj_ptr->mosaic_enq_id],
                                                    h264_graph_obj_ptr->enc_pool.map_id[h264_graph_obj_ptr->mosaic_enq_id]);
    #endif
                        }
                    }
                }
    #if 1
                /*output*/
                status = vxGraphParameterEnqueueReadyRef(h264_graph_obj_ptr->graph,
                                                         5,
                                                         (vx_reference *)&h264_graph_obj_ptr->h264_imgMosaicObj.output_image[h264_graph_obj_ptr->mosaic_enq_id], // lm: changed [buf_id] -> [h264_graph_obj_ptr->mosaic_enq_id]
                                                         1);
                if (status != VX_SUCCESS)
                {
                    GEN3_DBG_PRINTF_ERR("ERROR: graph enqueue failed for capt cl2 cl4 out arr[%d], status = %d\n", buf_id, status);
                }
                {
                    status = map_vx_object_arr((vx_object_array)h264_graph_obj_ptr->enc_pool.arr[h264_graph_obj_ptr->appsrc_push_id],
                                               h264_graph_obj_ptr->enc_pool.data_ptr[h264_graph_obj_ptr->appsrc_push_id],
                                               h264_graph_obj_ptr->enc_pool.map_id[h264_graph_obj_ptr->appsrc_push_id], h264_graph_obj_ptr->num_ch);
    #ifdef TEST_ENC_STATIC_MOSAIC_OUTPUT
                    status = map_vx_image(test_img[h264_graph_obj_ptr->appsrc_push_id],
                                          h264_graph_obj_ptr->enc_pool.data_ptr[h264_graph_obj_ptr->appsrc_push_id],
                                          h264_graph_obj_ptr->enc_pool.map_id[h264_graph_obj_ptr->appsrc_push_id]);
    #endif
                }
                if (status == VX_SUCCESS)
                {
                    status = appCodecEnqAppSrc(h264_graph_obj_ptr->mosaic_enq_id);
                    {
                        if (status != VX_SUCCESS)
                        {
                            printf("pp Codec Enqueue App src is  FAILED!\n");
                        }
                    }
                }
                h264_graph_obj_ptr->appsrc_push_id++;
                h264_graph_obj_ptr->appsrc_push_id = (h264_graph_obj_ptr->appsrc_push_id >= h264_graph_obj_ptr->enc_pool.bufq_depth) ? 0 : h264_graph_obj_ptr->appsrc_push_id;
                h264_graph_obj_ptr->mosaic_enq_id++;
                h264_graph_obj_ptr->mosaic_enq_id = (h264_graph_obj_ptr->mosaic_enq_id >= h264_graph_obj_ptr->enc_pool.bufq_depth) ? 0 : h264_graph_obj_ptr->mosaic_enq_id;
    
            } // End of for loop
            first_file = 0;
            for (i = 0; i < h264_graph_obj_ptr->enc_pool.bufq_depth; i++)
            {
    #if 1
                status = vxGraphParameterDequeueDoneRef(h264_graph_obj_ptr->graph,
                                                        h264_graph_obj_ptr->h264_imgMosaicObj.graph_parameter_index,
                                                        (vx_reference *)&image_temp, 1, &num_refs);
                if (status != VX_SUCCESS)
                {
                    GEN3_DBG_PRINTF_ERR("ERROR: graph dequeue failed for capt cl2 cl4 out arr[%d], status = %d\n", buf_id, status);
                }
                value++;
    #endif
                status = appCodecDeqAppSrc(h264_graph_obj_ptr->mosaic_enq_id);
                unmap_vx_object_arr((vx_object_array)h264_graph_obj_ptr->enc_pool.arr[h264_graph_obj_ptr->mosaic_enq_id], h264_graph_obj_ptr->enc_pool.map_id[h264_graph_obj_ptr->mosaic_enq_id], h264_graph_obj_ptr->num_ch);
    #ifdef TEST_ENC_STATIC_MOSAIC_OUTPUT
                unmap_vx_image(test_img[h264_graph_obj_ptr->mosaic_enq_id], h264_graph_obj_ptr->enc_pool.map_id[h264_graph_obj_ptr->mosaic_enq_id]);
    #endif
                h264_graph_obj_ptr->mosaic_enq_id++;
                h264_graph_obj_ptr->mosaic_enq_id = ((h264_graph_obj_ptr->mosaic_enq_id >= h264_graph_obj_ptr->enc_pool.bufq_depth) ? 0 : h264_graph_obj_ptr->mosaic_enq_id);
            }
            if (h264_graph_obj_ptr->encode == 1)
            {
                printf("Pushing EoS to Codec Pipeline.\n");
                status = appCodecEnqEosAppSrc();
            }
    
            if (h264_graph_obj_ptr->encode == 1)
            {
                appCodecStop();
                printf("appCodecStop Done!\n");
            }

    GStreamer command string:

    ############## Codec srcType =0 sinkType =1
    
    appsrc format=GST_FORMAT_TIME is-live=true do-timestamp=true block=true name=myAppSrc ! queue
    ! video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)NV12, interlace-mode=(string)progressive, colorimetry=(string)bt601
    ! v4l2h264enc extra-controls="controls, frame_level_rate_control_enable=1, video_bitrate=10000000"
    ! h264parse
    ! mp4mux
    ! filesink location=output_video_0.mp4

     

    Mosaic five window config for inputs:

     vx_int32 idx = 0;
    
       imgMosaicObj->out_width = MOSAIC_OUT_WIDTH;
       imgMosaicObj->out_height = MOSAIC_OUT_HEIGHT;
    
       tivxImgMosaicParamsSetDefaults(&imgMosaicObj->params);
    
       /* cl4 window1 setup */
       imgMosaicObj->params.windows[idx].startX = 0;
       imgMosaicObj->params.windows[idx].startY = 0;
       imgMosaicObj->params.windows[idx].width = imgMosaicObj->out_width/3;
       imgMosaicObj->params.windows[idx].height =  imgMosaicObj->out_height/2;
       imgMosaicObj->params.windows[idx].input_select = 0;
       imgMosaicObj->params.windows[idx].channel_select = 0;
       idx++;
    #if 1
       /* cl2 window2 setup */
       imgMosaicObj->params.windows[idx].startX =imgMosaicObj->out_width /3;
       imgMosaicObj->params.windows[idx].startY = 0;
       imgMosaicObj->params.windows[idx].width = imgMosaicObj->out_width /3;
       imgMosaicObj->params.windows[idx].height =imgMosaicObj->out_height/2;
       imgMosaicObj->params.windows[idx].input_select = 1;
       imgMosaicObj->params.windows[idx].channel_select = 0;
       idx++;
       /* cl5 window3 setup */
       imgMosaicObj->params.windows[idx].startX = (imgMosaicObj->out_width /3)*2 ;
       imgMosaicObj->params.windows[idx].startY = 0;                                                    
       imgMosaicObj->params.windows[idx].width = imgMosaicObj->out_width /3;
       imgMosaicObj->params.windows[idx].height = imgMosaicObj->out_height/2;
       imgMosaicObj->params.windows[idx].input_select = 2;
       imgMosaicObj->params.windows[idx].channel_select = 0;
       idx++;
       
       /* cl5 window4 setup */
       imgMosaicObj->params.windows[idx].startX = 0 ;
       imgMosaicObj->params.windows[idx].startY = imgMosaicObj->out_height/2;                                                    
       imgMosaicObj->params.windows[idx].width = imgMosaicObj->out_width/3;
       imgMosaicObj->params.windows[idx].height = imgMosaicObj->out_height/2; 
       imgMosaicObj->params.windows[idx].input_select = 3;
       imgMosaicObj->params.windows[idx].channel_select = 0;
       idx++;
    
        /* cl6 window5 setup */
        imgMosaicObj->params.windows[idx].startX = (imgMosaicObj->out_width /3);
        imgMosaicObj->params.windows[idx].startY =imgMosaicObj->out_height/2;                                             
        imgMosaicObj->params.windows[idx].width = imgMosaicObj->out_width/3;
        imgMosaicObj->params.windows[idx].height = imgMosaicObj->out_height/2;
        imgMosaicObj->params.windows[idx].input_select = 4 ;
        imgMosaicObj->params.windows[idx].channel_select = 0;
        idx++;
    #endif
       imgMosaicObj->params.num_windows = idx;
       imgMosaicObj->num_inputs = idx;
       /* Number of time to clear the output buffer before it gets reused */
       imgMosaicObj->params.clear_count = 6;
       imgMosaicObj->params.num_msc_instances = 1;
    
       /* Use MSC1 instance as MSC0 is used for scaler */
       imgMosaicObj->params.msc_instance = 1;

    Output Video:

    The video should be completely black, but some green frames appear in between. I am not sure where the issue is, kindly suggest.

    Thanks and regards,

    Lalit

  • Attaching the correct frame loop enq/deq snippet; the previous one is missing the mosaic output dequeue call:

    for (i = 0; i < 500; i++)
            {
                if (i >= h264_graph_obj_ptr->enc_pool.bufq_depth)
                {
    #if 1
                    appPerfPointBegin(&h264_graph_obj_ptr->graph_perf);
                    start_time = tivxPlatformGetTimeInUsecs();
                    /* Dequeue one image buffer */
                    if (status != VX_SUCCESS)
                    {
                        GEN3_DBG_PRINTF_ERR("ERROR: graph dequeue failed for capt cl2 cl4 out arr[%d], status = %d\n", buf_id, status);
                    }
    #endif
                    status = vxGraphParameterDequeueDoneRef(h264_graph_obj_ptr->graph,
                                                        h264_graph_obj_ptr->h264_imgMosaicObj.graph_parameter_index,
                                                        (vx_reference *)&image_temp, 1, &num_refs);
                    appPerfPointEnd(&h264_graph_obj_ptr->graph_perf);
                    if (i % 50 == 0)
                    {
                        appPerfPointPrintFPS(&h264_graph_obj_ptr->graph_perf);
                        appPerfPointReset(&h264_graph_obj_ptr->graph_perf);
                        printf("mosaic time taken %lu\n", tivxPlatformGetTimeInUsecs() - start_time);
                    }
                    if (i >= h264_graph_obj_ptr->enc_pool.bufq_depth)
                    {
                        status = appCodecDeqAppSrc(h264_graph_obj_ptr->mosaic_enq_id);
                        if (status != VX_SUCCESS)
                        {
                            printf("App Codec DEqueue App src is  FAILED!\n");
                        }
                        else
                        {
                            status = unmap_vx_object_arr((vx_object_array)h264_graph_obj_ptr->enc_pool.arr[h264_graph_obj_ptr->mosaic_enq_id],
                                                         h264_graph_obj_ptr->enc_pool.map_id[h264_graph_obj_ptr->mosaic_enq_id], h264_graph_obj_ptr->num_ch);
    #ifdef TEST_ENC_STATIC_MOSAIC_OUTPUT
                            status = unmap_vx_image(test_img[h264_graph_obj_ptr->mosaic_enq_id],
                                                    h264_graph_obj_ptr->enc_pool.map_id[h264_graph_obj_ptr->mosaic_enq_id]);
    #endif
                        }
                    }
                }
    #if 1
                /*output*/
                status = vxGraphParameterEnqueueReadyRef(h264_graph_obj_ptr->graph,
                                                         5,
                                                         (vx_reference *)&h264_graph_obj_ptr->h264_imgMosaicObj.output_image[h264_graph_obj_ptr->mosaic_enq_id], // lm: changed [buf_id] -> [h264_graph_obj_ptr->mosaic_enq_id]
                                                         1);
                if (status != VX_SUCCESS)
                {
                    GEN3_DBG_PRINTF_ERR("ERROR: graph enqueue failed for capt cl2 cl4 out arr[%d], status = %d\n", buf_id, status);
                }
                {
                    status = map_vx_object_arr((vx_object_array)h264_graph_obj_ptr->enc_pool.arr[h264_graph_obj_ptr->appsrc_push_id],
                                               h264_graph_obj_ptr->enc_pool.data_ptr[h264_graph_obj_ptr->appsrc_push_id],
                                               h264_graph_obj_ptr->enc_pool.map_id[h264_graph_obj_ptr->appsrc_push_id], h264_graph_obj_ptr->num_ch);
    #ifdef TEST_ENC_STATIC_MOSAIC_OUTPUT
                    status = map_vx_image(test_img[h264_graph_obj_ptr->appsrc_push_id],
                                          h264_graph_obj_ptr->enc_pool.data_ptr[h264_graph_obj_ptr->appsrc_push_id],
                                          h264_graph_obj_ptr->enc_pool.map_id[h264_graph_obj_ptr->appsrc_push_id]);
    #endif
                }
                if (status == VX_SUCCESS)
                {
                    status = appCodecEnqAppSrc(h264_graph_obj_ptr->mosaic_enq_id);
                    {
                        if (status != VX_SUCCESS)
                        {
                            printf("pp Codec Enqueue App src is  FAILED!\n");
                        }
                    }
                }
                h264_graph_obj_ptr->appsrc_push_id++;
                h264_graph_obj_ptr->appsrc_push_id = (h264_graph_obj_ptr->appsrc_push_id >= h264_graph_obj_ptr->enc_pool.bufq_depth) ? 0 : h264_graph_obj_ptr->appsrc_push_id;
                h264_graph_obj_ptr->mosaic_enq_id++;
                h264_graph_obj_ptr->mosaic_enq_id = (h264_graph_obj_ptr->mosaic_enq_id >= h264_graph_obj_ptr->enc_pool.bufq_depth) ? 0 : h264_graph_obj_ptr->mosaic_enq_id;
    
            } // End of for loop
            first_file = 0;
            for (i = 0; i < h264_graph_obj_ptr->enc_pool.bufq_depth; i++)
            {
    #if 1
                status = vxGraphParameterDequeueDoneRef(h264_graph_obj_ptr->graph,
                                                        h264_graph_obj_ptr->h264_imgMosaicObj.graph_parameter_index,
                                                        (vx_reference *)&image_temp, 1, &num_refs);
                if (status != VX_SUCCESS)
                {
                    GEN3_DBG_PRINTF_ERR("ERROR: graph dequeue failed for capt cl2 cl4 out arr[%d], status = %d\n", buf_id, status);
                }
                value++;
    #endif
                status = appCodecDeqAppSrc(h264_graph_obj_ptr->mosaic_enq_id);
                unmap_vx_object_arr((vx_object_array)h264_graph_obj_ptr->enc_pool.arr[h264_graph_obj_ptr->mosaic_enq_id], h264_graph_obj_ptr->enc_pool.map_id[h264_graph_obj_ptr->mosaic_enq_id], h264_graph_obj_ptr->num_ch);
    #ifdef TEST_ENC_STATIC_MOSAIC_OUTPUT
                unmap_vx_image(test_img[h264_graph_obj_ptr->mosaic_enq_id], h264_graph_obj_ptr->enc_pool.map_id[h264_graph_obj_ptr->mosaic_enq_id]);
    #endif
                h264_graph_obj_ptr->mosaic_enq_id++;
                h264_graph_obj_ptr->mosaic_enq_id = ((h264_graph_obj_ptr->mosaic_enq_id >= h264_graph_obj_ptr->enc_pool.bufq_depth) ? 0 : h264_graph_obj_ptr->mosaic_enq_id);
            }
            if (h264_graph_obj_ptr->encode == 1)
            {
                printf("Pushing EoS to Codec Pipeline.\n");
                status = appCodecEnqEosAppSrc();
            }
    
            if (h264_graph_obj_ptr->encode == 1)
            {
                appCodecStop();
                printf("appCodecStop Done!\n");
            }
            i = 0;
            outer_loop++;
            for (buf_id = 0; buf_id < h264_graph_obj_ptr->enc_pool.bufq_depth; buf_id++)
            {
    #if defined(GEN3_APP_DEBUG_ENABLE)
                each_frame_processing_time = tivxPlatformGetTimeInUsecs();
    #endif
            } // for

    thanks,

    Lalit

  • Hello Lalit, 

    I am out of office until the 23rd, but will take a look at this as I become available. I have looped in our Vision Apps expert to provide some insight on whether there are any particular settings you should have for the mosaic output.

    Is the default multi-cam-codec demo working with the mosaic output (w/o your changes)?

    Thank you,
    Sarabesh S.

  • Hi Sarabesh,

    I did not try the default multi-cam demo with the mosaic, but I did a small experiment to test the encoder in my app. Instead of passing the mosaic output images, I created some static images and mapped them directly to the encoder input, removing the mosaic node completely. This time the output video had no garbage frames. Upon further debugging I found that there may be an issue with my map_vx_object_arr() function.

    Attaching map function definition below:

    static vx_status map_vx_object_arr(vx_object_array in_arr, void *data_ptr[CODEC_MAX_NUM_CHANNELS][CODEC_MAX_NUM_PLANES], vx_map_id map_id[CODEC_MAX_NUM_CHANNELS][CODEC_MAX_NUM_PLANES], vx_int32 num_channels)
    {
        vx_status status;
        vx_int32 ch;
    
        status = vxGetStatus((vx_reference)in_arr);
    
        for (ch = 0; status == VX_SUCCESS && ch < num_channels; ch++)
        {
            vx_rectangle_t rect;
            vx_imagepatch_addressing_t image_addr;
    
            vx_uint32 img_width;
            vx_uint32 img_height;
    
            vx_image in_img = (vx_image)vxGetObjectArrayItem(in_arr, ch);
    
            vxQueryImage(in_img, VX_IMAGE_WIDTH, &img_width, sizeof(vx_uint32));
            vxQueryImage(in_img, VX_IMAGE_HEIGHT, &img_height, sizeof(vx_uint32));
    
            rect.start_x = 0;
            rect.start_y = 0;
            rect.end_x = img_width;
            rect.end_y = img_height;
            // printf("(mapping arr)img_width: %d, img_height: %d\n", img_width, img_height);
            /* MAP Luma */
            status = vxMapImagePatch(in_img,
                                     &rect,
                                     0,
                                     &map_id[ch][0],
                                     &image_addr,
                                     &data_ptr[ch][0],
                                     VX_READ_ONLY,
                                     VX_MEMORY_TYPE_HOST,
                                     VX_NOGAP_X);
            if (status != VX_SUCCESS)
            {
                printf("map_obj_arr(): vxMap unsuccessful");
                return (status);
            }
            // printf("img_addr.h: %d, img_addr.w: %d, img_addr.strd_y:%d\n", image_addr.dim_x, image_addr.dim_y, image_addr.stride_y);
            // static int file_count = 1;
            // char name[100];
            // snprintf(name, 100, "/opt/vision_apps/test_data/mosaic_out_file_%d.yuv", file_count++);
            // FILE *fp = fopen(name, "wb");
            // if (!fp)
            // {
            //     printf("Error: Unable to open file %s\n", name);
            //     return -1;
            // }
            // uint8_t *y_data_ptr = data_ptr[ch][0];
            // for (int i = 0; i < img_height; i++)
            // {
            //     fwrite(y_data_ptr, 1, image_addr.stride_y, fp);
            //     y_data_ptr += image_addr.stride_y;
            // }
            // fflush(fp);
            // printf("dumped plane 1\n");
            /* Map CbCr */
            status = vxMapImagePatch(in_img,
                                     &rect,
                                     1,
                                     &map_id[ch][1],
                                     &image_addr,
                                     &data_ptr[ch][1],
                                     VX_READ_ONLY,
                                     VX_MEMORY_TYPE_HOST,
                                     VX_NOGAP_X);
            if (status != VX_SUCCESS)
            {
                printf("copy_image(): vxMap unsuccessful");
                return (status);
            }
            // uint8_t *uv_data_ptr = data_ptr[ch][1];
            // for (int i = 0; i < img_height/2; i++)
            // {
            //     fwrite(uv_data_ptr, 1, img_width, fp);
            //     uv_data_ptr += image_addr.stride_y;
            // }
            // fflush(fp);
            // printf("dumped plane 2\n");
            // fclose(fp);
            // vxReleaseReference((vx_reference *)&in_img);
        }
        return (status);
    }

    After mapping, I am trying to save the images using data_ptr (see the commented code) for both planes. The saved images are not proper: they have a lot of garbage pixels in between. I found that two images are correct, then the next four are not, then again two are correct, and so on. I suspect it is related to the buffers, but I am not able to debug it further.
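
    For reference, a rough sketch of a consistent NV12 dump after both planes are mapped (illustrative only: y_addr/uv_addr stand for the vx_imagepatch_addressing_t returned by vxMapImagePatch for plane 0 and plane 1 respectively, and the file name is just an example):

    /* Illustrative NV12 dump: write only the visible img_width bytes per row
     * and step each plane by its own stride. */
    FILE *fp = fopen("/opt/vision_apps/test_data/mosaic_out.yuv", "wb");
    if (fp != NULL)
    {
        uint8_t *y_ptr  = (uint8_t *)data_ptr[ch][0];
        uint8_t *uv_ptr = (uint8_t *)data_ptr[ch][1];

        /* Y plane: img_height rows of img_width bytes (line padding skipped) */
        for (uint32_t row = 0; row < img_height; row++)
        {
            fwrite(y_ptr, 1, img_width, fp);
            y_ptr += y_addr.stride_y;
        }
        /* UV plane (interleaved CbCr): img_height/2 rows of img_width bytes */
        for (uint32_t row = 0; row < img_height / 2u; row++)
        {
            fwrite(uv_ptr, 1, img_width, fp);
            uv_ptr += uv_addr.stride_y;
        }
        fclose(fp);
    }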

    Whenever you are available, please have a look at this. Let me know what you need from my side.

    Thanks and regards,

    Lalit

  • Hi Lalit,

    For configuring the mosaic windows, can you refer to set_img_mosaic_params in the multi_cam demo, where in_width and in_height are calculated using appIssGetResizeParams() so that the values are aligned to 8.
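
    As a rough illustration of the alignment idea only (ALIGN_DOWN below is just an example helper, not from the demo; the demo itself derives these sizes through appIssGetResizeParams()):

    /* Illustrative only: force each window dimension down to a multiple of 8 */
    #define ALIGN_DOWN(x, a)   ((x) & ~((a) - 1u))

    /* Example for a 1920x1080 mosaic split into a 3x2 grid */
    vx_uint32 win_width  = ALIGN_DOWN(imgMosaicObj->out_width  / 3, 8); /* 1920/3 = 640, already a multiple of 8 */
    vx_uint32 win_height = ALIGN_DOWN(imgMosaicObj->out_height / 2, 8); /* 1080/2 = 540, rounded down to 536      */

    imgMosaicObj->params.windows[idx].width  = win_width;
    imgMosaicObj->params.windows[idx].height = win_height;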

    Regards,
    Gokul

  • Hi Gokul,

    I tried this, but it is still not working.

    regards,

    Lalit

  • Hi Lalit,

    Can you connect the output of the mosaic to the display node instead of the encoder and see if the issue persists?

    Regards,
    Gokul

  • Hi Gokul, I can try this, but FYI: I saved the data of all 500 dequeued mosaic output buffers, and all of the saved images are correct.

    Is testing with the display meaningful in this case?

    regards,

    Lalit

  • Hi Lalit,

    It is not required in that case, since the mosaic's output is correct. I suspect the enqueue/dequeue handling of the buffers: the mosaic and the codec may be using the same buffer at the same time and corrupting it somewhere. You can also check the multi_cam codec demo as a reference for the enqueue/dequeue part; a rough sketch of the ordering is below.
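
    As a sketch of the ordering to aim for (illustrative only, reusing the call names from your snippets; done_ref, pool_ref, push_id, release_id, num_frames, bufq_depth and mosaic_graph_parameter_index are placeholders, and this is not the exact multi_cam codec demo code):

    /* Each buffer index should be owned by exactly one side (graph or codec) at a time.
     * The first bufq_depth buffers are assumed to have already been enqueued to the
     * graph (pipe-up) before this steady-state loop starts. */
    for (i = 0; i < num_frames; i++)
    {
        /* 1. Wait until the mosaic has finished writing into a buffer */
        status = vxGraphParameterDequeueDoneRef(graph, mosaic_graph_parameter_index,
                                                (vx_reference *)&done_ref, 1, &num_refs);

        /* 2. Only now push that buffer (tracked here by push_id) to the encoder */
        status = appCodecEnqAppSrc(push_id);

        /* 3. Wait for the encoder to release a previously pushed buffer ... */
        status = appCodecDeqAppSrc(release_id);

        /* 4. ... and only then hand that buffer back to the mosaic */
        status = vxGraphParameterEnqueueReadyRef(graph, mosaic_graph_parameter_index,
                                                 (vx_reference *)&pool_ref[release_id], 1);

        push_id    = (push_id + 1)    % bufq_depth;
        release_id = (release_id + 1) % bufq_depth;
    }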

    Regards,
    Gokul

  • Hi,

    I have debugged the issue and am putting the changes here.

    I changed the refs list from the mosaic output images to the enc pool arrays.

    Before:

    idx = 0;
        status = add_graph_parameter_by_node_index(h264_graph_obj_ptr->graph, h264_graph_obj_ptr->h264_imgMosaicObj.node, 1); // lm: idx 0 -> 1 using output image not config
        /* Set graph schedule config such that graph parameter @index is enqueuable */
        h264_graph_obj_ptr->h264_imgMosaicObj.graph_parameter_index = idx;
        graph_parameters_queue_params_list[idx].graph_parameter_index = idx;
        graph_parameters_queue_params_list[idx].refs_list_size = h264_graph_obj_ptr->enc_pool.bufq_depth; // lm: changed from GRAPH_PRM_REF_LIST_SIZE(=4) to enc_pool.bufq_depth(=6)
        graph_parameters_queue_params_list[idx].refs_list = (vx_reference *)&h264_graph_obj_ptr->h264_imgMosaicObj.output_image[0];
        idx++;

    After:

    idx = 0;
        status = add_graph_parameter_by_node_index(h264_graph_obj_ptr->graph, h264_graph_obj_ptr->h264_imgMosaicObj.node, 1); // lm: idx 0 -> 1 using output image not config
        /* Set graph schedule config such that graph parameter @index is enqueuable */
        h264_graph_obj_ptr->h264_imgMosaicObj.graph_parameter_index = idx;
        graph_parameters_queue_params_list[idx].graph_parameter_index = idx;
        graph_parameters_queue_params_list[idx].refs_list_size = h264_graph_obj_ptr->enc_pool.bufq_depth; // lm: changed from GRAPH_PRM_REF_LIST_SIZE(=4) to enc_pool.bufq_depth(=6)
        graph_parameters_queue_params_list[idx].refs_list = (vx_reference *)&h264_graph_obj_ptr->enc_pool.arr[0];
        idx++;
    

    Accordingly, I changed this in the enqueue call:

    Before:

     /*output*/
                status = vxGraphParameterEnqueueReadyRef(h264_graph_obj_ptr->graph,
                                                         5,
                                                         (vx_reference *)&h264_graph_obj_ptr->h264_imgMosaicObj.output_image[h264_graph_obj_ptr->mosaic_enq_id], // lm: changed [buf_id] -> [h264_graph_obj_ptr->mosaic_enq_id]
                                                         1);

    After:

    status = vxGraphParameterEnqueueReadyRef(h264_graph_obj_ptr->graph,
                                                         0,
                                                         (vx_reference *)&h264_graph_obj_ptr->enc_pool.arr[h264_graph_obj_ptr->mosaic_enq_id], // lm: changed [buf_id] -> [h264_graph_obj_ptr->mosaic_enq_id]
                                                         1);

    I referred to the multi_cam codec app and made these changes. I am confused about how this is different from the previous implementation. In the previous implementation I used the mosaic output image array as the refs list and enqueued/dequeued that, and that array holds references from the enc pool array itself (see the 'linking mosaic output and encoder input' snippet). So how are these two implementations not the same? I need clarity on this.

    Thanks and Regards,

    Lalit 

  • Hi Lalit,

    I was referring to the enqueue/dequeue of buffers part.

    What I suspect is that the mosaic and the encoder are using the same buffer at some instants, so it is getting corrupted.

    I referred to the multi_cam codec app and made these changes. I am confused about how this is different from the previous implementation. In the previous implementation I used the mosaic output image array as the refs list and enqueued/dequeued that, and that array holds references from the enc pool array itself (see the 'linking mosaic output and encoder input' snippet). So how are these two implementations not the same? I need clarity on this.

    The graph creation part seems to be correct, as you are getting the mosaic output into the encoder.

    Regards,
    Gokul