
TDA4VM: multi_cam_codec demo debug

Part Number: TDA4VM

Hi team,

      We are debugging the multi_cam_codec demo and have run it successfully. Now we want to add an API that gets the H.264 stream and pushes it to our application, and we need your help on how to do this. In our opinion, the H.264 stream can be obtained after encoding in the decode_display() API in /ti-processor-sdk-rtos-j721e-evm-08_06_00_12/vision_apps/apps/basic_demos/app_multi_cam_codec/main.c. Is that right?

static vx_status decode_display(AppObj* obj, vx_int32 frame_id)
{
    APP_PRINTF("\ndecode_display: frame %d beginning\n", frame_id);
    vx_status status = VX_SUCCESS;

    TIOVXImgMosaicModuleObj *imgMosaicObj = &obj->imgMosaicObj;
    AppGraphParamRefPool *dec_pool = &obj->dec_pool;

    /* checksum_actual is the checksum determined by the realtime test
        checksum_expected is the checksum that is expected to be the pipeline output */
    uint32_t checksum_actual = 0;

    /* This is the number of frames required for the pipeline AWB and AE algorithms to stabilize
        (note that 15 is only required for the 6-8 camera use cases - others converge quicker) */
    uint8_t stability_frame = 15;

    vx_object_array mosaic_input_arr;
    vx_image mosaic_output_image;
    uint32_t num_refs;

    int8_t pull_status = -2;


    if (status == VX_SUCCESS && obj->decode==1)
    {
        pull_status = appCodecDeqAppSink(obj->appsink_pull_id);
        if (pull_status == 1)
        {
            obj->EOS=1;
            obj->stop_task=1;
            APP_PRINTF("\nCODEC=> EOS Received\n");
            goto exit;
        }
        else if (pull_status != 0)
        {
            goto exit;
        }
    }

    if ( frame_id >= dec_pool->bufq_depth )
    {
        if (status == VX_SUCCESS)
        {
            status = vxGraphParameterDequeueDoneRef(obj->display_graph, imgMosaicObj->inputs[0].graph_parameter_index, (vx_reference*)&mosaic_input_arr, 1, &num_refs);
        }
        /************************************
        ****add  code to get h264 stream*****
        
        *************************************/
        if((obj->en_out_img_write == 1) || (obj->test_mode == 1))
        {
            vx_char output_file_name[APP_MAX_FILE_PATH];

            /* Dequeue output */
            if (status == VX_SUCCESS)
            {
                status = vxGraphParameterDequeueDoneRef(obj->display_graph, imgMosaicObj->output_graph_parameter_index, (vx_reference*)&mosaic_output_image, 1, &num_refs);
            }
            if ((status == VX_SUCCESS) && (obj->test_mode == 1) && (frame_id > TEST_BUFFER))
            {
                /* calculate the checksum of the mosaic output */

                if ((app_test_check_image(mosaic_output_image, checksums_expected[obj->sensorObj.sensor_index][obj->sensorObj.num_cameras_enabled-1],
                                        &checksum_actual) != vx_true_e) && (frame_id > stability_frame))
                {
                    test_result = vx_false_e;
                    /* in case test fails and needs to change */
                    populate_gatherer(obj->sensorObj.sensor_index, obj->sensorObj.num_cameras_enabled-1, checksum_actual);
                }
            }

            if (obj->en_out_img_write == 1) {
                appPerfPointBegin(&obj->fileio_perf);
                snprintf(output_file_name, APP_MAX_FILE_PATH, "%s/mosaic_output_%010d_%dx%d.yuv", obj->output_file_path, (frame_id - APP_BUFFER_Q_DEPTH), imgMosaicObj->out_width, imgMosaicObj->out_height);
                if (status == VX_SUCCESS)
                {
                    /* TODO: Correct checksums not added yet.
                     * status = writeMosaicOutput(output_file_name, mosaic_output_image);
                     */
                     status = writeMosaicOutput(output_file_name, mosaic_output_image);
                }
                appPerfPointEnd(&obj->fileio_perf);
            }
        }
    }

    if(status==VX_SUCCESS && obj->decode==1)
    {
        status = assign_array_image_buffers(
                        dec_pool->arr[obj->mosaic_enq_id], 
                        dec_pool->data_ptr[obj->appsink_pull_id],
                        dec_pool->plane_sizes);
    }

    if((obj->en_out_img_write == 1) || (obj->test_mode == 1))
    {
        if (status == VX_SUCCESS)
        {
            status = vxGraphParameterEnqueueReadyRef(obj->display_graph, imgMosaicObj->output_graph_parameter_index, (vx_reference*)&imgMosaicObj->output_image[obj->display_id], 1);
        }
    }
    if (status == VX_SUCCESS)
    {
        status = vxGraphParameterEnqueueReadyRef(obj->display_graph, imgMosaicObj->inputs[0].graph_parameter_index, (vx_reference*)&dec_pool->arr[obj->mosaic_enq_id], 1);
    }

    obj->display_id++;
    obj->display_id         = (obj->display_id  >= APP_BUFFER_Q_DEPTH)? 0 : obj->display_id;
    obj->mosaic_enq_id++;
    obj->mosaic_enq_id      = (obj->mosaic_enq_id  >= dec_pool->bufq_depth)? 0 : obj->mosaic_enq_id;
    obj->appsink_pull_id++;
    obj->appsink_pull_id    = (obj->appsink_pull_id  >= obj->num_codec_bufs)? 0 : obj->appsink_pull_id;

exit:
    if (obj->decode==1)
    {
        appCodecEnqAppSink(obj->appsink_pull_id);
    }

    return status;
}

1. After "status = vxGraphParameterDequeueDoneRef(obj->display_graph, imgMosaicObj->inputs[0].graph_parameter_index, (vx_reference*)&mosaic_input_arr, 1, &num_refs);",

can we get the H.264 stream from mosaic_input_arr?

2. How do we get the stream from a vx_reference?

Thanks

  • Hi Expert:

           Can you give us some advice? Thanks.

  • Hi Davied,

    Sorry for the delay in the response.

    The application currently only has access to the buffers that are about to be encoded (i.e., before the buffer is pushed into GStreamer) and the buffers that have been decoded (i.e., after the decoded buffers are pulled from GStreamer).

    Currently, in the encode-only option, we save the encoded stream in an MP4 container (output_video_0.mp4).
    You can find this GStreamer command in the API construct_gst_strings() [srcType = 0, sinkType = 1].

    You would have to change this from a filesink to an appsink and pull the buffers from GStreamer.
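    As an illustration of that change (a sketch only: the element chain below is a placeholder, not the actual string in construct_gst_strings(), and the helper name is hypothetical), the sink side of the encode pipeline string would end in an appsink instead of the qtmux/filesink pair:

    ```c
    #include <stdio.h>

    /* Hypothetical sketch of the sink-side string change. The real
     * pipeline string lives in construct_gst_strings() for
     * srcType = 0, sinkType = 1; copy it from your SDK version. */
    static void build_sink_string(char *dst, size_t len, int use_appsink)
    {
        if (use_appsink) {
            /* appsink keeps the encoded H.264 buffers inside the process
             * so the application can pull them itself */
            snprintf(dst, len, "%s ! appsink name=enc_sink sync=false",
                     "appsrc ! v4l2h264enc ! h264parse");
        } else {
            /* original behaviour: mux the stream into output_video_0.mp4 */
            snprintf(dst, len, "%s ! qtmux ! filesink location=output_video_0.mp4",
                     "appsrc ! v4l2h264enc ! h264parse");
        }
    }

    int main(void)
    {
        char gst_str[256];
        build_sink_string(gst_str, sizeof(gst_str), 1);
        printf("%s\n", gst_str);
        return 0;
    }
    ```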

    This would require a change to the application on your end.
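    Once the pipeline ends in an appsink, the application side could pull the encoded buffers roughly like this. This is a sketch, not the demo's code: gst_app_sink_pull_sample() and the buffer-map calls are the standard gstreamer-app API, while the function and callback names are hypothetical.

    ```c
    #include <gst/gst.h>
    #include <gst/app/gstappsink.h>

    /* Pull one encoded H.264 buffer from the appsink and hand its bytes
     * to the application; returns bytes consumed, or -1 on EOS / error. */
    static gssize pull_h264_buffer(GstAppSink *sink,
                                   void (*consume)(const guint8 *data, gsize len))
    {
        GstSample *sample = gst_app_sink_pull_sample(sink);  /* blocks */
        if (sample == NULL)
            return -1;  /* EOS or pipeline stopped */

        GstBuffer *buf = gst_sample_get_buffer(sample);
        GstMapInfo map;
        gssize consumed = -1;
        if (gst_buffer_map(buf, &map, GST_MAP_READ)) {
            consume(map.data, map.size);  /* push H.264 bytes to the app */
            consumed = (gssize)map.size;
            gst_buffer_unmap(buf, &map);
        }
        gst_sample_unref(sample);
        return consumed;
    }
    ```

    The appsink element would be retrieved once with gst_bin_get_by_name() on the pipeline, and this function called in a loop (or from the "new-sample" signal) until it returns -1.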

    Regards,

    Nikhil