
PROCESSOR-SDK-J721E: 16-bit Data

Part Number: PROCESSOR-SDK-J721E

In the single camera application data flow, the sensor RAW data is converted into 8-bit YUV data. This stage is highlighted in the diagram below.

1. In our application, we want 16-bit YUV data. What changes do we have to make in the single camera application source code?

2. In the single camera application flow, it is mentioned that the YUV data is passed to the DSS. However, there is no DSS-related code in the app_single_cam_main.c file. How is the image displayed when the single camera application runs?

  • Hi Muralidharan,

    The ISP can output YUV data at either 12 bits or 8 bits per component; it cannot output 16 bits per component for YUV.

    Yes, the ISP output is displayed via the DSS. For the display code, search for display_params in the app_create_graph API; a rough sketch is below.
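
    For reference, the display path looks roughly like this (paraphrasing the single-cam example; obj->display_params is filled elsewhere in the application):

    obj->display_param_obj = vxCreateUserDataObject(obj->context,
        "tivx_display_params_t", sizeof(tivx_display_params_t), &obj->display_params);
    obj->displayNode = tivxDisplayNode(obj->graph, obj->display_param_obj, obj->display_image);
    vxSetNodeTarget(obj->displayNode, VX_TARGET_STRING, TIVX_TARGET_DISPLAY1);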

    Regards,

    Brijesh

  • But our camera gives 16-bit data. Is there any way to change the single camera application code to handle 16 bits?

    Or what is the alternative way to do this?

  • But what is this 16-bit data? Is it RAW Bayer data?

  • What is RAW Bayer data?

    If the camera gives YUV data, how do we show the video pipeline output on the display?

    What changes do we have to make in the source code (the app_single_cam_main.c file) to display the YUV data?

  • Which release are you using? For SDK 7.0/7.1, can you use the below link? This patch adds support for the YUV format in the multi-cam example.

    https://e2e.ti.com/support/processors/f/791/t/931988 

    If you are using a later release, you could try setting the yuv_cam_input flag in the single-cam example to run it with YUV data. This can also be done in the multi-cam example by setting the sensor_out_format flag to 1 in the sensor framework.
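
    As a minimal sketch (the flag names come from the reply above; where exactly they live varies by release, so treat the placement as an assumption):

    /* single-cam example: treat the sensor output as YUV instead of RAW Bayer */
    yuv_cam_input = vx_true_e;

    /* sensor framework (multi-cam path): 0 = RAW output, 1 = YUV output */
    sensor_out_format = 1;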

    Regards,

    Brijesh

  • We are using a thermal camera. The output format is RAW8 (monochrome). As per our understanding, the DSS takes RGB or YUV data as input, and the VISS does the conversion from RAW to YUV. What changes do we have to make in the app_create_graph function for the VISS? Also, does CSI-2 support monochrome, and if so, does the DSS accept monochrome data for display?

  • The ISS capture can process this data, and the display can show it as a greyscale image; this is supported in the DSS node. A sketch of suitable display parameters follows.
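
    For illustration, a hedged sketch of display parameters for a greyscale (U8) frame centered on a 1080p display (the opMode/pipeId values follow the TIOVX display examples; width and height are the sensor's, so treat them as placeholders):

    tivx_display_params_t display_params = {0}; /* zero-init all fields */
    display_params.opMode    = TIVX_KERNEL_DISPLAY_ZERO_BUFFER_COPY_MODE;
    display_params.pipeId    = 2U; /* DSS video pipe used by the TIOVX examples */
    display_params.outWidth  = width;   /* sensor output width  */
    display_params.outHeight = height;  /* sensor output height */
    display_params.posX      = (1920U - width)  / 2; /* center on 1080p */
    display_params.posY      = (1080U - height) / 2;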

    Regards,

    Brijesh

  • In the data flow, can we send the RAW8 (monochrome) data directly to the DSS for display, bypassing the VISS, LDC and MSC blocks?

    If yes, what changes have to be made in the single camera application source code?

  • Hi,

    You could refer to ti-processor-sdk-rtos-j721e-evm-07_01_00_11\tiovx\kernels_j7\hwa\test\test_capture_display.c. This example does exactly what you need, i.e. it connects the capture output directly to the display and shows RAW12 data as a greyscale image. Please refer to it and make similar changes in your example.

    Regards,

    Brijesh

  • Hi Brijesh,

    We referred to ti-processor-sdk-rtos-j721e-evm-07_01_00_11\tiovx\kernels_j7\hwa\test\test_capture_display.c and changed the single camera application code accordingly. Please see the modified code.

    --------------------------------------------------------------------------------------------------------------------------------------------------------

    obj->graph = vxCreateGraph(obj->context);
    if(status == VX_SUCCESS)
    {
        status = vxGetStatus((vx_reference)obj->graph);
    }
    APP_ASSERT(vx_true_e == tivxIsTargetEnabled(TIVX_TARGET_VPAC_VISS1));

    APP_PRINTF("Initializing params for capture node \n");

    /* Set the number of buffers for the capture node */
    obj->num_cap_buf = NUM_BUFS; /* 4 */

    if(vx_false_e == yuv_cam_input)
    {
        raw_image = tivxCreateRawImage(obj->context, &sensorParams.sensorInfo.raw_params);

        /* Allocate input and output refs; multiple refs created to allow pipelining of graph */
        for(buf_id = 0; buf_id < obj->num_cap_buf; buf_id++)
        {
            if(status == VX_SUCCESS)
            {
                obj->cap_frames[buf_id] = vxCreateObjectArray(obj->context, (vx_reference)raw_image, num_capture_frames);
                status = vxGetStatus((vx_reference)obj->cap_frames[buf_id]);
            }
        }
        if(status == VX_SUCCESS)
        {
            status = tivxReleaseRawImage(&raw_image);
        }
    }

    /* Config initialization */
    tivx_capture_params_init(&local_capture_config);
    local_capture_config.numInst = 1U;
    local_capture_config.numCh = 1U;
    local_capture_config.instId[CAPT_INST_ID] = CAPT_INST_ID;
    local_capture_config.instCfg[CAPT_INST_ID].enableCsiv2p0Support = (uint32_t)vx_true_e;
    local_capture_config.instCfg[CAPT_INST_ID].numDataLanes = sensorParams.sensorInfo.numDataLanes;

    local_capture_config.timeout = 33;
    local_capture_config.timeoutInitial = 500;

    APP_PRINTF("local_capture_config.numDataLanes = %d \n", local_capture_config.instCfg[CAPT_INST_ID].numDataLanes);

    for(loopCnt = 0U; loopCnt < local_capture_config.instCfg[CAPT_INST_ID].numDataLanes; loopCnt++)
    {
        local_capture_config.instCfg[CAPT_INST_ID].dataLanesMap[loopCnt] = sensorParams.sensorInfo.dataLanesMap[loopCnt];
        APP_PRINTF("local_capture_config.dataLanesMap[%d] = %d \n",
                   loopCnt,
                   local_capture_config.instCfg[CAPT_INST_ID].dataLanesMap[loopCnt]);
    }

    capture_config = vxCreateUserDataObject(obj->context, capture_user_data_object_name, sizeof(tivx_capture_params_t), &local_capture_config);

    APP_PRINTF("capture_config = 0x%p \n", capture_config);

    APP_PRINTF("Creating capture node \n");

    obj->capture_node = tivxCaptureNode(obj->graph, capture_config, obj->cap_frames[0]);

    APP_PRINTF("obj->capture_node = 0x%p \n", obj->capture_node);

    if(status == VX_SUCCESS)
    {
        status = vxReleaseUserDataObject(&capture_config);
    }
    if(status == VX_SUCCESS)
    {
        status = vxSetNodeTarget(obj->capture_node, VX_TARGET_STRING, TIVX_TARGET_CAPTURE1);
    }

    /*
    if(vx_false_e == yuv_cam_input)
    {
        obj->raw = (tivx_raw_image)vxGetObjectArrayItem(obj->cap_frames[0], 0);
    }

    obj->display_image = (vx_image)obj->raw;
    obj->display_params.posX = (1920U - obj->display_params.outWidth)/2;
    obj->display_params.posY = (1080U - obj->display_params.outHeight)/2;
    */
    obj->display_param_obj = vxCreateUserDataObject(obj->context, "tivx_display_params_t", sizeof(tivx_display_params_t), &obj->display_params);

    obj->display_image = (vx_image)vxGetObjectArrayItem(obj->cap_frames[0], 0);
    obj->displayNode = tivxDisplayNode(obj->graph, obj->display_param_obj, obj->display_image);
    if(status == VX_SUCCESS)
    {
        status = vxSetNodeTarget(obj->displayNode, VX_TARGET_STRING, TIVX_TARGET_DISPLAY1);
        APP_PRINTF("Display Set Target done\n");
    }

    int graph_parameter_num = 0;

    /* Input @ node index 1 becomes graph parameter 0 */
    add_graph_parameter_by_node_index(obj->graph, obj->capture_node, 1);

    /* Set graph schedule config such that graph parameter @ index 0 is enqueueable */
    graph_parameters_queue_params_list[graph_parameter_num].graph_parameter_index = graph_parameter_num;
    graph_parameters_queue_params_list[graph_parameter_num].refs_list_size = obj->num_cap_buf;
    graph_parameters_queue_params_list[graph_parameter_num].refs_list = (vx_reference*)&(obj->cap_frames[0]);

    if(status == VX_SUCCESS)
    {
        status = tivxSetGraphPipelineDepth(obj->graph, obj->num_cap_buf);
    }

    /* Schedule mode auto is used, so we don't need to call vxScheduleGraph;
     * the graph gets scheduled automatically as refs are enqueued to it.
     */
    if(status == VX_SUCCESS)
    {
        status = vxSetGraphScheduleConfig(obj->graph,
                                          VX_GRAPH_SCHEDULE_MODE_QUEUE_AUTO,
                                          params_list_depth,
                                          graph_parameters_queue_params_list);
    }
    APP_PRINTF("vxSetGraphScheduleConfig done\n");

    if(status == VX_SUCCESS)
    {
        status = vxVerifyGraph(obj->graph);
    }

    if(status == VX_SUCCESS)
    {
        status = tivxExportGraphToDot(obj->graph, ".", "single_cam_graph");
    }

    APP_PRINTF("app_create_graph exiting\n");
    return status;

    ---------------------------------------------------------------------------------------------------------------------------------------------------------

    We connected the capture node output to the display node. Can you please check the code and tell us whether it will display the RAW 8-bit image on the display? Or are there other changes we have to make?

  • Hi Muralidharan,

    The raw image data format is not supported in the display node. Can you please change this code to use vx_image with the U8 data format? The capture node supports vx_image with the U8 data format; in this case, it will configure CSIRX to capture in RAW8 format. This output can be directly connected to the display node.

    Please refer to the test_capture_display.c example. It uses vx_image with the U16 data type; you would just need to change the data format to U8 and configure your sensor.
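
    A minimal sketch of that change (hedged; obj, num_capture_frames, width and height follow the single-cam example above):

    /* Create U8 vx_image buffers instead of tivx_raw_image; with a U8
     * vx_image the capture node configures CSIRX for RAW8 */
    vx_image cap_image = vxCreateImage(obj->context, width, height, VX_DF_IMAGE_U8);

    for(buf_id = 0; buf_id < obj->num_cap_buf; buf_id++)
    {
        obj->cap_frames[buf_id] = vxCreateObjectArray(obj->context,
            (vx_reference)cap_image, num_capture_frames);
    }
    vxReleaseImage(&cap_image);

    /* The U8 capture output can then feed the display node directly */
    obj->display_image = (vx_image)vxGetObjectArrayItem(obj->cap_frames[0], 0);
    obj->displayNode = tivxDisplayNode(obj->graph, obj->display_param_obj, obj->display_image);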

    Regards,

    Brijesh

  • Hi,

    We changed the data format to U8, and the error in the creation of the display node is resolved. Now we are getting an error while executing the below line of the single camera application code in the app_create_graph function.

    status = vxVerifyGraph(obj->graph);

    The error occurred is shown below.

    225.703947 s: VX_ZONE_ERROR:[tivxAddKernelCaptureValidate:163] image format is invalid
    225.703954 s: VX_ZONE_ERROR:[ownGraphNodeKernelValidate:531] node kernel validate failed for kernel com.ti.capture at index 0
    225.703960 s: VX_ZONE_ERROR:[vxVerifyGraph:1930] Node kernel Validate failed
    225.703966 s: VX_ZONE_ERROR:[vxVerifyGraph:2098] Graph verify failed

    1. Can you please check why vxVerifyGraph is failing?

    ----------------------------------------------------------------------------------------------------------------------------------------------------------

    status = tivxGraphParameterEnqueueReadyRef(obj->graph, 0, (vx_reference*)&(obj->cap_frames[buf_id]), 1, TIVX_GRAPH_PARAMETER_ENQUEUE_FLAG_PIPEUP);

    While executing the above line in app_run_graph, we get one more error, shown below.

    309.248011 s:  VX_ZONE_ERROR:[tivxGraphParameterEnqueueReadyRef:251] Reference enqueue not supported at graph parameter index 0

    2. We don't know why this error is occurring. Is it because of the display node failure or something else?

  • Hi Muralidharan,

    Can you please apply the attached patch to the tiovx folder and try the U8 format?

    /cfs-file/__key/communityserver-discussions-components-files/791/Added_5F00_U8_5F00_RAW8_5F00_Support_5F00_In_5F00_Capture.patch

    Regards,

    Brijesh

  • Hi,

    How do we apply the patch to the tiovx folder? Can you share the Linux command for applying it?

  • Go to the tiovx folder and use the below command.

    patch -p1 < Added_U8_RAW8_Support_In_Capture.patch 

  • Thanks Brijesh for your help to solve the errors.

    In app_run_graph, the program is waiting at the below line of code.

    status = vxGraphParameterDequeueDoneRef(obj->graph, graph_parameter_num, (vx_reference*)&out_capture_frames, 1, &num_refs_capture);

    It also shows a warning, as shown below.

    [MCU2_0]    213.150701 s:  VX_ZONE_WARNING:[tivxCaptureSetTimeout:765]  CAPTURE: WARNING: Error frame not provided using tivxCaptureRegisterErrorFrame, defaulting to waiting forever !!!

    Can you help us to solve this issue?

  • That means the capture is still not working; no frames are being captured.

    Have you brought up this sensor module before? 

  • Hi Brijesh,

    We configured the deserializer, the serializer and the camera. By reading the registers in the deserializer, we verified that the camera is streaming data to the deserializer.

    Is this what you wanted, or is there something else you want us to verify?

  • Hi Muralidharan,

    Could you please make sure that:

    1. the lane speed configured in CSIRX matches the deserializer output rate?

    2. the deserializer starts outputting data only after vxVerifyGraph is called?

    3. the number of lanes and the lane positions match?

    Regards,

    Brijesh

  • A tivxCaptureRegisterErrorFrame warning about waiting forever doesn't necessarily mean it is not working. I got this warning but could still see reasonable output on the display.

  • Hi Keelung Yang,

    Waiting on the below statement means it is not capturing any frames; if it had captured one, this call would have returned.

    status = vxGraphParameterDequeueDoneRef(obj->graph, graph_parameter_num, (vx_reference*)&out_capture_frames, 1, &num_refs_capture);

    The print appears because no error frame was provided, so the capture node will wait until a real frame is captured. If an error frame is provided, the node forwards the error frame after the timeout instead.

    Regards,

    Brijesh

  • Hi Brijesh,

    I have understood that the error frame registration should be done before app_run_graph. We need a few details regarding the same.

    1. What exactly is an error frame? Is it something we need to place in a folder? (If so, please mention the path.)

    2. Does this frame need to be captured explicitly from the camera (if so, can it be in any format), or can we just download a .RAW image from the internet and pass it to vxVerifyGraph?

    3. Is there any method to bypass this error frame check?

    Please let us know.

  • Hi Muralidharan,

    Yes, the error frame is used when the camera stops outputting. If an error frame is not provided, the graph would stall.

    It can be any frame that you would like to be shown in case the camera stops outputting.

    Yes, the error frame check can be bypassed; the graph can still run without an error frame, provided the camera keeps outputting consistently. A sketch of the registration is below.
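
    A hedged sketch of registering an error frame (assuming a U8 vx_image matching the capture format; obj, width and height are placeholders from the application context, and the call should happen before the graph runs):

    /* Register a fallback frame so the capture node forwards it after a
     * timeout instead of waiting forever on signal loss */
    vx_image err_image = vxCreateImage(obj->context, width, height, VX_DF_IMAGE_U8);
    /* ... optionally fill err_image with the content to show on signal loss ... */
    tivxCaptureRegisterErrorFrame(obj->capture_node, (vx_reference)err_image);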

    Regards,

    Brijesh 

  • Hi Brijesh,

    While executing the below code, which is part of the app_create_graph function,

    ---------------------------------------------------------------------------

    printf("Before tivxCreateRawImage\n");
    obj->fs_test_raw_image = tivxCreateRawImage(obj->context, &(sensorParams.sensorInfo.raw_params));
    printf("After tivxCreateRawImage\n");

    ------------------------------------------------------------------------------

    we are getting the following error message.

    ---------------------------------------

    Before tivxCreateRawImage
    326.733530 s: VX_ZONE_ERROR:[ownIsValidCreateParams:72] invalid pixel_container for exposure index 0
    After tivxCreateRawImage

    326.733551 s: VX_ZONE_ERROR:[tivxQueryRawImage:699] invalid image reference
    326.733556 s: VX_ZONE_ERROR:[tivxQueryRawImage:699] invalid image reference
    326.733561 s: VX_ZONE_ERROR:[tivxQueryRawImage:699] invalid image reference

    ---------------------------------------

    We used a generic .jpg file and renamed it as a .raw file.

    We need the following clarifications.

    1. Is the error because of the test image (.raw)?

    2. What is the procedure for creating a .raw image? Can a generic .jpg file be renamed as a .raw file?

    3. Currently the 'VX_DF_IMAGE_U8' data type is used in the sensor params data structure in the iss_sensor_<camera_name>.c file. Is this data type correct? If we change the data type to 'TIVX_RAW_IMAGE_8_BIT' in the tivxCreateRawImage API, will the display node work?

    4. Does the display node directly support the RAW_IMAGE data type, or do we need to cast it to another format?

  • Hi Muralidharan,

    Please find answers to your questions,

    #1, yes, this error is coming from the raw image. Can you please share your raw image format, i.e. pixel size and MSB position? It seems to be unsupported.

    #2, a raw image is just binary pixel data, not an encoded format, so a renamed .jpg file will not work.

    #3, please don't use the TIVX_RAW_IMAGE_8_BIT format in a raw image. I think you are trying to capture 8-bit raw data; in that case, can you try vx_image with the VX_DF_IMAGE_U8 format, as sketched below?

    #4, no, the display node does not support the raw image data format.
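
    On #2/#3, a hedged sketch of building the test frame as a U8 vx_image filled from an unencoded binary file (the file name and 1280x720 dimensions are placeholders; requires <stdio.h>):

    vx_image fs_test_image = vxCreateImage(obj->context, 1280, 720, VX_DF_IMAGE_U8);

    vx_rectangle_t rect = { 0, 0, 1280, 720 };
    vx_imagepatch_addressing_t addr;
    vx_map_id map_id;
    void *ptr = NULL;

    if(vxMapImagePatch(fs_test_image, &rect, 0, &map_id, &addr, &ptr,
                       VX_WRITE_ONLY, VX_MEMORY_TYPE_HOST, VX_NOGAP_X) == VX_SUCCESS)
    {
        FILE *fp = fopen("test_frame.raw", "rb"); /* raw binary, not a renamed .jpg */
        if(fp != NULL)
        {
            fread(ptr, 1, 1280u * 720u, fp); /* one byte per pixel for U8 */
            fclose(fp);
        }
        vxUnmapImagePatch(fs_test_image, map_id);
    }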

    Regards,

    Brijesh