This thread has been locked.

TDA4VM: Output position of 'encode node'

Part Number: TDA4VM


Hello, could you tell us where the output of the 'encode node' is located?

Base:ti-processor-sdk-rtos-j721e-evm-08_06_00_12

Demo:app_srv_multi_cam_codec

Looking forward to your prompt reply!

Thank you very much!

  • Hi,

    Do you mean the ${PSDKRA}/vision_apps/apps/basic_demos/app_multi_cam_codec/ demo application?

    If yes: when doing encode only, you should be looking for a file named output_video_0.mp4.

    The output location is set in the function construct_gst_strings() in the file ${PSDKRA}/vision_apps/apps/basic_demos/app_multi_cam_codec/main.c, under sinkType == 1.
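    As a rough illustration of the mechanism (this is a hypothetical sketch, not the actual construct_gst_strings() code), the sink half of the pipeline string might be selected like this:

```c
#include <stdio.h>

/* Hypothetical sketch of how a sink string could be chosen by sinkType.
 * The real construct_gst_strings() in main.c is more involved; here,
 * sinkType == 1 directs the encoded stream to output_video_0.mp4. */
static void get_sink_string(int sinkType, char *buf, size_t len)
{
    if (sinkType == 1)
    {
        /* Encode-only path: mux to MP4 and write to a local file */
        snprintf(buf, len,
                 "! h264parse ! mp4mux ! filesink location=output_video_0.mp4");
    }
    else
    {
        /* Fallback for this sketch: discard the stream */
        snprintf(buf, len, "! fakesink");
    }
}
```

    Changing the filesink location= property (or the whole sink branch) in this string is what controls where the encoded output goes.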

    Regards,

    Nikhil

  • Yes, at this location the LDC node's output data is encoded into H.264 format. However, I don't want to save the encoded stream locally. Instead, I want to find the variables that hold the data stream and send them to remote devices in real time through shared memory or another method. How can I obtain the encoded H.264 data stream?

  • Hi,

    The application can currently only access the buffers that are about to be encoded (i.e., before the buffer is pushed into GStreamer) and the buffers that have been decoded (i.e., after pulling the decoded buffers from GStreamer).

    Currently, in the encode-only option, we save the encoded stream in an MP4 container (output_video_0.mp4).
    You can find this GStreamer command in the API construct_gst_strings() [srcType = 0, sinkType = 1].

    You would have to change this from a filesink to an appsink and pull the buffers from GStreamer.

    This would require changes to the application on your end.
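    For reference, a hypothetical sketch of what the modified sink branch could look like (the appsink name and properties below are illustrative, not taken from the application):

```
appsrc format=GST_FORMAT_TIME is-live=true do-timestamp=true block=false name=myAppSrc0 ! queue
! video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)NV12, interlace-mode=(string)progressive, colorimetry=(string)bt601
! v4l2h264enc bitrate=10000000
! h264parse
! appsink name=myAppSink0 emit-signals=true sync=false
```

    The application could then pull each encoded buffer with gst_app_sink_pull_sample() (or via the appsink "new-sample" signal), map it with gst_buffer_map(), and copy the mapped bytes into shared memory or a socket for the remote device.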

    Regards,

    Nikhil

  • Okay, but why is there some distortion in the output_video_0.mp4 obtained here, and why is the captured perspective shifted to the left? For example, if I stand directly in front of the camera, I appear on the left side of the frame in output_video_0.mp4. What is the reason for this? Also, compared to the right side, the deformation on the left side is more severe.

  • We are using a 190-degree FOV fisheye camera.

  • Hi,

    Could you please check if you are able to stream correctly from this sensor using the single cam application?

    May I know which sensor you are using? (i.e., resolution, fps, format, etc.)

    At my end I'm using an IMX390, which gives a 1920 x 1080 raw output, and I'm able to get the output .mp4 file without distortion.

    Regards,

    Nikhil

  • OK,

    If I want to use an input with a resolution of 1280 x 720, how can I fix the distortion issue?
    FPS = 30
    Format: YUV422 (UYVY)

  • Hi,

    Could you please check if you are able to stream correctly from this sensor using the single cam application?

    Could you please confirm the above first, just to make sure that the sensor is being streamed correctly?

    Regards,

    Nikhil

  • It is able to stream correctly from this sensor using the single cam application.

  • It is also able to stream correctly from this sensor using the multi cam codec application.

  • Hi,

    That is great.

    Could you please share the application logs for the multi cam codec application when you are doing encode only and saving to a file?

    Regards,

    Nikhil

  • OK, here are the logs.

    root@j7-evm:/opt/vision_apps# ./run_app_multi_cam_codec.sh 
    test---------------------------main,start!
    APP: Init ... !!!
    MEM: Init ... !!!
    MEM: Initialized DMA HEAP (fd=4) !!!
    MEM: Init ... Done !!!
    IPC: Init ... !!!
    IPC: Init ... Done !!!
    REMOTE_SERVICE: Init ... !!!
    REMOTE_SERVICE: Init ... Done !!!
       160.518157 s: GTC Frequency = 200 MHz
    APP: Init ... Done !!!
       160.524674 s:  VX_ZONE_INIT:Enabled
       160.524703 s:  VX_ZONE_ERROR:Enabled
       160.524718 s:  VX_ZONE_WARNING:Enabled
       160.525511 s:  VX_ZONE_INIT:[tivxInitLocal:130] Initialization Done !!!
       160.527586 s:  VX_ZONE_INIT:[tivxHostInitLocal:93] Initialization Done for HOST !!!
       160.528595 s: ISS: Enumerating sensors ... !!!
       160.954164 s: ISS: Enumerating sensors ... found 0 : IMX390-UB953_D3
       160.954197 s: ISS: Enumerating sensors ... found 1 : AR0233-UB953_MARS
       160.954219 s: ISS: Enumerating sensors ... found 2 : AR0820-UB953_LI
       160.954236 s: ISS: Enumerating sensors ... found 3 : UB9xxx_RAW12_TESTPATTERN
       160.954252 s: ISS: Enumerating sensors ... found 4 : UB96x_UYVY_TESTPATTERN
       160.954267 s: ISS: Enumerating sensors ... found 5 : GW_AR0233_UYVY
       160.954282 s: ISS: Enumerating sensors ... found 6 : MAX96705_AR0147_UYVY
    Sensor selected : MAX96705_AR0147_UYVY
    Querying MAX96705_AR0147_UYVY 
       160.954329 s: ISS: Querying sensor [MAX96705_AR0147_UYVY] ... !!!
       160.954671 s: ISS: Querying sensor [MAX96705_AR0147_UYVY] ... Done !!!
       YUV Input selected. VISS, AEWB and Mosaic nodes will be bypassed. 
    Creating context done!
    Kernel loading done!
    160.970552 s: ISS: Initializing sensor [MAX96705_AR0147_UYVY], doing IM_SENSOR_CMD_PWRON ... !!!
       160.970897 s: ISS: Initializing sensor [MAX96705_AR0147_UYVY], doing IM_SENSOR_CMD_CONFIG ... !!!
    Sensor init done!
    LDC init done!
    Img Mosaic init done!
    Display init done!
    App Init Done!
    capture_graph create done!
    display_graph create done!
    Capture graph done!
    LDC graph done!
    Img Mosaic graph done!
    Display graph done!
    Pipeline params setup done!
    Codec Pipeline done!
    App Create Graph Done! 
    Capture Graph verify done!
    Display Graph verify done!
    appCodecSrcInit Done!
    App Verify Graph Done! 
    App Send Error Frame Done! 
    app_pipeline_params_defaults returned
    app_pipeline_params_defaults returned
    164.100165 s: ISS: Starting sensor [MAX96705_AR0147_UYVY] ... !!!
    164.484151 s: ISS: Starting sensor [MAX96705_AR0147_UYVY] ... !!!
    appStartImageSensor returned with status: 0
    appCodecStart Done!
    
    capture_encode: frame 0 beginning
    
    capture_encode: frame 1 beginning
    
    capture_encode: frame 2 beginning
    
    capture_encode: frame 3 beginning
    
    capture_encode: frame 4 beginning
    
    capture_encode: frame 5 beginning
    
    capture_encode: frame 6 beginning
    
    capture_encode: frame 7 beginning
    
    capture_encode: frame 8 beginning
    
    capture_encode: frame 9 beginning
    
    capture_encode: frame 10 beginning
    
    capture_encode: frame 11 beginning
    
    capture_encode: frame 12 beginning
    
    capture_encode: frame 13 beginning
    
    capture_encode: frame 14 beginning
    
    capture_encode: frame 15 beginning
    
    capture_encode: frame 16 beginning
    
    capture_encode: frame 17 beginning
    
    capture_encode: frame 18 beginning
    
    capture_encode: frame 19 beginning
    
    capture_encode: frame 20 beginning
    

  • Sorry for the delay in response.

    Below are the logs I have obtained at my end.

    root@j7-evm:/opt/vision_apps# ./run_app_multi_cam_codec.sh 
    APP: Init ... !!!
    MEM: Init ... !!!
    MEM: Initialized DMA HEAP (fd=4) !!!
    MEM: Init ... Done !!!
    IPC: Init ... !!!
    IPC: Init ... Done !!!
    REMOTE_SERVICE: Init ... !!!
    REMOTE_SERVICE: Init ... Done !!!
       117.838530 s: GTC Frequency = 200 MHz
    APP: Init ... Done !!!
       117.838611 s:  VX_ZONE_INIT:Enabled
       117.838619 s:  VX_ZONE_ERROR:Enabled
       117.838625 s:  VX_ZONE_WARNING:Enabled
       117.839263 s:  VX_ZONE_INIT:[tivxInitLocal:130] Initialization Done !!!
       117.840457 s:  VX_ZONE_INIT:[tivxHostInitLocal:93] Initialization Done for HOST !!!
       117.843796 s: ISS: Enumerating sensors ... !!!
       124.947200 s: ISS: Enumerating sensors ... found 0 : IMX390-UB953_D3
       124.947229 s: ISS: Enumerating sensors ... found 1 : AR0233-UB953_MARS
       124.947251 s: ISS: Enumerating sensors ... found 2 : AR0820-UB953_LI
       124.947256 s: ISS: Enumerating sensors ... found 3 : UB9xxx_RAW12_TESTPATTERN
       124.947262 s: ISS: Enumerating sensors ... found 4 : UB96x_UYVY_TESTPATTERN
       124.947267 s: ISS: Enumerating sensors ... found 5 : GW_AR0233_UYVY
    Sensor selected : IMX390-UB953_D3
    Querying IMX390-UB953_D3 
       124.947286 s: ISS: Querying sensor [IMX390-UB953_D3] ... !!!
       124.947614 s: ISS: Querying sensor [IMX390-UB953_D3] ... Done !!!
    Capture->Encode Selection Yes(1)/No(0)
    1
    Decode->Display Selection Yes(1)/No(0)
    0
    Max number of cameras supported by sensor IMX390-UB953_D3 = 12 
    Please enter number of channels to be enabled 
    1
       130.724409 s: ISS: Initializing sensor [IMX390-UB953_D3], doing IM_SENSOR_CMD_PWRON ... !!!
       130.724898 s: ISS: Initializing sensor [IMX390-UB953_D3], doing IM_SENSOR_CMD_CONFIG ... !!!
    [MCU2_0]    130.724696 s: IMX390_PowerOn : chId = 0x0 
    [MCU2_0]    131.882081 s:  Configuring IMX390 imager 0x40.. Please wait till it finishes 
       134.018086 s: ISS: Initializing sensor [IMX390-UB953_D3] ... Done !!!
    gst_wrapper: GstCmdString:
    appsrc format=GST_FORMAT_TIME is-live=true do-timestamp=true block=false name=myAppSrc0 ! queue 
    ! video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)NV12, interlace-mode=(string)progressive, colorimetry=(string)bt601 
    ! v4l2h264enc bitrate=10000000 
    ! h264parse 
    ! mp4mux 
    ! filesink location=output_video_0.mp4 
    
       134.411157 s: ISS: Starting sensor [IMX390-UB953_D3] ... !!!
    

    I do not see the below GStreamer command printed in your log. May I know if you have commented this out or made any other modifications to the application here?

    gst_wrapper: GstCmdString:
    appsrc format=GST_FORMAT_TIME is-live=true do-timestamp=true block=false name=myAppSrc0 ! queue
    ! video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)NV12, interlace-mode=(string)progressive, colorimetry=(string)bt601
    ! v4l2h264enc bitrate=10000000
    ! h264parse
    ! mp4mux
    ! filesink location=output_video_0.mp4

    Regards,

    Nikhil

  • Yes, I made the following modifications.

    GstCmdString:
    appsrc format=GST_FORMAT_TIME is-live=true do-timestamp=true block=false name=myAppSrc0 ! queue
    ! video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)30/1, format=(string)NV12, interlace-mode=(string)progressive, colorimetry=(string)bt601
    ! v4l2h264enc bitrate=1500000
    ! multifilesink max-files=5 max-size-time=35000000 location=output_video_0.h264_%d
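    Since the goal is real-time delivery to a remote device rather than local files, another option (a hypothetical sketch; the host address and port are placeholders for your network) would be to replace the file sink with an RTP payloader and a UDP sink:

```
appsrc format=GST_FORMAT_TIME is-live=true do-timestamp=true block=false name=myAppSrc0 ! queue
! video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)30/1, format=(string)NV12, interlace-mode=(string)progressive, colorimetry=(string)bt601
! v4l2h264enc bitrate=1500000
! h264parse config-interval=-1
! rtph264pay pt=96
! udpsink host=192.0.2.10 port=5000
```

    On the receiving side, a pipeline such as udpsrc port=5000 caps="application/x-rtp,media=video,encoding-name=H264,payload=96" ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink could display the stream, assuming gst-launch-1.0 and the relevant plugins are available there.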

  • Hi,

    format=(string)NV12

    OK, could you confirm whether you are sending NV12 into the encoder, or is it YUV422?
    I thought your sensor's data type was YUV422.

    I checked internally, and YUV422 is not a supported input format for the encoder, so you would have to send NV12 into the encoder.
    Have you taken care of this conversion from YUV422 to NV12?
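    For reference, a minimal CPU-side sketch of the UYVY-to-NV12 layout conversion (on the TDA4VM the LDC hardware performs this; the code below is only illustrative, using nearest-neighbor chroma subsampling with no filtering):

```c
#include <stdint.h>
#include <stddef.h>

/* Convert packed YUV422 (UYVY byte order: U0 Y0 V0 Y1 ...) into
 * semi-planar NV12: a full-resolution Y plane followed by an
 * interleaved UV plane at half horizontal and vertical resolution.
 * Chroma is taken from even rows only (nearest-neighbor, no filtering). */
static void uyvy_to_nv12(const uint8_t *uyvy, uint8_t *nv12,
                         int width, int height)
{
    uint8_t *y_plane  = nv12;
    uint8_t *uv_plane = nv12 + (size_t)width * height;

    for (int row = 0; row < height; row++) {
        const uint8_t *src = uyvy + (size_t)row * width * 2;
        for (int col = 0; col < width; col += 2) {
            uint8_t u  = src[col * 2 + 0];
            uint8_t y0 = src[col * 2 + 1];
            uint8_t v  = src[col * 2 + 2];
            uint8_t y1 = src[col * 2 + 3];

            y_plane[(size_t)row * width + col]     = y0;
            y_plane[(size_t)row * width + col + 1] = y1;

            if ((row & 1) == 0) { /* keep chroma from even rows only */
                size_t uv_idx = (size_t)(row / 2) * width + (size_t)col;
                uv_plane[uv_idx]     = u;
                uv_plane[uv_idx + 1] = v;
            }
        }
    }
}
```

    NV12 keeps a full-resolution Y plane followed by one interleaved UV pair per 2 x 2 pixel block, which is how the encoder's 4:2:0 input requirement is satisfied.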

    Regards,

    Nikhil

  • The YUV422 data is converted to NV12 by the LDC node before encoding, so the data format should be fine.

  • The input to the encoder is in YUV420 (NV12) format.

  • Hello, may I ask whether I need to change /vision_apps/modules/include/ldc_lut_1920x1080.h to ldc_lut_1280x720.h? If changes are needed, how can I obtain ldc_lut_1280x720.h?

  • Perhaps you can provide the DCC Tuning Tool for this distortion processing.

  • Hi,

    If you are using LDC just for conversion from YUV422 to NV12, then you can skip dcc_config and set it as NULL in the node.

    i.e., tivxVpacLdcNode(graph, config, NULL, NULL, NULL, NULL, NULL, input_img, output_img, NULL);

    Could you please try the above?

    Regards,

    Nikhil 

  • Ok, I'll give it a try. However, if the LDC node is used only for format conversion, how can the fisheye camera distortion be corrected?

  • Hi,

    Sorry for the misunderstanding here; I had thought that you already had a corrected image.
    Yes, in order to do the fisheye correction, you would require an updated LUT from the DCC Tuning Tool.

    I shall connect you to the Tuning expert who would respond to this thread.

    Meanwhile, could you confirm whether the fisheye image is being encoded correctly, without any deformity?

    Regards,

    Nikhil

  • It's okay; it can currently be encoded, but the encoded video has some distortion. I think it may be possible to change /vision_apps/modules/include/ldc_lut_1920x1080.h to ldc_lut_1280x720.h to solve the distortion. Is this feasible?

  • Please follow this FAQ for fisheye distortion correction -- https://e2e.ti.com/support/processors-group/processors/f/processors-forum/1058565/faq-tda4vm-how-to-create-a-ldc-mesh-lut-for-fisheye-distortion-correction-on-tda4?keyMatch=LDC%20FAQ

    You may start with the single-cam app first to get LDC to work properly for your lens.

  • Yes, I agree with your statement. Can you provide the DCC Tuning Tool? It seems this tool must be used, but we cannot obtain it.

  • If you have an NDA in place, you may apply for the latest version here -- https://www.ti.com/licreg/docs/swlicexportcontrol.tsp?form_id=276074&prod_no=ADAS-SW-IMAGING&ref_url=adas

    Please also check with your local FAE support.

    BTW, V2.5 should be OK for LDC.