
TDA4VH-Q1: Encoding NV12 to H264 produces image errors

Part Number: TDA4VH-Q1

I have written a program that encodes an offline NV12 file into H264, but some edges of the image in the encoded H264 file show horizontal stripes.
I have checked the NV12 file and it does not have this issue. Please help me check what the problem may be.

h264 file: left.zip

source code: image_encode.zip

  • Hi, 

    I will look into this. What HLOS and SDK version are you using?

    Thanks,
    Sarabesh S.

  • HLOS: Linux

    SDK: 9.2

  • Hi, 

    I tested this using the following GStreamer pipeline and was able to produce a successful H264 encode without any horizontal stripes:

    • gst-launch-1.0 filesrc location=/bbb_1080p60_30s.yuv ! rawvideoparse width=1920 height=1080 format=nv12 framerate=60/1 colorimetry=bt709 ! v4l2h264enc ! filesink location=/bbb_1080p60_gstenc_30s.264

    I see you are using the V4L2 API directly rather than GStreamer. Please reference the app_multi_cam_codec demo in ti-processor-sdk-rtos-j7284s4-evm-09_02_00_05/vision_apps/apps/basic_demos/app_multi_cam_codec. It demonstrates using a GStreamer pipeline in an OpenVX app to display a stream.
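
    For reference, here is a minimal C sketch of driving the same pipeline programmatically with gst_parse_launch. The file locations are copied from the gst-launch-1.0 command above, error handling is trimmed, and this is only an illustration, not code taken from the SDK demo:

    /* Build (typical): gcc encode.c $(pkg-config --cflags --libs gstreamer-1.0) */
    #include <gst/gst.h>

    int main(int argc, char *argv[])
    {
        gst_init(&argc, &argv);

        /* Same pipeline string as the gst-launch-1.0 example above. */
        GError *err = NULL;
        GstElement *pipeline = gst_parse_launch(
            "filesrc location=/bbb_1080p60_30s.yuv ! "
            "rawvideoparse width=1920 height=1080 format=nv12 "
            "framerate=60/1 colorimetry=bt709 ! "
            "v4l2h264enc ! "
            "filesink location=/bbb_1080p60_gstenc_30s.264", &err);
        if (pipeline == NULL) {
            g_printerr("Failed to create pipeline: %s\n", err->message);
            return -1;
        }

        gst_element_set_state(pipeline, GST_STATE_PLAYING);

        /* Block until the encode finishes (EOS) or an error is posted. */
        GstBus *bus = gst_element_get_bus(pipeline);
        GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
                GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
        if (msg != NULL)
            gst_message_unref(msg);

        gst_object_unref(bus);
        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(pipeline);
        return 0;
    }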

    Thanks,
    Sarabesh S.

  • I tested the gst-launch-1.0 command and the encoding result is correct. Thanks for your support.
    I will check the differences in my source code, or directly use the app_multi_cam_codec sample.

  • Glad to hear this resolved your issue. Let me know if anything else comes up. 

    BR,
    Sarabesh S.

  • If I change the V4L2 buffer memory type to V4L2_MEMORY_MMAP in the code, it works properly:

    req.type = V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE;
    req.memory = V4L2_MEMORY_MMAP;

    The previously problematic code used V4L2_MEMORY_DMABUF. I copy the NV12 buffer into the vx_image structure and then pass the fd obtained through appMemGetDmaBufFd (via getImageDmaFdIMG()) to V4L2; a sketch of the queueing call is at the end of this reply. I still don't understand why this method causes problems.

    req.type = V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE;
    req.memory = V4L2_MEMORY_DMABUF;

    I still want to perform v4l2 encoding operations without copying memory.
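
    For reference, this is roughly how the dmabuf frame is queued in my code (a simplified sketch: the fd comes from appMemGetDmaBufFd via getImageDmaFdIMG(), the sizes shown are for 1920x1080 NV12, and it assumes the driver exposes contiguous NV12 in a single dmabuf; NV12M would need two fds and two plane entries):

    #include <string.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    /* Queue one encoder input frame as an imported DMABUF (multi-planar API). */
    static int queue_nv12_dmabuf(int v4l2_fd, unsigned int index, int dmabuf_fd)
    {
        struct v4l2_buffer buf;
        struct v4l2_plane plane;

        memset(&buf, 0, sizeof(buf));
        memset(&plane, 0, sizeof(plane));

        plane.m.fd = dmabuf_fd;                /* fd from appMemGetDmaBufFd */
        plane.length = 1920 * 1080 * 3 / 2;    /* contiguous NV12, 1080p */
        plane.bytesused = plane.length;

        buf.type = V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE;
        buf.memory = V4L2_MEMORY_DMABUF;
        buf.index = index;
        buf.length = 1;                        /* one plane entry */
        buf.m.planes = &plane;

        /* The driver imports the dmabuf, so there is no CPU copy here. */
        return ioctl(v4l2_fd, VIDIOC_QBUF, &buf);
    }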

  • Hello, 

    I have not tested dmabuf encoding operations directly with the V4L2 API. Our SDK uses GStreamer to execute all encoding operations, including dmabuf handling to avoid memory copies. You can reference how dmabuf is used in the OpenVX app_multi_cam_codec demo under the sinkType conditions.
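
    As an illustration only (a sketch, not code taken from the demo; the pipeline string and helper name are assumptions), this is roughly how an existing dmabuf fd can be wrapped in a GstBuffer and pushed into an appsrc so that the downstream v4l2h264enc can import it without a memory copy:

    #include <gst/gst.h>
    #include <gst/app/gstappsrc.h>
    #include <gst/allocators/gstdmabuf.h>

    /* Assumed pipeline: "appsrc name=src ! v4l2h264enc ! filesink location=out.264"
     * with NV12 caps set on the appsrc. Create the allocator once with
     * gst_dmabuf_allocator_new(). */
    static GstFlowReturn push_dmabuf_frame(GstElement *appsrc_elem,
                                           GstAllocator *dmabuf_alloc,
                                           int dmabuf_fd, gsize frame_size)
    {
        /* Wrap the existing fd; the dmabuf allocator takes ownership of it. */
        GstMemory *mem = gst_dmabuf_allocator_alloc(dmabuf_alloc, dmabuf_fd,
                                                    frame_size);
        GstBuffer *buffer = gst_buffer_new();
        gst_buffer_append_memory(buffer, mem);

        /* appsrc takes ownership of the buffer and hands it downstream. */
        return gst_app_src_push_buffer(GST_APP_SRC(appsrc_elem), buffer);
    }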

    What is the streaming pipeline you are trying to implement here? Is it camera capture to display, or camera capture to filesink?

    BR,
    Sarabesh S.