AM62A7: v4l2h264enc0: failing with "Maybe be due to not enough memory or failing driver"

Part Number: AM62A7

Tool/software:

We have an opportunity for an AM62A7 design that involves performing encode on an imx462 camera (very similar to the imx290).

We have a setup running on edge AI SDK 10.0 with imx290.c patches to support imx462.

I am able to stream the sensor data to a connected HDMI display using the following pipeline:

gst-launch-1.0 -v v4l2src device=/dev/video3 io-mode=dmabuf-import ! \
  video/x-bayer, width=1920, height=1080, framerate=60/1, format=rggb10 ! \
  tiovxisp sink_0::device=/dev/v4l-subdev2 sensor-name="SENSOR_SONY_IMX219_RPI" \
  dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss_10b.bin sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a_10b.bin format-msb=9 ! \
  video/x-raw, format=NV12, width=1920, height=1080, framerate=60/1 ! fpsdisplaysink name=fpssink text-overlay=false video-sink="kmssink driver-name=tidss"

Now we are trying to encode and stream remotely (or to a file), but the command below, which is based on the SDK documentation, is failing:

root@mitysom-am62ax:~# gst-launch-1.0 v4l2src device=/dev/video3 io-mode=dmabuf-import ! \
  video/x-bayer,width=1920,height=1080, framerate=30/1, format=rggb10 ! \
  tiovxisp sensor-name=SENSOR_SONY_IMX219_RPI \
     dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss.bin \
     sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a.bin \
     sink_0::device=/dev/v4l-subdev2 format-msb=9 ! \
  video/x-raw,format=NV12 ! \
  v4l2h264enc output-io-mode=dmabuf-import ! \
  udpsink port=8081 host=10.0.103.185
APP: Init ... !!!
256986.587068 s: MEM: Init ... !!!
256986.587142 s: MEM: Initialized DMA HEAP (fd=9) !!!
256986.587316 s: MEM: Init ... Done !!!
256986.587331 s: IPC: Init ... !!!
256986.607655 s: IPC: Init ... Done !!!
REMOTE_SERVICE: Init ... !!!
REMOTE_SERVICE: Init ... Done !!!
256986.613127 s: GTC Frequency = 200 MHz
APP: Init ... Done !!!
256986.613277 s:  VX_ZONE_INFO: Globally Enabled VX_ZONE_ERROR
256986.613293 s:  VX_ZONE_INFO: Globally Enabled VX_ZONE_WARNING
256986.613305 s:  VX_ZONE_INFO: Globally Enabled VX_ZONE_INFO
256986.614699 s:  VX_ZONE_INFO: [tivxPlatformCreateTargetId:134] Added target MPU-0
256986.615101 s:  VX_ZONE_INFO: [tivxPlatformCreateTargetId:134] Added target MPU-1
256986.615452 s:  VX_ZONE_INFO: [tivxPlatformCreateTargetId:134] Added target MPU-2
256986.615800 s:  VX_ZONE_INFO: [tivxPlatformCreateTargetId:134] Added target MPU-3
256986.615852 s:  VX_ZONE_INFO: [tivxInitLocal:126] Initialization Done !!!
256986.615866 s:  VX_ZONE_INFO: Globally Disabled VX_ZONE_INFO
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
[ 5630.968268] imx290 1-001a: imx290_start_streaming
Setting pipeline to PLAYING ...
New clock: GstSystemClock
[ 5631.117923] imx290 1-001a: imx290_start_streaming : 0
Redistribute latency...
ERROR: from element /GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0: Failed to process frame.
Additional debug info:
/usr/src/debug/gstreamer1.0-plugins-good/1.22.12/sys/v4l2/gstv4l2videoenc.c(901): gst_v4l2_video_enc_handle_frame (): /GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0:
Maybe be due to not enough memory or failing driver
Execution ended after 0:00:00.238726669
Setting pipeline to NULL ...
Freeing pipeline ...
256986.998696 s:  VX_ZONE_WARNING: [vxReleaseContext:1275] Found a reference 0xffff800ce340 of type 0000080f at external count 1, internal count 0, releasing it
256986.998751 s:  VX_ZONE_WARNING: [vxReleaseContext:1277] Releasing reference (name=image_104) now as a part of garbage collection
256986.998777 s:  VX_ZONE_WARNING: [vxReleaseContext:1275] Found a reference 0xffff8014bcf0 of type 00000813 at external count 1, internal count 0, releasing it
256986.998792 s:  VX_ZONE_WARNING: [vxReleaseContext:1277] Releasing reference (name=object_array_106) now as a part of garbage collection
256986.999170 s:  VX_ZONE_WARNING: [vxReleaseContext:1275] Found a reference 0xffff8014bea0 of type 00000813 at external count 1, internal count 0, releasing it
256986.999187 s:  VX_ZONE_WARNING: [vxReleaseContext:1277] Releasing reference (name=object_array_108) now as a part of garbage collection
APP: Deinit ... !!!
REMOTE_SERVICE: Deinit ... !!!
REMOTE_SERVICE: Deinit ... Done !!!
256987.004787 s: IPC: Deinit ... !!!
256987.005464 s: IPC: DeInit ... Done !!!
256987.005521 s: MEM: Deinit ... !!!
256987.005786 s: DDR_SHARED_MEM: Alloc's: 25 alloc's of 38823759 bytes
256987.005836 s: DDR_SHARED_MEM: Free's : 25 free's  of 38823759 bytes
256987.005848 s: DDR_SHARED_MEM: Open's : 0 allocs  of 0 bytes
256987.005868 s: MEM: Deinit ... Done !!!
APP: Deinit ... Done !!!

Specifically, we are seeing:

ERROR: from element /GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0: Failed to process frame.
Additional debug info:
/usr/src/debug/gstreamer1.0-plugins-good/1.22.12/sys/v4l2/gstv4l2videoenc.c(901): gst_v4l2_video_enc_handle_frame (): /GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0:
Maybe be due to not enough memory or failing driver
Execution ended after 0:00:00.238726669

If I remove the "output-io-mode" property from v4l2h264enc, the pipeline runs but no output is generated (no network packets are sent with udpsink; only a header is written with filesink).

Can you help us debug?

Thanks.

Mike

  • Hi Mike,

    To route the encoded stream over the network via udpsink, you need to packetize it into RTP using the rtph264pay element. Can you add rtph264pay to your pipeline just before udpsink and try again?

    Best Regards,

    Suren

  • Hi Suren,

    Thanks for catching the pipeline error. However, I still get the same behavior.

    root@mitysom-am62ax:~# gst-launch-1.0 v4l2src device=/dev/video3 io-mode=dmabuf-import ! \
    >   video/x-bayer,width=1920,height=1080, framerate=30/1, format=rggb10 ! \
    >   tiovxisp sensor-name=SENSOR_SONY_IMX219_RPI \
    >      dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss.bin \
    >      sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a.bin \
    >      sink_0::device=/dev/v4l-subdev2 format-msb=9 ! \
    >   video/x-raw,format=NV12 ! \
    >   v4l2h264enc output-io-mode=dmabuf-import ! \
    >   rtph264pay ! \
    >   udpsink port=8081 host=10.0.103.185
    APP: Init ... !!!
     67111.732423 s: MEM: Init ... !!!
     67111.732492 s: MEM: Initialized DMA HEAP (fd=8) !!!
     67111.732666 s: MEM: Init ... Done !!!
     67111.732681 s: IPC: Init ... !!!
     67111.753465 s: IPC: Init ... Done !!!
    REMOTE_SERVICE: Init ... !!!
    REMOTE_SERVICE: Init ... Done !!!
     67111.758857 s: GTC Frequency = 200 MHz
    APP: Init ... Done !!!
     67111.759009 s:  VX_ZONE_INFO: Globally Enabled VX_ZONE_ERROR
     67111.759024 s:  VX_ZONE_INFO: Globally Enabled VX_ZONE_WARNING
     67111.759036 s:  VX_ZONE_INFO: Globally Enabled VX_ZONE_INFO
     67111.760481 s:  VX_ZONE_INFO: [tivxPlatformCreateTargetId:134] Added target MPU-0
     67111.760924 s:  VX_ZONE_INFO: [tivxPlatformCreateTargetId:134] Added target MPU-1
     67111.761314 s:  VX_ZONE_INFO: [tivxPlatformCreateTargetId:134] Added target MPU-2
     67111.761637 s:  VX_ZONE_INFO: [tivxPlatformCreateTargetId:134] Added target MPU-3
     67111.761682 s:  VX_ZONE_INFO: [tivxInitLocal:126] Initialization Done !!!
     67111.761761 s:  VX_ZONE_INFO: Globally Disabled VX_ZONE_INFO
    Setting pipeline to PAUSED ...
    Pipeline is live and does not need PREROLL ...
    Pipeline is PREROLLED ...
    Setting pipeline to PLAYING ...
    New clock: GstSystemClock
    [67102.656315] imx290 1-001a: imx290_start_streaming
    [67102.806344] imx290 1-001a: imx290_start_streaming : 0
    Redistribute latency...
    ERROR: from element /GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0: Failed to process frame.
    Additional debug info:
    /usr/src/debug/gstreamer1.0-plugins-good/1.22.12/sys/v4l2/gstv4l2videoenc.c(901): gst_v4l2_video_enc_handle_frame (): /GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0:
    Maybe be due to not enough memory or failing driver
    Execution ended after 0:00:00.235766241
    Setting pipeline to NULL ...
    Freeing pipeline ...
     67112.146393 s:  VX_ZONE_WARNING: [vxReleaseContext:1275] Found a reference 0xffff8951e340 of type 0000080f at external count 1, internal count 0, releasing it
     67112.146443 s:  VX_ZONE_WARNING: [vxReleaseContext:1277] Releasing reference (name=image_104) now as a part of garbage collection
     67112.146469 s:  VX_ZONE_WARNING: [vxReleaseContext:1275] Found a reference 0xffff8959bcf0 of type 00000813 at external count 1, internal count 0, releasing it
     67112.146484 s:  VX_ZONE_WARNING: [vxReleaseContext:1277] Releasing reference (name=object_array_106) now as a part of garbage collection
     67112.146890 s:  VX_ZONE_WARNING: [vxReleaseContext:1275] Found a reference 0xffff8959bea0 of type 00000813 at external count 1, internal count 0, releasing it
     67112.146908 s:  VX_ZONE_WARNING: [vxReleaseContext:1277] Releasing reference (name=object_array_108) now as a part of garbage collection
    APP: Deinit ... !!!
    REMOTE_SERVICE: Deinit ... !!!
    REMOTE_SERVICE: Deinit ... Done !!!
     67112.152413 s: IPC: Deinit ... !!!
     67112.153052 s: IPC: DeInit ... Done !!!
     67112.153103 s: MEM: Deinit ... !!!
     67112.153213 s: DDR_SHARED_MEM: Alloc's: 25 alloc's of 38823759 bytes
     67112.153233 s: DDR_SHARED_MEM: Free's : 25 free's  of 38823759 bytes
     67112.153245 s: DDR_SHARED_MEM: Open's : 0 allocs  of 0 bytes
     67112.153262 s: MEM: Deinit ... Done !!!
    APP: Deinit ... Done !!!
    

  • Hi Michael, 

    I see you are still using the IMX219 ISP tuning files. Are you testing with an IMX219?

    Can you share the output of media-ctl -p /dev/media0?

    Also, can you share the pipeline with which you are able to stream the camera capture to the display without the encode?

    Does the pipeline below work?

    gst-launch-1.0 -v v4l2src device=/dev/video3 io-mode=dmabuf-import ! \
      video/x-bayer, width=1920, height=1080, framerate=60/1, format=rggb10 ! \
      tiovxisp sink_0::device=/dev/v4l-subdev2 sensor-name="SENSOR_SONY_IMX219_RPI" \
      dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss_10b.bin sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a_10b.bin format-msb=9 ! \
      video/x-raw, format=NV12, width=1920, height=1080, framerate=60/1 ! queue ! \
      v4l2h264enc ! rtph264pay ! udpsink host=10.0.103.185 port=5000

    Best Regards,

    Suren

  • Hi Suren,

    I am using the IMX219 ISP tuning as a starting point. The IMX462 uses the same frame size and color format (rggb10) and is properly debayered when I present it on an attached display using the IMX219 ISP files. At the moment, the customer just wants to see encode performance, not ISP performance beyond basic debayering. We will tune the ISP if we win the opportunity.

    This is the output of media-ctl -p /dev/media0:

    Media controller API version 6.6.58
    
    Media device information
    ------------------------
    driver          j721e-csi2rx
    model           TI-CSI2RX
    serial
    bus info        platform:30102000.ticsi2rx
    hw revision     0x1
    driver version  6.6.58
    
    Device topology
    - entity 1: 30102000.ticsi2rx (7 pads, 7 links, 1 route)
                type V4L2 subdev subtype Unknown flags 0
                device node name /dev/v4l-subdev0
            routes:
                    0/0 -> 1/0 [ACTIVE]
            pad0: Sink
                    [stream:0 fmt:SRGGB10_1X10/1920x1080 field:none]
                    <- "cdns_csi2rx.30101000.csi-bridge":1 [ENABLED,IMMUTABLE]
            pad1: Source
                    [stream:0 fmt:SRGGB10_1X10/1920x1080 field:none]
                    -> "30102000.ticsi2rx context 0":0 [ENABLED,IMMUTABLE]
            pad2: Source
                    -> "30102000.ticsi2rx context 1":0 [ENABLED,IMMUTABLE]
            pad3: Source
                    -> "30102000.ticsi2rx context 2":0 [ENABLED,IMMUTABLE]
            pad4: Source
                    -> "30102000.ticsi2rx context 3":0 [ENABLED,IMMUTABLE]
            pad5: Source
                    -> "30102000.ticsi2rx context 4":0 [ENABLED,IMMUTABLE]
            pad6: Source
                    -> "30102000.ticsi2rx context 5":0 [ENABLED,IMMUTABLE]
    
    - entity 9: cdns_csi2rx.30101000.csi-bridge (5 pads, 2 links, 1 route)
                type V4L2 subdev subtype Unknown flags 0
                device node name /dev/v4l-subdev1
            routes:
                    0/0 -> 1/0 [ACTIVE]
            pad0: Sink
                    [stream:0 fmt:SRGGB10_1X10/1920x1080 field:none]
                    <- "imx462 1-001a":0 [ENABLED,IMMUTABLE]
            pad1: Source
                    [stream:0 fmt:SRGGB10_1X10/1920x1080 field:none]
                    -> "30102000.ticsi2rx":0 [ENABLED,IMMUTABLE]
            pad2: Source
            pad3: Source
            pad4: Source
    
    - entity 15: imx462 1-001a (1 pad, 1 link, 0 routes)
                 type V4L2 subdev subtype Sensor flags 0
                 device node name /dev/v4l-subdev2
            pad0: Source
                    [stream:0 fmt:SRGGB10_1X10/1920x1080 field:none colorspace:raw xfer:none ycbcr:601 quantization:full-range
                     crop.bounds:(0,0)/1945x1097
                     crop:(12,8)/1920x1080]
                    -> "cdns_csi2rx.30101000.csi-bridge":0 [ENABLED,IMMUTABLE]
    
    - entity 21: 30102000.ticsi2rx context 0 (1 pad, 1 link)
                 type Node subtype V4L flags 0
                 device node name /dev/video3
            pad0: Sink
                    <- "30102000.ticsi2rx":1 [ENABLED,IMMUTABLE]
    
    - entity 27: 30102000.ticsi2rx context 1 (1 pad, 1 link)
                 type Node subtype V4L flags 0
                 device node name /dev/video4
           pad0: Sink
                    <- "30102000.ticsi2rx":2 [ENABLED,IMMUTABLE]
    
    - entity 33: 30102000.ticsi2rx context 2 (1 pad, 1 link)
                 type Node subtype V4L flags 0
                 device node name /dev/video5
            pad0: Sink
                    <- "30102000.ticsi2rx":3 [ENABLED,IMMUTABLE]
    
    - entity 39: 30102000.ticsi2rx context 3 (1 pad, 1 link)
                 type Node subtype V4L flags 0
                 device node name /dev/video6
            pad0: Sink
                    <- "30102000.ticsi2rx":4 [ENABLED,IMMUTABLE]
    
    - entity 45: 30102000.ticsi2rx context 4 (1 pad, 1 link)
                 type Node subtype V4L flags 0
                 device node name /dev/video7
            pad0: Sink
                    <- "30102000.ticsi2rx":5 [ENABLED,IMMUTABLE]
    
    - entity 51: 30102000.ticsi2rx context 5 (1 pad, 1 link)
                 type Node subtype V4L flags 0
                 device node name /dev/video8
            pad0: Sink
                    <- "30102000.ticsi2rx":6 [ENABLED,IMMUTABLE]
    

    This pipeline streams images to a connected display at 1080p60 (using the DPI output through an HDMI converter chip):

    gst-launch-1.0 -v v4l2src device=/dev/video3 io-mode=dmabuf-import ! \
      video/x-bayer, width=1920, height=1080, framerate=60/1, format=rggb10 ! \
      tiovxisp sink_0::device=/dev/v4l-subdev2 sensor-name="SENSOR_SONY_IMX219_RPI" \
      dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss_10b.bin sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a_10b.bin format-msb=9 ! \
      video/x-raw, format=NV12, width=1920, height=1080, framerate=60/1 ! fpsdisplaysink name=fpssink text-overlay=false video-sink="kmssink driver-name=tidss"
    

    The pipeline you supplied runs, but no data/images appear in the remote media player.

    I changed the sink to a filesink, ran the capture for one minute, and hit CTRL-C. I got a 348 kB file that would not play in Ubuntu's video player ("empty file") or VLC (no images). I attached the file.

    gst-launch-1.0 -v v4l2src device=/dev/video3 io-mode=dmabuf-import ! \
      video/x-bayer, width=1920, height=1080, framerate=60/1, format=rggb10 ! \
      tiovxisp sink_0::device=/dev/v4l-subdev2 sensor-name="SENSOR_SONY_IMX219_RPI" \
      dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss_10b.bin sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a_10b.bin format-msb=9 ! \
      video/x-raw, format=NV12, width=1920, height=1080, framerate=60/1 ! queue ! \
      v4l2h264enc ! rtph264pay ! filesink location=test.mp4
    

    I am wondering if something is fundamentally broken with encode on my system. I also tried the pipeline below and copied the resulting file to a Linux machine and a Windows machine; neither would play it.

    gst-launch-1.0 videotestsrc num-buffers=200 ! \
      video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1 ! \
      v4l2h264enc ! \
      rtph264pay ! \
      filesink location=/tmp/foo.mp4
    

    We are using SDK 10.1. I am also noticing some messages related to the video codec in dmesg that might indicate a problem:

    root@mitysom-am62ax:~# dmesg | grep dec
    ...
    [ 1664.205264] vdec 30210000.video-codec: Runtime PM usage count underflow!
    [ 1664.301382] vdec 30210000.video-codec: error -ENXIO: IRQ index 0 not found
    [ 1664.308385] vdec 30210000.video-codec: failed to get irq resource, falling back to polling
    [ 1664.316982] vdec 30210000.video-codec: OPP table not found in device tree
    [ 1664.327788] vdec 30210000.video-codec: Added wave5 driver with caps: 'ENCODE' 'DECODE'
    [ 1664.335777] vdec 30210000.video-codec: Product Code:      0x521c
    [ 1664.341890] vdec 30210000.video-codec: Firmware Revision: 334314
    
    

    With regards,

    Mike

  • Hi Mike,

    rtph264pay is only needed when streaming the encoded video over the network with udpsink.

    In the two pipelines above, you are using rtph264pay and then dumping the RTP packets into a filesink; RTP packets written directly to a file are not a playable container, which is why your players reject the files.

    Instead, can you run a simple pipeline like the ones below and see if you are able to play the encoded file?

    gst-launch-1.0 videotestsrc num-buffers=200 ! \
    video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1 ! \
    v4l2h264enc ! filesink location=test-video.h264

    or 

    gst-launch-1.0 videotestsrc num-buffers=200 ! \
    video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1 ! \
    v4l2h264enc ! mp4mux ! filesink location=test.mp4

    If this works, then replace the videotestsrc with v4l2src (your camera source) and see if you are able to encode.

    Best Regards,

    Suren

  • Hi Suren,

    Thank you for pointing that out. The first pipeline encodes, and I can play the result with VLC. The second fails with a "WARNING: erroneous pipeline: could not link v4l2h264enc0 to mp4mux0" error.

    So I ran this pipeline and got a playable file.

    gst-launch-1.0 v4l2src device=/dev/video3 io-mode=dmabuf-import num-buffers=200 ! \
      video/x-bayer,width=1920,height=1080, framerate=30/1, format=rggb10 ! \
      tiovxisp sensor-name=SENSOR_SONY_IMX219_RPI \
         dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss.bin \
         sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a.bin \
         sink_0::device=/dev/v4l-subdev2 format-msb=9 ! \
      video/x-raw,format=NV12 ! \
      v4l2h264enc ! \
      filesink location=test_cam.h264
    

    I will need to figure out why streaming is not working; the only difference is adding rtph264pay and sending to udpsink. Perhaps I am suffering packet loss or something.
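
    For reference, a receiver-side pipeline on the remote host (10.0.103.185 above) might look like the sketch below. The RTP caps string is an assumption that must match what rtph264pay actually sends, and avdec_h264 assumes the GStreamer libav plugin is installed on the receiving machine:

    ```shell
    # Run on the receiving host named in udpsink. The caps describe the
    # incoming RTP stream; payload=96 is rtph264pay's default dynamic type.
    gst-launch-1.0 udpsrc port=8081 \
        caps="application/x-rtp,media=video,encoding-name=H264,clock-rate=90000,payload=96" ! \
      rtph264depay ! h264parse ! avdec_h264 ! autovideosink sync=false
    ```

    If this receiver shows video, the sender side is fine and any remaining problem is on the network path.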

    Is there a way to set the encoder bitrate, and to display whether any frames are dropped entering the encode stage? Our customer wants evidence that the encoder can keep up with the data and will not drop frames or present holes in the video stream. If this information is somewhere on the Academy pages, I can read through those.

    Thank you again,

    Mike

     

  • Mike,

    To use mp4mux, you need to add an h264parse element before it, something like this:

    v4l2h264enc ! queue ! h264parse ! mp4mux ! filesink location=test.mp4
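
    Note that mp4mux also needs a clean EOS to finalize the file index, so when you stop a live capture with CTRL-C you should launch with the -e option; without it the resulting .mp4 is typically unplayable. A complete sketch based on the test pipeline above:

    ```shell
    # -e sends EOS downstream on CTRL-C so mp4mux can write the MP4 index
    # before the file is closed.
    gst-launch-1.0 -e videotestsrc num-buffers=200 ! \
      video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1 ! \
      v4l2h264enc ! h264parse ! mp4mux ! filesink location=test.mp4
    ```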

    Also, since you have validated that encode is functional with the camera source, the remaining streaming issue is likely on the network side; please confirm the destination IP address and port are not being blocked by your IT security or firewall restrictions.

    We have validated video streaming in various applications and have not seen any performance degradation with encoding. Please refer to my application note that I have published here

    Best Regards,

    Suren