Hello,
I have a couple of questions about using the vpe and ducatih264enc elements together in GStreamer on the AM5728 processor with the TI Processor SDK Linux distribution. I'm trying to build a pipeline that pulls raw video from a USB 3.0 camera, uses the vpe element to convert it from YUYV to NV12, and encodes it with the Ducati H.264 hardware encoder. I'm having trouble interfacing with the vpe element, specifically with DMA buffer allocation.
The kernel module appears to load correctly, so the issue seems to be in my userspace implementation. My ultimate goal is to use appsrc with a camera that has a custom API, but getting a pipeline working with v4l2src or videotestsrc would help as well. Is there documentation or sample code showing how to properly allocate and share buffers between my source, the VPE, and the encoder?
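For reference, my appsrc code currently pushes frames roughly like this (simplified; `camera_get_frame` stands in for my camera's custom API, and the buffer is plain system memory rather than a DMA buffer, which I suspect is part of the problem):

```c
/* Simplified sketch of my appsrc push path. camera_get_frame() is a
 * placeholder for the camera vendor's custom API. */
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

static void
push_frame (GstAppSrc *appsrc, const guint8 *data, gsize size)
{
  GstBuffer *buf;

  /* Copy the YUYV frame into a newly allocated GStreamer buffer.
   * This is ordinary malloc'd memory, not a dmabuf, which may be why
   * vpe reports "does not support buffers not allocated by self". */
  buf = gst_buffer_new_allocate (NULL, size, NULL);
  gst_buffer_fill (buf, 0, data, size);

  /* appsrc takes ownership of the buffer. */
  if (gst_app_src_push_buffer (appsrc, buf) != GST_FLOW_OK)
    g_printerr ("push_buffer failed\n");
}
```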
My pipeline looks roughly like this:
Source (v4l2/app/videotest) ! capsfilter YUYV 1920x1080@30p ! vpe ! capsfilter NV12 1920x1080@30p ! ducatih264enc ! h264parse ! rtph264pay ! udpsink
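As a concrete example, the v4l2src variant I've been testing looks like the following. The device path, host address, and io-mode value are placeholders for my setup; note that V4L2's YUYV corresponds to YUY2 in GStreamer caps:

```shell
# Hypothetical gst-launch-1.0 form of the pipeline above.
# /dev/video1 and 192.168.1.100 are placeholders.
gst-launch-1.0 -v \
  v4l2src device=/dev/video1 io-mode=dmabuf ! \
  'video/x-raw, format=YUY2, width=1920, height=1080, framerate=30/1' ! \
  vpe ! \
  'video/x-raw, format=NV12, width=1920, height=1080, framerate=30/1' ! \
  ducatih264enc ! \
  h264parse ! \
  rtph264pay ! \
  udpsink host=192.168.1.100 port=5000
```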
The errors this returns are generally related to DMA buffer handling, so I believe that's where my problem lies. Depending on the options I pass to vpe and the source, I get errors such as "basesrc internal data flow error", "vpebufferpool alloc function failed", or "gstvpe this plugin does not support buffers not allocated by self".
I haven't found any sample pipelines or code that use the VPE together with a raw video source and the Ducati encoder, so any information on that would be appreciated. Please let me know if you have anything that could help me solve this.
Thank you,
Lucas