
How to push frames through kmssink to an HDMI display connected to the board using cv2 and Python

Other Parts Discussed in Thread: TDA4VM

Hi,

I am using TDA4VM (J721E) board with SDK 09_00_00_00.

I can read the video stream from a USB camera with GStreamer in Python using OpenCV, and I run inference on those frames with my custom compiled models. Now I want to show the results on a display connected to the board's HDMI port. I can get frames onto the display, but it is not real time: there is a big delay on each frame while the frame is being copied into the buffer. I am attaching my Python script. Kindly let me know the reason for the delay and a possible solution.
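For scale, each 1280x720 RGB frame that the loop below copies into a Gst.Buffer is about 2.6 MiB, so at an assumed 30 fps the Python loop is moving roughly 79 MiB/s (simple arithmetic on the frame size used in the script):

```python
# Per-frame data volume for the RGB frames pushed into appsrc below.
width, height = 1280, 720
bytes_per_frame = width * height * 3            # RGB: 3 bytes per pixel
print(bytes_per_frame)                          # 2764800, ~2.6 MiB

mib_per_second = bytes_per_frame * 30 / 2**20   # at an assumed 30 fps
print(round(mib_per_second, 1))                 # 79.1
```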

import cv2
import gi
import numpy as np
import time

gi.require_version("Gst", "1.0")
from gi.repository import Gst, GObject, GLib

Gst.init(None)
pipeline_str = ( "appsrc name=source ! tiovxdlcolorconvert ! video/x-raw,format=NV12 ! kmssink driver-name=tidss sync=true")
pipeline = Gst.parse_launch(pipeline_str)
source = pipeline.get_by_name("source")
pipeline.set_state(Gst.State.PLAYING)

##### gst input pipeline
gst_pipeline = 'v4l2src device=/dev/video-usb-cam0 io-mode=2 ! image/jpeg, width=1280, height=720 ! jpegdec ! tiovxdlcolorconvert ! video/x-raw, format=NV12 ! \
tiovxmultiscaler ! video/x-raw, format=NV12 ! tiovxdlcolorconvert ! video/x-raw, format=RGB ! videoconvert ! video/x-raw, format=BGR ! queue ! appsink'

cap = cv2.VideoCapture(gst_pipeline, cv2.CAP_GSTREAMER)

raw_frame_len = 1280 * 720

try:
    while True:
        ret, frame = cap.read()
        if not ret:
            break

        raw_frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        raw_data = raw_frame.flatten()

        # Create a GStreamer buffer with raw data
        gst_buffer = Gst.Buffer.new_allocate(None, len(raw_data), None)
        ########################################### works with delay
        gst_buffer.fill(0, raw_data)
        ############################################################
        caps = Gst.caps_from_string(f"video/x-raw,format=RGB,width={frame.shape[1]},height={frame.shape[0]}")
        source.set_property("caps", caps)
        source.emit("push-buffer", gst_buffer)

except KeyboardInterrupt:
    pass
finally:
    # Stop the pipeline and clean up
    pipeline.set_state(Gst.State.NULL)
    cap.release()
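For reference, here is a sketch (untested on the board) of an alternative output path that hands the whole sink pipeline to OpenCV via cv2.VideoWriter, so caps are negotiated once at open time instead of being set on every frame; it assumes the SDK's OpenCV is built with GStreamer support, and the element names follow the script above:

```python
# Sketch: drive kmssink through cv2.VideoWriter instead of a hand-managed
# appsrc. OpenCV pushes BGR frames, so no per-frame cvtColor/flatten/caps.
width, height, fps = 1280, 720, 30  # match the capture pipeline above

out_pipeline = (
    "appsrc ! video/x-raw,format=BGR ! "
    "videoconvert ! video/x-raw,format=NV12 ! "
    "kmssink driver-name=tidss sync=false"
)
print(out_pipeline)

# Assuming cv2 has the GStreamer backend (fourcc 0 = raw frames):
# writer = cv2.VideoWriter(out_pipeline, cv2.CAP_GSTREAMER, 0, fps, (width, height))
# while True:
#     ret, frame = cap.read()   # BGR frame from the capture pipeline
#     if not ret:
#         break
#     writer.write(frame)
```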




Thanks and regards,
Sourabh