Hi,
I am trying to stream H.265 encoded video from a camera (imx219) over RTP to a multicast group. When I start the client before the server, things work; when I start the client after the server, I get a green picture.
I think this is because the client can't decode the stream.
I use the following GStreamer pipeline to record from the camera and stream to the network:
MULTICASTADDR=224.1.1.1

media-ctl -V '"imx219 4-0010":0 [fmt:SRGGB10_1X10/1920x1080 field:none]'

gst-launch-1.0 -v v4l2src device=/dev/video3 io-mode=dmabuf-import \
  ! video/x-bayer, width=1920, height=1080, framerate=30/1, format=rggb10 \
  ! tiovxisp sink_0::device=/dev/v4l-subdev2 sensor-name="SENSOR_SONY_IMX219_RPI" \
      dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss_10b.bin \
      sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a_10b.bin format-msb=9 \
  ! video/x-raw, format=NV12, width=1920, height=1080, framerate=30/1 \
  ! v4l2h265enc \
  ! rtph265pay config-interval=1 pt=96 \
  ! udpsink host=${MULTICASTADDR} auto-multicast=true port=5000
On a client, I use the following pipeline to receive the RTP stream, decode it, and write the frames out as JPEG files:
MULTICASTADDR=224.1.1.1

gst-launch-1.0 -v udpsrc multicast-group=${MULTICASTADDR} auto-multicast=true \
      multicast-iface=enx0c37960132b8 port=5000 \
      caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H265, payload=(int)96" \
  ! rtpjitterbuffer latency=50 \
  ! rtph265depay \
  ! h265parse \
  ! avdec_h265 \
  ! video/x-raw,framerate=30/1 \
  ! jpegenc \
  ! multifilesink location="frame%08d.jpg"
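For reference, here is a hardware-independent sketch of the same topology that could help narrow things down, assuming x265enc and the standard plugins are available (videotestsrc and x265enc stand in for the camera/ISP front end and v4l2h265enc; the key-int-max=30 setting is only an illustration of forcing a periodic keyframe, not something from my pipeline above):

gst-launch-1.0 -v videotestsrc is-live=true \
  ! video/x-raw, format=NV12, width=1920, height=1080, framerate=30/1 \
  ! x265enc key-int-max=30 \
  ! h265parse \
  ! rtph265pay config-interval=1 pt=96 \
  ! udpsink host=224.1.1.1 auto-multicast=true port=5000

If a late-started client recovers with this sender but not with the camera pipeline, that would point at the v4l2h265enc keyframe interval rather than the RTP path.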
Any idea how to make it so that a client can be started after the server? I'm guessing it has to do with the H.265 encoding: the server doesn't periodically send something that the client needs in order to start decoding mid-stream.
Any help would be appreciated.
Regards,
Bas Vermeulen