I am trying to get started streaming from a camera input to display on my host machine. Are there any steps to get started?
From the perspective of the Linux SDK, here are the steps to stream camera input either to an external display monitor or to a host machine.

To display the camera feed on a monitor connected to the board:
gst-launch-1.0 v4l2src device=/dev/video2 ! image/jpeg, width=1280, height=720 ! jpegparse ! jpegdec ! video/x-raw ! videoconvert ! kmssink driver-name=tidss plane-id=41 sync=false
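Before running the pipeline, it helps to confirm what the camera actually produces. A sketch using v4l2-ctl (from the v4l-utils package); the device node /dev/video2 matches the command above, so adjust it to your camera:

```shell
# Probe the capture device before building a pipeline; the guards keep this
# safe on machines without the camera or without v4l-utils installed.
DEV=/dev/video2
if [ -c "$DEV" ] && command -v v4l2-ctl >/dev/null 2>&1; then
    # Lists pixel formats (e.g. MJPG) and the supported resolutions/framerates.
    v4l2-ctl --device="$DEV" --list-formats-ext
else
    echo "skipping probe: $DEV or v4l2-ctl not available"
fi
```

The reported pixel format and resolutions are what should go into the caps (`image/jpeg, width=..., height=...`) of the gst-launch pipelines.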
To stream to the host machine instead, first create an .sdp file on the host (for example 8887.sdp) describing the incoming stream:

m=video 8887 RTP/AVP 96
a=rtpmap:96 H264/90000
c=IN IP4 127.0.0.1
a=framerate:30
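The SDP description can be written to a file from the shell; the filename 8887.sdp here matches the ffplay command used later in this thread:

```shell
# Write the SDP description used by the receiver to a file.
# Port 8887 and the H264/90000 payload map come from the thread above;
# the c= address is the loopback used in the example -- replace it for a
# remote host if the stream is received on a different interface.
cat > 8887.sdp <<'EOF'
m=video 8887 RTP/AVP 96
a=rtpmap:96 H264/90000
c=IN IP4 127.0.0.1
a=framerate:30
EOF
```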
Then run the streaming pipeline on the board:

gst-launch-1.0 v4l2src device=/dev/video2 ! image/jpeg, width=1280, height=720 ! jpegparse ! jpegdec ! video/x-raw ! videoconvert ! v4l2h264enc extra-controls="enc,prepend_sps_and_pps_to_idr=1,video_gop_size=5" ! rtph264pay ! udpsink host=<yourip> port=<sdp port specified>

Modify the pipeline according to your camera's resolution, input format, and the results of device probing. My particular camera captures 1280x720 JPEG images, so I decode them to raw video and then encode to H.264 in the pipeline. Additionally, specify the host IP and the SDP port from the .sdp file created earlier.
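Since the resolution, device node, and destination all vary per setup, the command can be parameterized so changes stay in one place. A hypothetical helper (the function name, argument order, and the example IP 192.168.0.10 are assumptions for illustration, not from the SDK):

```shell
# Hypothetical helper: assemble the camera-to-UDP streaming command from
# parameters instead of editing the pipeline string by hand.
build_stream_cmd() {
    dev="$1"; w="$2"; h="$3"; host="$4"; port="$5"
    printf '%s' "gst-launch-1.0 v4l2src device=$dev" \
        " ! image/jpeg, width=$w, height=$h ! jpegparse ! jpegdec" \
        " ! video/x-raw ! videoconvert" \
        " ! v4l2h264enc extra-controls=\"enc,prepend_sps_and_pps_to_idr=1,video_gop_size=5\"" \
        " ! rtph264pay ! udpsink host=$host port=$port"
}

# Example: 1280x720 camera, placeholder host IP, port 8887 matching the .sdp.
build_stream_cmd /dev/video2 1280 720 192.168.0.10 8887
echo
```

Running the printed command (e.g. via `eval`) is equivalent to typing the pipeline above with those values filled in.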
Finally, on the host machine, play the stream using the .sdp file:

ffplay -protocol_whitelist "file,rtp,udp" -i 8887.sdp
If the peripherals are not being probed by the board, or if you are running into boot problems, here are some debug steps: