AM625: Need help with streaming camera output on display.

Part Number: AM625

Hello TI experts,

I'm using an OV5647 camera with the AM625, and I can now capture frames in SBGGR10 format.
To view those captured raw frames, I first convert them to 8-bit raw Bayer data, and then convert that to .mkv (for video) or .png (for images).
1. To convert from SBGGR10 to 8-bit raw Bayer, I use the following Python script:

import numpy as np

# Read the 10-bit raw data
with open('capture', 'rb') as f:
    raw_data = np.fromfile(f, dtype=np.uint16)

# Scale the 10-bit data to 8-bit by dividing by 4 (right shift by 2) and applying a brightness factor
# Adjust brightness_factor as needed
brightness_factor = 15
raw_data_scaled = np.clip((raw_data >> 2) * brightness_factor, 0, 255).astype(np.uint8)

# Save the adjusted 8-bit data
raw_data_scaled.tofile('video_8bit.raw')


2. To convert the bayer_bggr8 data to rgb24, I've used this command:
ffmpeg -f rawvideo -pix_fmt bayer_bggr8 -s 640x480 -i video_8bit.raw -vf format=rgb24 -c:v ffv1 -level 3 test.mkv
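
These two steps can be chained in a small shell script so they don't have to be run by hand each time. A rough sketch (convert_sbggr10_to_8bit.py is just a placeholder name for the Python script above saved to a file):

#!/bin/sh
# Sketch: chain the two conversion steps above; file names are examples
# Step 1: SBGGR10 samples (stored in 16-bit words) -> 8-bit Bayer, via the Python script above
python3 convert_sbggr10_to_8bit.py

# Step 2: debayer bayer_bggr8 to rgb24 and wrap it in a lossless FFV1 .mkv
ffmpeg -f rawvideo -pix_fmt bayer_bggr8 -s 640x480 -i video_8bit.raw \
       -vf format=rgb24 -c:v ffv1 -level 3 test.mkv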

So, using these two parts I can view the captured frames, but the process is manual.

Please let me know how to automate this process and directly display the camera output on the HDMI display in rgb24 format.

Best Regards,
Aditya T

  • Hello Aditya,

    Since the AM625 doesn't have an ISP, why are you using the OV5647? I would recommend using the OV5640 and avoiding all those conversions.

    Regards,

    Jianzhong

  • Hello,

    I know, but the OV5647 is what is connected in my hardware and there is no other option; the only two choices are OV5647 or OV2740, so I have to achieve this with both.
    Also, in my end application I have to stream the camera output in a browser, like using the camera for Google Meet.
    So, if there's any way to stream the camera data to the display while capturing, it would help me.

    Thanks & Regards,
    Aditya T

  • It is not recommended to use a sensor without an ISP on AM62x. Using the CPU to do raw image processing would not give you good performance or good image quality.

  • Well, I know that, but we have still decided to use it.

    So if you can help me with live streaming the data over the internet, for something like Google Meet, it would help me a lot.

    Best regards,

    Aditya T 

  • Hi Aditya,

    As Jianzhong mentioned, on AM625 these operations would be CPU intensive, and you would see a lot of performance issues in terms of quality, latency, etc.

    But here is one pipeline that you can try out. The pipeline below captures from the camera (OV5640), streams the capture to the network, and also renders it to the display.

    gst-launch-1.0 -v v4l2src device=/dev/video2 ! video/x-raw, width=1920, height=1080, format=UYVY, framerate=30/1 ! \
    tee name=tee_split0 \
    tee_split0. ! queue ! tiscaler name=split_01 \
    tee_split0. ! queue ! tiscaler name=split_02 \
    split_01. ! queue ! video/x-raw, width=1920, height=1080 ! x264enc ! rtph264pay ! udpsink host=<ip addr> port=5000 \
    split_02. ! queue ! video/x-raw, width=1280, height=720 ! kmssink driver-name=tidss sync=false

    Hope this helps.

    Best Regards,

    Suren

  • Hello Suren,

    As far as I know, the AM625 doesn't support x264enc. Could you please confirm this?
    Also, after running the command with the network-streaming part removed, I got the output below:

    root@am62xx-evm:~# GST_DEBUG=3 gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=4 ! video/x-bayer, width=640, height=480, format=bggr, framerate=60/1 ! bayer2rgb ! videoconvert ! video/x-raw, format=NV12, width=640, height=480, framerate=60/1 ! tee name=tee_split0 tee_split0. ! queue ! tiscaler ! video/x-raw, format=NV12, width=640, height=480, framerate=60/1 ! queue ! kmssink driver-name=tidss sync=false
    Setting pipeline to PAUSED ...
    Pipeline is live and does not need PREROLL ...
    /GstPipeline:pipeline0/GstKMSSink:kmssink0: display-width = 1366
    /GstPipeline:pipeline0/GstKMSSink:kmssink0: display-height = 768
    Pipeline is PREROLLED ...
    Setting pipeline to PLAYING ...
    New clock: GstSystemClock
    /GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-bayer, width=(int)640, height=(int)480, format=(string)bggr, framerate=(fraction)60/1, interlace-mode=(string)progressive, colorimetry=(string)sRGB
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-bayer, width=(int)640, height=(int)480, format=(string)bggr, framerate=(fraction)60/1, interlace-mode=(string)progressive, colorimetry=(string)sRGB
    /GstPipeline:pipeline0/GstBayer2RGB:bayer2rgb0.GstPad:src: caps = video/x-raw, width=(int)640, height=(int)480, framerate=(fraction)60/1, interlace-mode=(string)progressive, colorimetry=(string)sRGB, format=(string)RGBx
    /GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, width=(int)640, height=(int)480, framerate=(fraction)60/1, interlace-mode=(string)progressive, format=(string)NV12, colorimetry=(string)2:4:7:1
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw, width=(int)640, height=(int)480, framerate=(fraction)60/1, interlace-mode=(string)progressive, format=(string)NV12, colorimetry=(string)2:4:7:1
    /GstPipeline:pipeline0/GstTee:tee_split0.GstTeePad:src_0: caps = video/x-raw, width=(int)640, height=(int)480, framerate=(fraction)60/1, interlace-mode=(string)progressive, format=(string)NV12, colorimetry=(string)2:4:7:1
    /GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-raw, width=(int)640, height=(int)480, framerate=(fraction)60/1, interlace-mode=(string)progressive, format=(string)NV12, colorimetry=(string)2:4:7:1
    /GstPipeline:pipeline0/GstTee:tee_split0.GstPad:sink: caps = video/x-raw, width=(int)640, height=(int)480, framerate=(fraction)60/1, interlace-mode=(string)progressive, format=(string)NV12, colorimetry=(string)2:4:7:1
    /GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-raw, width=(int)640, height=(int)480, framerate=(fraction)60/1, interlace-mode=(string)progressive, format=(string)NV12, colorimetry=(string)2:4:7:1
    /GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-raw, width=(int)640, height=(int)480, framerate=(fraction)60/1, interlace-mode=(string)progressive, format=(string)NV12, colorimetry=(string)2:4:7:1
    /GstPipeline:pipeline0/GstTIScaler:tiscaler0.GstPad:src: caps = video/x-raw, format=(string)NV12, width=(int)640, height=(int)480, framerate=(fraction)60/1
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:src: caps = video/x-raw, format=(string)NV12, width=(int)640, height=(int)480, framerate=(fraction)60/1
    /GstPipeline:pipeline0/GstQueue:queue1.GstPad:sink: caps = video/x-raw, format=(string)NV12, width=(int)640, height=(int)480, framerate=(fraction)60/1
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:sink: caps = video/x-raw, format=(string)NV12, width=(int)640, height=(int)480, framerate=(fraction)60/1
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:sink: caps = video/x-raw, format=(string)NV12, width=(int)640, height=(int)480, framerate=(fraction)60/1
    /GstPipeline:pipeline0/GstKMSSink:kmssink0.GstPad:sink: caps = video/x-raw, format=(string)NV12, width=(int)640, height=(int)480, framerate=(fraction)60/1
    /GstPipeline:pipeline0/GstTIScaler:tiscaler0.GstPad:sink: caps = video/x-raw, width=(int)640, height=(int)480, framerate=(fraction)60/1, interlace-mode=(string)progressive, format=(string)NV12, colorimetry=(string)2:4:7:1
    /GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, width=(int)640, height=(int)480, framerate=(fraction)60/1, interlace-mode=(string)progressive, colorimetry=(string)sRGB, format=(string)RGBx
    /GstPipeline:pipeline0/GstBayer2RGB:bayer2rgb0.GstPad:sink: caps = video/x-bayer, width=(int)640, height=(int)480, format=(string)bggr, framerate=(fraction)60/1, interlace-mode=(string)progressive, colorimetry=(string)sRGB
    [  421.431104] v4l2_get_link_freq: Link frequency estimated using pixel rate: result might be inaccurate
    [  421.442435] v4l2_get_link_freq: Consider implementing support for V4L2_CID_LINK_FREQ in the transmitter driver
    [  421.457995] csi2rx_configure_external_dphy: Link frequency is 137500000
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-bayer, width=(int)640, height=(int)480, format=(string)bggr, framerate=(fraction)60/1, interlace-mode=(string)progressive, colorimetry=(string)sRGB
    0:00:00.164498695  2322     0x28092640 WARN              video-info video-info.c:191:validate_colorimetry: color matrix RGB is only supported with RGB format, ENCODED is not
    0:00:00.498098655  2322     0x28092640 WARN              video-info video-info.c:515:gst_video_info_from_caps: invalid colorimetry, using default
    0:00:00.498401800  2322     0x28092640 WARN              video-info video-info.c:191:validate_colorimetry: color matrix RGB is only supported with RGB format, ENCODED is not
    0:00:00.498429395  2322     0x28092640 WARN              video-info video-info.c:515:gst_video_info_from_caps: invalid colorimetry, using default
    0:00:00.499164700  2322     0x28092640 WARN              video-info video-info.c:191:validate_colorimetry: color matrix RGB is only supported with RGB format, ENCODED is not
    0:00:00.499215585  2322     0x28092640 WARN              video-info video-info.c:515:gst_video_info_from_caps: invalid colorimetry, using default
    0:00:00.499329595  2322     0x28092640 WARN              video-info video-info.c:191:validate_colorimetry: color matrix RGB is only supported with RGB format, ENCODED is not
    0:00:00.499351555  2322     0x28092640 WARN              video-info video-info.c:515:gst_video_info_from_caps: invalid colorimetry, using default
    0:00:00.499415045  2322     0x28092640 WARN              video-info video-info.c:191:validate_colorimetry: color matrix RGB is only supported with RGB format, ENCODED is not
    0:00:00.499435470  2322     0x28092640 WARN              video-info video-info.c:515:gst_video_info_from_caps: invalid colorimetry, using default
    0:00:00.502781725  2322     0x28092640 WARN          v4l2bufferpool gstv4l2bufferpool.c:855:gst_v4l2_buffer_pool_start:<v4l2src0:pool0:src> Uncertain or not enough buffers, enabling copy threshold
    ^Chandling interrupt.
    Interrupt: Stopping pipeline ...
    Execution ended after 0:03:38.178326005
    Setting pipeline to NULL ...
    Freeing pipeline ...
    root@am62xx-evm:~# 
    
    

    Here I had to interrupt the command because it was not rendering the captured data on the display.

    Kind Regards,
    Aditya T 

  • Hi Aditya,

    Instead of streaming to kmssink, can you try running the RGB output to waylandsink directly? Does that work? Also, if the resolution is 640x480, why would you want to use tiscaler?

    Best Regards,

    Suren

  • Hello,

    Instead of streaming to kmssink, can you try running the RGB output to waylandsink directly? Does that work?

    Well, I've tried both waylandsink and kmssink; both of them execute, but nothing is displayed on the HDMI display.

    Also, if the resolution is 640x480, why would you want to use tiscaler?

    Actually, I don't know why tiscaler is used, so I didn't remove it from the pipeline. Also, the resolution I'm using is supported directly by the camera sensor.

    The sensor I'm using is the OV5647, which provides 640×480 raw frames in SBGGR10 format by default.

    Kind Regards,

    Aditya T 

  • Hi Aditya,

    Can you dump the stream from the camera using the v4l2-ctl command and share the file, for --stream-count=10 or 20 frames?

    A simple pipeline to test would be:

    camera -> bayer2rgb -> display. Add a videoconvert before the display sink if it is not rendering on the display.
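
    For example, something along these lines (just a sketch; adjust the device node, caps and sink to your setup):

    gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-bayer, width=640, height=480, format=bggr ! bayer2rgb ! videoconvert ! kmssink driver-name=tidss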

    Best Regards,

    Suren

  • Hello Suren,

    Can you dump the stream from the camera using the v4l2-ctl command and share the file, for --stream-count=10 or 20 frames?

    The command I used to capture frames with v4l2-ctl is:

    root@am62xx-evm:~# v4l2-ctl -d0 --stream-mmap -v width=640,height=480,pixelformat=BG10 --stream-count=20 --stream-to=frames.raw
    [ 3620.595692] v4l2_get_link_freq: Link frequency estimated using pixel rate: result might be inaccurate
    [ 3620.605045] v4l2_get_link_freq: Consider implementing support for V4L2_CID_LINK_FREQ in the transmitter driver
    [ 3620.615051] csi2rx_configure_external_dphy: Link frequency is 137500000
    <<<<<<<<<<<<<<<<<<<<
    root@am62xx-evm:~# 


    I've attached the output file: Link

    A simple pipeline to test would be:

    camera -> bayer2rgb -> display. Add a videoconvert before the display sink if it is not rendering on the display.

    I've used exactly this pipeline, but I'm unable to render the camera data on the display. The commands I've used are:

    root@am62xx-evm:~# gst-launch-1.0 v4l2src device="/dev/video0" ! video/x-bayer, width=640, height=480, format=bggr ! bayer2rgb ! videoconvert ! video/x-raw, format=RGB ! autovideosink
    Setting pipeline to PAUSED ...
    Pipeline is live and does not need PREROLL ...
    Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayWayland\)\ gldisplaywayland0";
    Pipeline is PREROLLED ...
    Setting pipeline to PLAYING ...
    New clock: GstSystemClock
    [ 4012.993353] v4l2_get_link_freq: Link frequency estimated using pixel rate: result might be inaccurate
    [ 4013.002655] v4l2_get_link_freq: Consider implementing support for V4L2_CID_LINK_FREQ in the transmitter driver
    [ 4013.012729] csi2rx_configure_external_dphy: Link frequency is 137500000
    ^Chandling interrupt.
    Interrupt: Stopping pipeline ...
    Execution ended after 0:00:07.878785396
    Setting pipeline to NULL ...
    Freeing pipeline ...
    
    
    root@am62xx-evm:~# gst-launch-1.0 v4l2src device="/dev/video0" ! video/x-bayer, width=640, height=480, format=bggr ! bayer2rgb ! videoconvert ! video/x-raw, format=RGB ! waylandsink sync=false 
    Setting pipeline to PAUSED ...
    Pipeline is live and does not need PREROLL ...
    Pipeline is PREROLLED ...
    Setting pipeline to PLAYING ...
    New clock: GstSystemClock
    [ 4084.606715] v4l2_get_link_freq: Link frequency estimated using pixel rate: result might be inaccurate
    [ 4084.615992] v4l2_get_link_freq: Consider implementing support for V4L2_CID_LINK_FREQ in the transmitter driver
    [ 4084.626007] csi2rx_configure_external_dphy: Link frequency is 137500000
    ^Chandling interrupt.
    Interrupt: Stopping pipeline ...
    Execution ended after 0:00:07.073372142
    Setting pipeline to NULL ...
    Freeing pipeline ...
    root@am62xx-evm:~# 


    I think the pipeline is OK, but the camera data is still not rendering on the display. Please help me with this.

    Best Regards,
    Aditya T

  • Hello Suren,

    Can you help me with how to switch the sensor output from 10-bit raw to 8-bit raw on its way to the processor?

    Best Regards,
    Aditya T

  • Hi Aditya,

    There are some patches that were posted to convert 10-bit to 8-bit. Let me follow up with the SW team and get back to you early next week.

    Best Regards,

    Suren

  • Sure sir

    Kind Regards,
    Aditya T

  • Hello Suren,

    Is there any update on how to capture 8-bit RAW images using OV5647?

    Kind Regards,

    Aditya T 

  • Hello,

    Please provide an update on the status of the patches to convert 10-bit raw to 8-bit raw for the OV5647 camera sensor.

    Kind Regards,
    Aditya T

  • Hello Aditya,

    The patches to convert 10-bit to 8-bit are only available on AM62A.

    Regards,

    Jianzhong

  • Hello,

    The patches to convert 10-bit to 8-bit are only available on AM62A.

    Okay, but can you still post the patch link here? I'll give it a try on AM625.
    The reason I need it is that the ffmpeg tool also doesn't support 10-bit raw data, i.e. SBGGR10. In that case, how can I render the camera data on the display while the camera is capturing frames?

    Best Regards,
    Aditya T

  • Aditya,

    The 10-bit to 8-bit conversion is performed by the ISP on AM62A. The AM625 doesn't have an ISP, so that patch is not applicable to AM625.

    Regards,

    Jianzhong

  • Hello,

    The 10-bit to 8-bit conversion is performed by the ISP on AM62A. The AM625 doesn't have an ISP, so that patch is not applicable to AM625.

    Okay, but can you provide the patches for AM62A? I'll try them on AM625.

    Best Regards,
    Aditya T

  • Hi Aditya,

    As Jianzhong mentioned, there is no ISP on the AM62x SoC. You would not be able to apply those patches from AM62A to AM625.
    They are specific to the VPAC on AM62A.

    We would suggest you use cameras with a built-in ISP on AM62x.

    Best Regards,

    Suren

  • Okay, but then how can I render the live feed of the OV5647 camera on the HDMI display attached to the AM625 SoC?

    Kind Regards,
    Aditya T

  • You're using the wrong device and your inquiry is beyond our ability to support you.

  • Okay,

    Thanks & Regards,

    Aditya T 

  • Hello,

    Could you let me know whether the OV2312 will work with AM625?
    Below are the block diagram and product specs for the OV2312:


    Best Regards,
    Aditya T

  • The OV2312 has RGB-Ir RAW output, which also requires an ISP. Therefore it won't work with AM625, but this sensor is supported on AM62A.

    What is your application? Does it require color images? If not, you can use a monochrome sensor, which doesn't require an ISP.

  • My end application is like Google Meet, in which I need to stream the live camera feed from a browser.

  • Then you need to use a sensor which provides YUV output.
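
    For example, with a YUV sensor such as the OV5640 outputting UYVY, capture-to-display could be as simple as the sketch below (the device node, resolution and sink are assumptions; adjust them for your setup):

    gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw, format=UYVY, width=1280, height=720, framerate=30/1 ! videoconvert ! kmssink driver-name=tidss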

  • Okay. Can you provide me with a list of cameras that have a built-in ISP and are also supported by AM625?

    Best Regards,
    Aditya T

  • Hi Aditya,

    Please see our Academy, where we have published the cameras that we have tested with AM62x:

    https://dev.ti.com/tirex/explore/node?node=A__ATwtsfUx0sv.24yK9KmHcQ__AM62-ACADEMY__uiYMDcq__LATEST

    Linux -> Evaluating Linux -> Camera

    Section: Tested CSI cameras.

    Hope this helps.

    Best Regards,

    Suren

  • Well, the link you provided lists only 3 cameras, of which 2 are OV5640-based and the 3rd, i.e. the OV9281 (Link), doesn't have an ISP.
    I need options other than these.

    Kind Regards,
    Aditya T

  • Okay, thank you.
    But have you tested the OV9281 with AM625?

  • Hi Aditya,

    We haven’t tested this on AM625.

    But it's a USB-based camera with MJPG and YUV formats supported, at least from what I have seen at the link below:

    https://a.co/d/3hY9wy2

    Yes, you should be able to test this on AM62x.
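
    If it enumerates as a standard UVC device, a quick sanity check could look like the sketch below (the device node, caps, and use of jpegdec are assumptions based on a typical UVC camera; adjust as needed):

    gst-launch-1.0 v4l2src device=/dev/video1 ! image/jpeg, width=1280, height=720, framerate=30/1 ! jpegdec ! videoconvert ! waylandsink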

    Best Regards,

    Suren

  • Okay, thank you for your support.

  • Hello Jianzhong,

    I recently purchased a camera module from Digilent with an OV5640 camera sensor on it.
    Now I'm able to see the live camera feed on the display. But how do I use this camera in a browser?

    As I mentioned earlier, I need to use this camera for an application like Google Meet, so how can I access this OV5640 from a browser?

    Best Regards,
    Aditya T

  • Hi Aditya,

    Please open a new thread regarding this new topic.

    Thank you.

    Jianzhong