
AM625: Errors in configuring camera OV5647 with Linux in SD card on TI's AM625-EVM

Part Number: AM625

Hello TI experts,

I'm having some issues getting the camera up on the AM625 EVM; please help me with it.
The sensor in the camera module I'm trying to use is the OV5647, so I've built the kernel modules for it.

After adding driver support for the OV5647, I can see the device node under /dev. But when I try to capture an image or video with this camera, I get some error logs.
The error logs are:

root@am62xx-evm:~# cam -c1 --stream width=640,height=480,pixelformat=SBGGR -C10
[0:10:25.391269170] [2793]  INFO Camera camera_manager.cpp:298 libcamera v0.0.5+dirty (2024-03-23T14:12:31+00:00)
[0:10:25.405864620] [2794]  WARN CameraSensor camera_sensor.cpp:244 'ov5647 4-0036': Recommended V4L2 control 0x009a0922 not supported
[0:10:25.406011655] [2794]  WARN CameraSensor camera_sensor.cpp:311 'ov5647 4-0036': The sensor kernel driver needs to be fixed
[  625.621421] v4l2_get_link_freq: Link frequency estimated using pixel rate: result might be inaccurate
[  625.632827] v4l2_get_link_freq: Consider implementing support for V4L2_CID_LINK_FREQ in the transmitter driver
[  625.648320] csi2rx_configure_external_dphy: Link frequency is 137500000
[0:10:25.406049495] [2794]  WARN CameraSensor camera_sensor.cpp:313 'ov5647 4-0036': See Documentation/sensor_driver_requirements.rst in the libcamera sources for more information
[0:10:25.408144865] [2794]  WARN CameraSensor camera_sensor.cpp:459 'ov5647 4-0036': Failed to retrieve the camera location
Camera configuration adjusted
Using camera /base/bus@f0000/i2c@20020000/i2c-switch@71/i2c@1/camera@36 as cam0
[0:10:25.410748370] [2793]  INFO Camera camera.cpp:1028 configuring streams: (0) 640x480-SBGGR10
cam0: Capture 10 frames

The last line printed is "Capture 10 frames", but the command never finishes; I have to send an interrupt to stop it.
Also, one of the lines in the output says "Recommended V4L2 control not supported": what does that actually mean?

The device is listed successfully as given below:
root@am62xx-evm:~# media-ctl -p
Media controller API version 6.1.80
Media device information
------------------------
driver          j721e-csi2rx
model           TI-CSI2RX
serial
bus info        platform:30102000.ticsi2rx
hw revision     0x1
driver version  6.1.80
- entity 13: ov5647 4-0036 (1 pad, 1 link, 0 route)
             type V4L2 subdev subtype Sensor flags 0
             device node name /dev/v4l-subdev2
        pad0: Source
                [stream:0 fmt:SBGGR10_1X10/640x480 field:none colorspace:srgb
                 crop.bounds:(16,16)/2592x1944
                 crop:(32,16)/2560x1920]
                -> "cdns_csi2rx.30101000.csi-bridge":0 [ENABLED,IMMUTABLE]


root@am62xx-evm:~# v4l2-ctl --list-devices
j721e-csi2rx (platform:30102000.ticsi2rx):
    /dev/video0
    /dev/video1
    /dev/video2
    /dev/video3
    /dev/media0


From the output above, I assume the camera is listed and configured successfully.
So please help me figure out how to overcome this issue.

Kind Regards,
Aditya T

  • Hello Aditya,

    Can you try if yavta capture can work? That can rule out any libcamera issue.

    Thanks,

    Jianzhong

  • Hello,
    The output after using yavta is given below

    root@am62xx-evm:~# yavta -c -Fcapture -s 640x480 -f SBGGR10 /dev/video0
    Device /dev/video0 opened.
    Device `j721e-csi2rx' on `platform:30102000.ticsi2rx' (driver 'j721e-csi2rx') supports video, capture, without mplanes.
    Video format set: SBGGR10 (30314742) 640x480 (stride 1280) field none buffer size 614400
    Video format: SBGGR10 (30314742) 640x480 (stride 1280) field none buffer size 614400
    8 buffers requested.
    length: 614400 offset: 0 timestamp type/source: mono/EoF
    Buffer 0/0 mapped at address 0xffffa47fa000.
    length: 614400 offset: 614400 timestamp type/source: mono/EoF
    Buffer 1/0 mapped at address 0xffffa4764000.
    length: 614400 offset: 1228800 timestamp type/source: mono/EoF
    Buffer 2/0 mapped at address 0xffffa46ce000.
    length: 614400 offset: 1843200 timestamp type/source: mono/EoF
    Buffer 3/0 mapped at address 0xffffa4638000.
    length: 614400 offset: 2457600 timestamp type/source: mono/EoF
    Buffer 4/0 mapped at address 0xffffa45a2000.
    length: 614400 offset: 3072000 timestamp type/source: mono/EoF
    Buffer 5/0 mapped at address 0xffffa450c000.
    length: 614400 offset: 3686400 timestamp type/source: mono/EoF
    Buffer 6/0 mapped at address 0xffffa4476000.
    length: 614400 offset: 4300800 timestamp type/source: mono/EoF
    Buffer 7/0 mapped at address 0xffffa43e0000.
    Unable to start streaming: Broken pipe (32).
    8 buffers released.
    root@am62xx-evm:~# 


    I'm getting an "Unable to start streaming" error.

    Thanks & Regards,
    Aditya T

  • Hi Aditya,

    Can you share the complete log of "media-ctl -p"?

    Thanks,

    Jianzhong

  • Hello,
    I suspect there was an issue with the hardware itself.
    After changing the camera module, I'm now able to capture .uyvy images successfully.
    Please let me know how to convert the .uyvy files to .jpeg or another image format.
    For your reference, I've attached a sample .uyvy file below


    This is the .uyvy file, which I'm viewing online on the ImagetoSTL webpage

    Thanks & Regards,
    Aditya T

  • Hello Aditya,

    You can run a GStreamer pipeline with the "jpegenc" plugin to convert YUV to JPEG.

    By the way, the picture you showed was either captured in the wrong format or displayed in the wrong format. What was the command you used to capture the image?

    Regards,

    Jianzhong
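    If GStreamer isn't convenient, the UYVY-to-RGB step itself is small enough to sketch in plain Python. This is a rough illustration only (the helper name is made up and BT.601 full-range math is assumed), not TI-provided code:

```python
def uyvy_to_rgb(data, width, height):
    """Convert a packed UYVY buffer (U Y0 V Y1 per pixel pair) to a
    list of (R, G, B) tuples using BT.601 full-range conversion."""
    def clamp(v):
        return max(0, min(255, int(round(v))))
    pixels = []
    for i in range(0, width * height * 2, 4):
        u, y0, v, y1 = data[i], data[i + 1], data[i + 2], data[i + 3]
        for y in (y0, y1):  # the two luma samples share one U/V pair
            pixels.append((clamp(y + 1.402 * (v - 128)),
                           clamp(y - 0.344136 * (u - 128) - 0.714136 * (v - 128)),
                           clamp(y + 1.772 * (u - 128))))
    return pixels

# Two mid-grey pixels: Y=128 with neutral chroma U=V=128
print(uyvy_to_rgb(bytes([128, 128, 128, 128]), 2, 1))
# [(128, 128, 128), (128, 128, 128)]
```

    In practice a jpegenc pipeline is the simpler route; this just shows what the conversion itself does.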

  • Hello,

    By the way, the picture you showed was either captured in the wrong format or displayed in the wrong format. What was the command you used to capture the image?

    The command I used to capture the .uyvy images is

    root@am62xx-evm:~# cam -c1 --stream width=640,height=480,pixelformat=UYVY -C20 -F#.uyvy
    [0:02:25.511697610] [1800]  INFO Camera camera_manager.cpp:298 libcamera v0.0.5+dirty (2024-03-23T14:12:31+00:00)
    [0:02:25.534200425] [1801]  WARN CameraSensor camera_sensor.cpp:244 'ov5647 4-0036': Recommended V4L2 control 0x009a0922 not supported
    [0:02:25.534363590] [1801]  WARN CameraSensor camera_sensor.cpp:311 'ov5647 4-0036': The sensor kernel driver needs to be fixed
    [  145.751186] v4l2_get_link_freq: Link frequency estimated using pixel rate: result might be inaccurate
    [  145.761327] v4l2_get_link_freq: Consider implementing support for V4L2_CID_LINK_FREQ in the transmitter driver
    [0:02:25.534403125] [1801]  WARN CameraSensor camera_sensor.cpp:313 'ov5647 4-0036': See Documentation/sensor_driver_requirements.rst in the libcamera sources for more information
    [0:02:25.537434015] [1801]  WARN CameraSensor camera_sensor.cpp:459 'ov5647 4-0036': Failed to retrieve the camera location
    Camera configuration adjusted
    Using camera /base/bus@f0000/i2c@20020000/i2c-switch@71/i2c@1/camera@36 as cam0
    [0:02:25.540510640] [1800]  INFO Camera camera.cpp:1028 configuring streams: (0) 640x480-SBGGR10
    cam0: Capture 20 frames
    145.633463 (0.00 fps) cam0-stream0 seq: 000000 bytesused: 614400
    145.698542 (15.37 fps) cam0-stream0 seq: 000001 bytesused: 614400
    145.715217 (59.97 fps) cam0-stream0 seq: 000002 bytesused: 614400
    145.731869 (60.05 fps) cam0-stream0 seq: 000003 bytesused: 614400
    145.748540 (59.98 fps) cam0-stream0 seq: 000004 bytesused: 614400
    145.765216 (59.97 fps) cam0-stream0 seq: 000005 bytesused: 614400
    145.798547 (30.00 fps) cam0-stream0 seq: 000006 bytesused: 614400
    145.815211 (60.01 fps) cam0-stream0 seq: 000007 bytesused: 614400
    145.848554 (29.99 fps) cam0-stream0 seq: 000008 bytesused: 614400
    145.865214 (60.02 fps) cam0-stream0 seq: 000009 bytesused: 614400
    145.881912 (59.89 fps) cam0-stream0 seq: 000010 bytesused: 614400
    145.915245 (30.00 fps) cam0-stream0 seq: 000011 bytesused: 614400
    145.948565 (30.01 fps) cam0-stream0 seq: 000012 bytesused: 614400
    145.965232 (60.00 fps) cam0-stream0 seq: 000013 bytesused: 614400
    145.998557 (30.01 fps) cam0-stream0 seq: 000014 bytesused: 614400
    146.015226 (59.99 fps) cam0-stream0 seq: 000015 bytesused: 614400
    146.031892 (60.00 fps) cam0-stream0 seq: 000016 bytesused: 614400
    146.065236 (29.99 fps) cam0-stream0 seq: 000017 bytesused: 614400
    146.081898 (60.02 fps) cam0-stream0 seq: 000018 bytesused: 614400
    146.115230 (30.00 fps) cam0-stream0 seq: 000019 bytesused: 614400
    root@am62xx-evm:~#

    What should I change in this command?

    You can run a GStreamer pipeline with the "jpegenc" plugin to convert YUV to JPEG.

    Can you tell me how to use a GStreamer pipeline with the required plugins?
    Also, in the Ti-apps-launcher application, how do I open the camera after clicking "Live Camera"?

    Thanks & Regards,
    Aditya T

  • What should I change in this command?

    Can you try: "cam -c1 --stream width=640,height=480,pixelformat=YUYV -C20 -F#.uyvy" or "cam -c1 --stream width=640,height=480,pixelformat=YVYU -C20 -F#.uyvy"?

    Can you tell how to use gstreamer pipeline with required plugins?

    gst-launch-1.0 v4l2src num-buffers=5 device="/dev/video0" ! video/x-raw,width=640,height=480 ! multifilesink location="image-%d.yuv"
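    For the JPEG step specifically, a pipeline along these lines should work. This is a sketch using standard GStreamer elements (videoconvert bridges any caps mismatch ahead of jpegenc); verify the caps against what your sensor actually outputs:

```shell
# Capture 5 frames from /dev/video0 and encode each one to JPEG
gst-launch-1.0 v4l2src num-buffers=5 device=/dev/video0 \
  ! video/x-raw,format=UYVY,width=640,height=480 \
  ! videoconvert \
  ! jpegenc \
  ! multifilesink location="image-%d.jpg"
```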

  • Hello,

    Can you try: "cam -c1 --stream width=640,height=480,pixelformat=YUYV -C20 -F#.uyvy" or "cam -c1 --stream width=640,height=480,pixelformat=YVYU -C20 -F#.uyvy"?

    I tried capturing images with both of these commands; they execute successfully, but the output is still the same, i.e. a green image. This time, though, the captured image is plain green, unlike the previous one.

    Also, the command "gst-launch-1.0 v4l2src num-buffers=5 device="/dev/video0" ! video/x-raw,width=640,height=480 ! multifilesink location="image-%d.yuv"" didn't change anything, and it doesn't even use the "jpegenc" plugin in the pipeline.

    The basic task we want to perform is viewing the live camera on an HDMI display.
    So, how do I open a camera application on the display?

    Kind Regards,
    Aditya T

  • From the media-ctl log you provided in your initial post, the sensor has "SBGGR10" format, which is a raw Bayer format. Can you please confirm?

    If that's the case, you'll need to have an ISP in order to view live camera. The AM625 SoC doesn't have an ISP and you'll need to use an AM62A device.

  • From the media-ctl log you provided in your initial post, the sensor has "SBGGR10" format, which is a raw Bayer format. Can you please confirm?

    Yes, the sensor I'm using has "SBGGR10" format.

    If that's the case, you'll need to have an ISP in order to view live camera. The AM625 SoC doesn't have an ISP and you'll need to use an AM62A device.

    Does that mean I can't use this camera sensor for WebRTC or any other camera-streaming application? Is my understanding correct?

    Thanks & Regards,

    Aditya T

  • Hello,

    An update on the OV5647 camera:
    The yavta command now executes successfully; the output is

    root@am62xx-evm:~# media-ctl -V '"ov5647 4-0036":0 [fmt:SBGGR10_1X10/640x480 field:none colorspace:srgb]'
    root@am62xx-evm:~# yavta -c -Fcapture -s 640x480 -f SBGGR10 /dev/video0 -c20
    Device /dev/video0 opened.
    Device `j721e-csi2rx' on `platform:30102000.ticsi2rx' (driver 'j721e-csi2rx') supports video, capture, without mplanes.
    Video format set: SBGGR10 (30314742) 640x480 (stride 1280) field none buffer size 614400
    [ 4613.393151] v4l2_get_link_freq: Link frequency estimated using pixel rate: result might be inaccurate
    [ 4613.406371] v4l2_get_link_freq: Consider implementing support for V4L2_CID_LINK_FREQ in the transmitter driver
    Video format: SBGGR10 (30314742) 640x480 (stride 1280) field none buffer size 614400
    8 buffers requested.
    length: 614400 offset: 0 timestamp type/source: mono/EoF
    Buffer 0/0 mapped at address 0xffff81c6a000.
    length: 614400 offset: 614400 timestamp type/source: mono/EoF
    Buffer 1/0 mapped at address 0xffff81bd4000.
    length: 614400 offset: 1228800 timestamp type/source: mono/EoF
    Buffer 2/0 mapped at address 0xffff81b3e000.
    length: 614400 offset: 1843200 timestamp type/source: mono/EoF
    Buffer 3/0 mapped at address 0xffff81aa8000.
    length: 614400 offset: 2457600 timestamp type/source: mono/EoF
    Buffer 4/0 mapped at address 0xffff81a12000.
    length: 614400 offset: 3072000 timestamp type/source: mono/EoF
    Buffer 5/0 mapped at address 0xffff8197c000.
    length: 614400 offset: 3686400 timestamp type/source: mono/EoF
    Buffer 6/0 mapped at address 0xffff818e6000.
    length: 614400 offset: 4300800 timestamp type/source: mono/EoF
    Buffer 7/0 mapped at address 0xffff81850000.
    0 (0) [-] any 0 614400 B 4613.249863 4613.249928 66.103 fps ts mono/EoF
    1 (1) [-] any 1 614400 B 4613.314943 4613.315010 15.366 fps ts mono/EoF
    2 (2) [-] any 2 614400 B 4613.331610 4613.336330 59.999 fps ts mono/EoF
    3 (3) [-] any 3 614400 B 4613.348284 4613.357480 59.974 fps ts mono/EoF
    4 (4) [-] any 4 614400 B 4613.364938 4613.378381 60.046 fps ts mono/EoF
    5 (5) [-] any 5 614400 B 4613.381605 4613.399249 59.999 fps ts mono/EoF
    6 (6) [-] any 6 614400 B 4613.398275 4613.420268 59.988 fps ts mono/EoF
    7 (7) [-] any 7 614400 B 4613.414944 4613.441278 59.992 fps ts mono/EoF
    8 (0) [-] any 8 614400 B 4613.431609 4613.462132 60.006 fps ts mono/EoF
    9 (1) [-] any 9 614400 B 4613.448275 4613.483041 60.002 fps ts mono/EoF
    10 (2) [-] any 10 614400 B 4613.464951 4613.503898 59.966 fps ts mono/EoF
    11 (3) [-] any 11 614400 B 4613.481612 4613.524831 60.020 fps ts mono/EoF
    12 (4) [-] any 12 614400 B 4613.498276 4613.545731 60.010 fps ts mono/EoF
    13 (5) [-] any 13 614400 B 4613.514946 4613.566609 59.988 fps ts mono/EoF
    14 (6) [-] any 14 614400 B 4613.531618 4613.587455 59.981 fps ts mono/EoF
    15 (7) [-] any 15 614400 B 4613.548288 4613.608329 59.988 fps ts mono/EoF
    16 (0) [-] any 16 614400 B 4613.564946 4613.628935 60.031 fps ts mono/EoF
    17 (1) [-] any 17 614400 B 4613.581622 4613.649482 59.966 fps ts mono/EoF
    18 (2) [-] any 18 614400 B 4613.598284 4613.670202 60.017 fps ts mono/EoF
    19 (3) [-] any 19 614400 B 4613.614959 4613.690902 59.970 fps ts mono/EoF
    Captured 20 frames in 0.456166 seconds (43.843630 fps, 26937526.181498 B/s).
    8 buffers released.
    root@am62xx-evm:~# 
    
    


    Please let me know if this is helpful, and also what more I should try.

    Kind Regards,
    Aditya T

  • Yes, the sensor I'm using has "SBGGR10" format.

    Please let me know if this is helpful, and also what more I should try.

    You can use a desktop tool like pixelviewer to check whether your captured SBGGR10 images look good.

    Does that mean I can't use this camera sensor for WebRTC or any other camera-streaming application? Is my understanding correct?

    That is correct. You'll need to use an SoC with an ISP (such as the AM62A) to work with your sensor. Alternatively, you can use a sensor with an inbuilt ISP, such as the OV5640.
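    As a rough illustration of what a viewer has to do with such a capture, the SBGGR10 frames from the yavta log above (640x480, stride 1280, 614400-byte buffer) can be unpacked in plain Python. This sketch assumes the usual V4L2 SBGGR10 layout of one 10-bit sample per 16-bit little-endian word; the helper name is made up:

```python
import struct

# Geometry from the yavta log: 640x480, stride 1280 bytes, 614400-byte buffer
WIDTH, HEIGHT, STRIDE = 640, 480, 1280

def unpack_sbggr10(raw, width=WIDTH, height=HEIGHT, stride=STRIDE):
    """Unpack a V4L2 SBGGR10 frame (each 10-bit sample stored in a
    16-bit little-endian word) into a row-major list of sample rows."""
    rows = []
    for y in range(height):
        base = y * stride
        line = raw[base:base + width * 2]
        rows.append([v for (v,) in struct.iter_unpack('<H', line)])
    return rows

# Synthetic all-zero frame matching the buffer size from the log
img = unpack_sbggr10(bytes(614400))
print(len(img), len(img[0]))  # 480 640
```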

  • Hello,
    Is there any way to do the ISP purely in software, or is a hardware ISP required at one end or the other, i.e. on the camera module or on the AM625?

    Thanks & Regards,
    Aditya T

    Doing ISP in software is very costly. I would recommend using a sensor with an inbuilt ISP; there are many such sensors on the market.

    Regards,

    Jianzhong

  • Hello,

    Doing ISP in SW is very costly.

    I've already purchased camera modules with the OV5647 sensor, and we've already used them in our product.
    So I have no option other than performing ISP on the software side. Please provide me with the steps to do ISP purely in software for AM625 Linux.

    I also have a couple of questions:
    1. If the OV5647 sensor doesn't have an inbuilt ISP and the AM625 doesn't have one either, why is support for that camera module provided in the menuconfig tool?

    2. What if I used this camera? Link Would I still need to perform ISP for this camera?

    Thanks & Regards,
    Aditya T

  • So, please provide me with steps for ISP from software only for AM625 Linux.

    I don't have this information.

    1. If the OV5647 sensor doesn't have an inbuilt ISP and the AM625 doesn't have one either, why is support for that camera module provided in the menuconfig tool?

    The menuconfig is a general configuration for the AM6x family. The OV5647 works with the AM62A.

    2. What if I used this camera? Link Would I still need to perform ISP for this camera?

    This camera also outputs raw RGB format.

    You'll need to use a sensor that outputs YUV, RGB888, or a similar format.

  • Hello,

    I don't have this information.

    Okay, but can you guide or provide any reference to it?

    The menuconfig is a general configuration for AM6x family. OV5647 works with AM62A.

    Okay.

    This camera also outputs raw RGB format.

    But it has an inbuilt ISP in the sensor, right? So I can convert that raw RGB to any format I want. Correct me if I'm wrong.

    You'll need to use a sensor that outputs YUV or RGB888 or similar formats.

    I assume we can convert raw RGB to any format, i.e. YUV or RGB888. The camera module I'm actually going to use is the "YDS-C8MF-OV2740", and its product guide says it outputs 10-bit RAW RGB. So please tell me whether this is okay or not.

    Also, the block diagram shows it has an inbuilt ISP.

    Kind Regards,
    Aditya T 

    Also, the block diagram shows it has an inbuilt ISP.

    The "ISP" inside this sensor probably does some sensor defect correction, but it's not the Image Signal Processor I was referring to.

    If a sensor's output is RAW RGB, you'll need an ISP to process the raw data and produce YUV or RGB images.

  • Hello,

    If a sensor's output is RAW RGB, you'll need an ISP to process the raw data and produce YUV or RGB images.

    Isn't it possible through software? 

    Correct me if I'm wrong, but I assume we can convert the RAW RGB signals to YUV or RGB purely in software?

    The "ISP" inside this sensor probably does some sensor defect correction, but not the Image Signal Processor that I was talking about. 

    Then can you suggest a camera module that outputs YUV or RGB directly, or any other format suitable for using the camera with WebRTC-like applications?

    Thanks & Regards,

    Aditya T

    Correct me if I'm wrong, but I assume we can convert the RAW RGB signals to YUV or RGB purely in software?

    The image signal processing for RAW RGB signals involves more than just demosaicing (converting RAW RGB to YUV/RGB). I'm not aware of any software that can do a full ISP.

    Then can you suggest a camera module that outputs YUV or RGB directly, or any other format suitable for using the camera with WebRTC-like applications?

    Yes, the OV5640 sensor has an inbuilt ISP and is supported on AM62x.
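    For what it's worth, the demosaicing step alone can be sketched in a few lines of Python. This is a crude nearest-neighbor illustration with a made-up helper name, not a real ISP (which would also do black-level correction, white balance, noise reduction, and more):

```python
def demosaic_bggr_nn(bayer, width, height):
    """Crude nearest-neighbor demosaic of a BGGR mosaic: each 2x2 cell
    (B G / G R) collapses into one (R, G, B) pixel, halving resolution."""
    rgb = []
    for y in range(0, height, 2):
        row = []
        for x in range(0, width, 2):
            b = bayer[y][x]                                # blue sample
            g = (bayer[y][x + 1] + bayer[y + 1][x]) // 2   # average the two greens
            r = bayer[y + 1][x + 1]                        # red sample
            row.append((r, g, b))
        rgb.append(row)
    return rgb

# Tiny 4x4 BGGR mosaic -> 2x2 RGB image
mosaic = [
    [10, 20, 10, 20],
    [20, 30, 20, 30],
    [10, 20, 10, 20],
    [20, 30, 20, 30],
]
print(demosaic_bggr_nn(mosaic, 4, 4)[0][0])  # (30, 20, 10)
```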