Hi,
A customer is using a custom board with TDA4VM and GMSL SerDes, with software based on SDK6.02 RTOS plus added GMSL drivers to capture from an AR0820 over GMSL, and that works (resolution 3840x2160, format RAW12).
We are investigating how quickly we can upgrade to SDK7.01, but for now we need to use SDK6.02.
Now the customer needs to capture UYVY format with resolution 1920x1080.
The camera being used supports 1920x1080 / UYVY, and they created a basic driver for the new cameras using the AR0820 driver as a reference. However, they were not able to capture images when this resolution was defined. They could capture a kind of corrupted image by selecting the 3840x2160 resolution and overriding the video Data-Type to RAW12 (0x2C). They also had to reuse the same camera identifier (dccId) in the "Iss Sensor Parameter", because a custom value did not allow them to run the driver.
Questions:
1) On the IssSensor_CreateParams structure, there is a "dccId" field that has to be filled. What is this? How can we define a new dccId when adding a new camera driver?
2) Inside "imaging/sensor_drv/include/" directory, there are dcc_* files for each camera. Do we need to generate dcc_* files if a new image sensor driver is added? How are those files generated? How are they used by Vision Apps?
3) We can also find the following directories inside each sensor driver directory: dcc_bins and dcc_xmls. Do we need similar files for our new sensor driver? How do we generate those?
4) How can we support UYVY capture? We see that the data format is defined in IssSensor_CreateParams. In the case of the AR0820, TIVX_RAW_IMAGE_16_BIT is defined as the pixel container. We believe we can use the same definition for UYVY since 2 bytes per pixel are needed, right? But where do we need to define the pixel format (UYVY / Bayer)?
5) We have been using app_single_cam and app_multi_cam (from Vision Apps) for testing. There is a line in the code:
vx_uint32 num_bytes_per_pixel = 2; /*Mayank : Hardcoded to 12b Unpacked format*/
Does it mean that those apps only work for RAW12? Can they capture UYVY or a different format with this application?
6) From the tests they see that capture works only if the 3840x2160 resolution is defined in the driver. The driver did not work when the 1920x1080 resolution was defined; they also tested the same with the AR0820 driver and it did not work either. I suspect that the resolution is hardcoded somewhere. Do you have any idea what the problem could be? Is it related to Imaging or Vision Apps?
7) When they use the 1920x1080 resolution, the driver seems to load fine when they start app_single_cam, but it seems to get stuck in vxGraphParameterDequeueDoneRef(). No errors are seen, but capture does not work. Can debug be enabled in OpenVX? Is there any trace or tool to debug capture issues?
Thanks,
--Gunter
Hello Gunter,
1) On the IssSensor_CreateParams structure, there is a "dccId" field that has to be filled. What is this? How can we define a new dccId when adding a new camera driver?
For UYVY input the ISP is bypassed and DCC is not needed. In the sensor driver you may remove ISS_SENSOR_FEATURE_DCC_SUPPORTED from ISS_SENSOR_AR0820_FEATURES.
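As a rough sketch of that change in a new sensor driver header (only ISS_SENSOR_FEATURE_DCC_SUPPORTED is taken from the discussion above; the other flag names are illustrative placeholders, so keep whatever flags the AR0820 reference actually defines, just without the DCC bit):

/* Feature flags for the new YUV sensor: same idea as the AR0820 reference, but with
 * ISS_SENSOR_FEATURE_DCC_SUPPORTED removed so the framework does not expect DCC data.
 * The remaining flags below are placeholders; copy the real list from the AR0820 driver. */
#define ISS_SENSOR_MYSENSOR_FEATURES  (ISS_SENSOR_FEATURE_LINEAR_MODE | \
                                       ISS_SENSOR_FEATURE_MANUAL_EXPOSURE)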
2) Inside "imaging/sensor_drv/include/" directory, there are dcc_* files for each camera. Do we need to generate dcc_* files if a new image sensor driver is added? How are those files generated? How are they used by Vision Apps?
No need to worry about it if DCC is not supported by the sensor.
3) We can also find the following directories inside each sensor driver directory: dcc_bins and dcc_xmls. Do we need similar files for our new sensor driver? How do we generate those?
No need to worry about it if DCC is not supported by the sensor.
4) How can we support UYVY capture? We see that the data format is defined in IssSensor_CreateParams. In the case of the AR0820, TIVX_RAW_IMAGE_16_BIT is defined as the pixel container. We believe we can use the same definition for UYVY since 2 bytes per pixel are needed, right? But where do we need to define the pixel format (UYVY / Bayer)?
This can get messy on 6.02. However, some customers have been able to do it. Please refer to the post https://e2e.ti.com/support/processors/f/791/p/895684/3311768#3311768
We strongly recommend switching to 7.1.
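To illustrate where the pixel format ends up being expressed on the OpenVX/application side (a sketch only, not the exact 6.02 or 7.1 application code): a Bayer RAW12 pipeline uses a tivx_raw_image with a 16-bit container as the capture output, while a YUV sensor would use a regular vx_image with the UYVY format, so the 2-bytes-per-pixel assumption still holds but the object type changes.

#include <VX/vx.h>
#include <TI/tivx.h>

static void create_capture_outputs(vx_context context)
{
    /* RAW12 case (AR0820 path): capture output is a tivx_raw_image with a 16-bit
     * container per pixel; the ISP (VISS) converts Bayer to YUV downstream. */
    tivx_raw_image_create_params_t raw_params;
    raw_params.width                     = 3840;
    raw_params.height                    = 2160;
    raw_params.num_exposures             = 1;
    raw_params.line_interleaved          = vx_false_e;
    raw_params.meta_height_before        = 0;
    raw_params.meta_height_after         = 0;
    raw_params.format[0].pixel_container = TIVX_RAW_IMAGE_16_BIT;
    raw_params.format[0].msb             = 11; /* 12-bit data in a 16-bit container */
    tivx_raw_image raw_out = tivxCreateRawImage(context, &raw_params);

    /* UYVY case: the sensor already outputs YUV422, so the capture output can be a
     * plain vx_image with VX_DF_IMAGE_UYVY (2 bytes per pixel) and the ISP is bypassed. */
    vx_image yuv_out = vxCreateImage(context, 1920, 1080, VX_DF_IMAGE_UYVY);

    (void)raw_out;
    (void)yuv_out;
}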
5) We have been using app_single_cam and app_multi_cam (from Vision Apps) for testing. There is a line in the code:
vx_uint32 num_bytes_per_pixel = 2; /*Mayank : Hardcoded to 12b Unpacked format*/
Does it mean that those apps only work for RAW12? Can they capture UYVY or a different format with this application?
Yes, in 6.02 those apps are tested only with RAW12. The code you are referring to is for debugging, i.e. saving RAW images to the file system. It does not impact capture & display. But you are right that these apps in 6.02 will not work for UYVY without modifications.
This has been addressed in 7.1.
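For reference, a sketch of how that file-save debug path could avoid the hardcoding (the helper below is hypothetical, not existing vision_apps code). As it happens, both 12-bit unpacked RAW and UYVY are 2 bytes per pixel, so the hardcoded value would still be correct for UYVY:

#include <VX/vx.h>

/* Hypothetical helper: derive bytes-per-pixel for the debug file-save path from the
 * capture format instead of hardcoding it. */
static vx_uint32 get_bytes_per_pixel(vx_df_image format)
{
    switch (format)
    {
        case VX_DF_IMAGE_U8:
            return 1;                 /* 8-bit data */
        case VX_DF_IMAGE_U16:         /* RAW12 stored unpacked in a 16-bit container */
        case VX_DF_IMAGE_UYVY:        /* YUV422, 16 bits per pixel */
        default:
            return 2;
    }
}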
6) From the tests they see that capture works only if the 3840x2160 resolution is defined in the driver. The driver did not work when the 1920x1080 resolution was defined; they also tested the same with the AR0820 driver and it did not work either. I suspect that the resolution is hardcoded somewhere. Do you have any idea what the problem could be? Is it related to Imaging or Vision Apps?
There is no hardcoding. The resolution comes from the sensor driver definitions:
#define AR0820_OUT_WIDTH (3840)
#define AR0820_OUT_HEIGHT (2160)
These settings are used throughout for buffer allocation, CSI2 configuration, etc.
It is expected that the customer will change the sensor settings when a resolution change is desired.
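As a sketch, moving a driver derived from the AR0820 reference to 1080p would mean changing those defines and, just as importantly, updating the sensor's own register configuration table so it actually streams 1920x1080 (otherwise the receive side is programmed for a frame size the sensor never sends and capture stalls). The names and register entries below are placeholders, not values from this SDK:

/* Output resolution: used for buffer allocation, CSI2 RX configuration, etc. */
#define MYSENSOR_OUT_WIDTH   (1920)
#define MYSENSOR_OUT_HEIGHT  (1080)

/* The sensor register table must be updated to match, e.g. the entries that set the
 * sensor's output window / frame size to 1920x1080. Addresses and values here are
 * placeholders only; take them from the sensor vendor's 1080p configuration. */
/* { 0xXXXX, 1920 }, { 0xYYYY, 1080 }, ... */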
7) When they use the 1920x1080 resolution, the driver seems to load fine when they start app_single_cam, but it seems to get stuck in vxGraphParameterDequeueDoneRef(). No errors are seen, but capture does not work. Can debug be enabled in OpenVX? Is there any trace or tool to debug capture issues?
Has the customer verified that CSI2 packets are being received on J7 pins? Do they have any waveforms they can share?
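On the OpenVX debug question: one option (a sketch, assuming the standard TIOVX debug-zone API is available in this SDK) is to raise the framework's debug verbosity in the application before running the graph, which prints additional framework/node trace to the console and can help show where the graph is waiting:

#include <TI/tivx.h>   /* tivx_set_debug_zone() comes from the TIOVX debug header */

static void enable_tiovx_trace(void)
{
    /* VX_ZONE_ERROR and VX_ZONE_WARNING are typically on by default;
     * VX_ZONE_INFO adds per-node trace that can help locate where the graph
     * stalls, e.g. never dequeuing a captured frame. */
    tivx_set_debug_zone(VX_ZONE_INFO);
}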
Regards,
Mayank
Thanks, Mayank.
Can you confirm that YUV422 capture was enabled in SDK v7.1, and NOT in SDK v7.0?
Thanks,
--Gunter
7.0 supports it only at the low-level driver. 7.1 has YUV capture support in the OpenVX node, sensor driver, and demo applications.