
DM8168 - Need clarification on V4L2 capture support.

Other Parts Discussed in Thread: AM3517, AM3505

Hi,

 

I was evaluating which EZSDK version to use in our application, and I am a bit confused about the video capture support. I would prefer V4L2-based capture, and I had learned that V4L2 capture support in the EZSDK would be available in Dec-2011.

But after reading this discussion and the wiki page, it seems the support is already present?

Can anyone please clarify and let me know if there is any PSP or EZSDK release available with V4L2 capture support on the DM8168?

 

Regards,

Krunal

  • Hi,

    The V4L2 capture driver is supported as part of the PSP04.00.01.13_patch2 and PSP04.01.00.06_patch2 releases. Currently none of the open-source applications such as GStreamer have been ported to the V4L2 capture driver, but if you have an application of your own you can use it to drive the V4L2 capture driver. For more details please refer to the documentation at

    http://processors.wiki.ti.com/index.php/DM81xx_AM38xx_Video_Capture_Driver_User_Guide and http://processors.wiki.ti.com/index.php/DM81xx_AM38xx_Adding_External_Decoders_to_V4L2_Capture_Driver

    There is a plan to support GStreamer on the V4L2 capture driver.

    Regards,

    Hardik Shah
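
For reference, a minimal V4L2 capture sketch in C is given below. It assumes /dev/video0 is the capture node and that the external decoder input has already been set up as described in the user guide linked above; the buffer request, queueing and dequeue loop are only outlined.

    /* Minimal V4L2 capture sketch (assumes /dev/video0 is the capture node
     * and the external decoder is already configured per the user guide). */
    #include <stdio.h>
    #include <string.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    int main(void)
    {
        int fd = open("/dev/video0", O_RDWR);
        if (fd < 0) {
            perror("open");
            return 1;
        }

        struct v4l2_capability cap;
        if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0) {
            perror("VIDIOC_QUERYCAP");
            return 1;
        }
        printf("driver: %s, card: %s\n", (char *)cap.driver, (char *)cap.card);

        /* Ask for 1080p UYVY frames; the driver adjusts the format if needed. */
        struct v4l2_format fmt;
        memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = 1920;
        fmt.fmt.pix.height = 1080;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_UYVY;
        fmt.fmt.pix.field = V4L2_FIELD_NONE;
        if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) {
            perror("VIDIOC_S_FMT");
            return 1;
        }
        printf("negotiated %ux%u\n", fmt.fmt.pix.width, fmt.fmt.pix.height);

        /* Buffer request, queueing, STREAMON and the dequeue loop follow
         * the standard V4L2 MMAP streaming sequence (see the user guide). */
        close(fd);
        return 0;
    }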

  • Hi Hardik,

     

    Thanks for the response. It is really valuable information for us. We are planning an Android-based product on the DM8168 platform.

    It would be really helpful if you could help us resolve a few more doubts, listed here: http://e2e.ti.com/support/embedded/android/f/509/t/150857.aspx

     

    Regards,

    Krunal Patil

  • Hi Hardik,

    Is there any plan to support a V4L2-based resizer/chroma conversion/deinterlacer (i.e. the VFPC component of OMX)? If yes, by when can we expect this particular feature?

    Please note, we don't want to use the on-the-fly resizer supported in the V4L2 capture driver.

    Our plan is to have PIP, so we want to capture a single 1080p stream and get two streams out of it:

    1) 1080p

    2) D1 (a resized version of the captured 1080p, overlaid on the 1080p stream above)

     

    Regards,

    Krunal Patil

  • Hi,

    As of now we don't have plans for any memory-to-memory drivers under any Linux architecture. You need to use OpenMAX. Using the OpenMAX VFPC you can achieve all of the above.

     

    Regards,

    Hardik Shah

  • Hi Hardik,

    Thanks. That makes it clear that the VFPC components will not be part of the V4L2 driver.

    However, we have one question from the V4L2 user guide available at http://processors.wiki.ti.com/index.php/DM81xx_AM38xx_Video_Capture_Driver_User_Guide

    This guide mentions that scale-down and crop features are available: "Supports scaling and cropping for YUV formats (downscaling only)".

    Would this capture driver perform the scaling/cropping on the fly?

    Can you please confirm the following?

    1) The V4L2 capture driver supports downscaling and cropping on the fly.

    2) The V4L2 display driver supports downscaling and cropping on the fly.

    Regards,
    Sweta

  • Sweta,

    V4L2 capture does not support cropping and scaling on the fly.

    The current V4L2 display does not support scaling, but it does support cropping on the fly.

    regards,

    yihe

  • Hi Sweta/Yihe,

    V4L2 capture can do on-the-fly scaling and cropping. On-the-fly scaling is limited to downscaling only. Further, scaling can be done only for YUV formats, not RGB formats.

     

    Regards,

    Hardik Shah

  • Hi Hardik,

    Thanks for the confirmation. Can you please let us know the earliest possible date to receive this V4L2 capture driver for further exploration?

    Regards,
    Sweta

  • Hi,

    What further exploration do you want to do? I don't have any dates for any more features as of now.

     

     

    Regards,

    Hardik Shah

  • Hardik,

    Are you sure about the cropping? In the capture driver, if the stream is on, the cropping ioctl returns an error.

    Regards,

    yihe
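
To make the point above concrete, the sketch below sets the crop rectangle with VIDIOC_S_CROP before VIDIOC_STREAMON, under the assumption that fd is an already opened and configured capture node; whether the crop can still be changed once the stream is on is exactly the behaviour being questioned here.

    /* Sketch: set the crop rectangle before starting the stream.
     * "fd" is assumed to be an already opened and configured capture node. */
    #include <string.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    static int start_cropped_capture(int fd)
    {
        struct v4l2_crop crop;
        memset(&crop, 0, sizeof(crop));
        crop.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        crop.c.left = 0;
        crop.c.top = 0;
        crop.c.width = 960;      /* crop the left half of a 1920x1080 source */
        crop.c.height = 1080;

        /* Done while the stream is off; per the discussion above, the
         * capture driver rejects this ioctl once STREAMON has been issued. */
        if (ioctl(fd, VIDIOC_S_CROP, &crop) < 0)
            return -1;

        enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        return ioctl(fd, VIDIOC_STREAMON, &type);
    }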

     

  • Sorry Yihe/Sweta,

    I got confused between inline scaling and on-the-fly scaling. Inline scaling and cropping are supported; on-the-fly scaling and cropping are not.

     

    Regards,

    Hardik Shah

  • Hi Hardik/Yihe,

    Can you please elaborate on what exactly you mean by inline scaling and cropping?

    Let me give you the use case of what exactly I am looking for from the V4L2 capture and display drivers in EZSDK 5.03.01.15. I know I can use the OpenMAX VFPC components to do the same, but it would be helpful for our product design if this could be achieved with the V4L2 drivers.

    V4L2 capture driver

    - The camera will provide 1080p30 output; the V4L2 capture driver should take 1080p30 as input and provide 960x1080 output (a horizontally scaled-down frame) in the dequeue call.

    V4L2 display driver & frame buffer driver

    - We want to display 1080p30 video on /dev/video0. The frame buffer driver (/dev/fb0 and /dev/fb1) will get a 1920x1080 video frame as input and crop two 960x1080 regions from it at given coordinates; both /dev/fb0 and /dev/fb1 will then scale their respective 960x1080 frames down to 360x480. Both the /dev/fb0 and /dev/fb1 windows will be overlaid on the /dev/video0 window at two different coordinates.

    Kindly confirm whether or not this is feasible with the current V4L2 implementation.

    Thanks,

    Sweta

  • Hi Sweta,

    A few points:

    On-the-fly scaling --> You can change the scaling ratio without streaming off the driver.

    Inline scaling --> Scale the incoming image to your desired resolution without going through memory. This is what you want from your mail above, for capture.

     

    Your second question doesn't seem to be clear. Let me explain.

    V4L2 display does not support any scaling. You need to use VFPC to do scaling, or display an already captured image with the scaling done by capture. Further, V4L2 display can only take YUV formats and fbdev can only take RGB formats, so I am not sure how you are going to scale 960x1080 down to 360x480 using fbdev. Can you please explain the data path in detail? It is all confusing between the V4L2 display and frame buffer drivers.

     

    Regards,

    Hardik Shah
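
As an illustration of inline scaling for capture: a sketch under the assumption that the driver performs the inline downscale when a capture format smaller than the detected source is requested before streaming starts; the exact sequence and constraints are described in the capture driver user guide linked earlier.

    /* Sketch: inline downscale of a 1080p source to 960x1080 by requesting
     * a smaller capture format before STREAMON. Assumes "fd" is an opened
     * capture node whose input (e.g. an HDMI receiver) delivers 1920x1080. */
    #include <string.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    static int request_inline_downscale(int fd)
    {
        struct v4l2_format fmt;
        memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = 960;                     /* smaller than the source */
        fmt.fmt.pix.height = 1080;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_UYVY; /* downscaling is YUV-only */
        fmt.fmt.pix.field = V4L2_FIELD_NONE;

        /* Inline scaling: the capture hardware writes 960x1080 frames straight
         * to memory; the ratio cannot be changed again without streaming off. */
        return ioctl(fd, VIDIOC_S_FMT, &fmt);
    }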

     

     

  • Hi Hardik,

    Thanks for the clarification. I understand your concern regarding how I can generate RGB data from the processed YUV buffer to push to the frame buffer driver.

    With the V4L2 display driver's inability to scale the video, VFPC becomes mandatory to show scaled video on the display side.

    Even if it seems obvious from the technical reference, we want to confirm that color space conversion (YUV to RGB) is not available. Is that true?

    Regards,

    Sweta

  • Hi,

    You are right, there is no CSC for the grpx and video pipelines. Grpx always takes RGB and the video pipeline always takes YUV. We have a plan to develop a V4L2 display driver for the aux and main paths, which includes the scaler and deinterlacer. The timeframe is roughly around 2Q12 or 3Q12, though I am not very sure. If that timeframe doesn't match yours, then you will have to use VFPC for any scaling you need.

    Regards,

    Hardik Shah

  • Hi,

    Thanks for the update on the V4L2 display driver. That timeline is too far out for us to meet our product requirements, so we will use VFPC.

    I got very confused after reading a few other threads regarding color space conversion.

    Brijesh has mentioned that color space conversion YUV<->RGB is possible on the DM816x.

    Can you please quickly reconfirm after reading the E2E thread below?

    http://e2e.ti.com/support/embedded/linux/f/354/t/153133.aspx#554918

    Thanks,
    Sweta

  • Hi,

    I saw that. On the capture side we have CSC. The two video pipelines can only take YUV, while the three grpx pipelines can only take RGB.

    Regards,

    Hardik Shah
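
One way to see this split from user space is to query a graphics plane's pixel layout through the standard fbdev API. A minimal sketch, assuming /dev/fb0 maps to one of the grpx pipelines:

    /* Sketch: query /dev/fb0 to confirm the graphics plane expects RGB data.
     * Assumes /dev/fb0 is mapped to one of the grpx pipelines. */
    #include <stdio.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/fb.h>

    int main(void)
    {
        int fd = open("/dev/fb0", O_RDWR);
        if (fd < 0) {
            perror("open /dev/fb0");
            return 1;
        }

        struct fb_var_screeninfo var;
        if (ioctl(fd, FBIOGET_VSCREENINFO, &var) < 0) {
            perror("FBIOGET_VSCREENINFO");
            return 1;
        }

        /* bits_per_pixel and the red/green/blue offsets describe an RGB
         * layout; there is no YUV option on the graphics planes. */
        printf("%ux%u, %u bpp, R@%u G@%u B@%u\n",
               var.xres, var.yres, var.bits_per_pixel,
               var.red.offset, var.green.offset, var.blue.offset);

        close(fd);
        return 0;
    }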

  • Hi Hardik,

    Is there any document which describes YUV<->RGB conversion for Netra? If yes, kindly share it ASAP.

    Brijesh has explicitly mentioned that "Yes, it does have YUV to RGB support on capture and display path."

    If that is correct, then what I understand is that we should be able to give YUV data to the frame buffer display driver, and the hardware will take care of the YUV-to-RGB conversion.

    We have used this functionality in the Sitara series of processors (AM3517 and AM3505) in the past.

    We are emphasizing this feature because support for this functionality would drastically reduce unnecessary effort in our product.

    Thanks,
    Sweta

  • Hi,

    CSC is possible at the output of the display, where you can get YUV or RGB out from the VENC, but the input data is still YUV to the video planes and RGB to the graphics planes. I don't have any document like the one you are asking for. VFPC is the only way for you to go as of now.

    Regards,

    Hardik Shah
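
For anyone taking the VFPC route, the scaler is exposed as an OpenMAX IL component in the EZSDK. The sketch below only shows obtaining a component handle; the component name OMX.TI.VPSSM3.VFPC.INDTXSCWB is an assumption taken from the EZSDK OMX examples and should be verified against the documentation of your release, as should the port setup that follows.

    /* Sketch: obtain a handle to the OpenMAX IL VFPC scaler component.
     * The component name below is an assumption from the EZSDK OMX
     * examples; verify it against the OMX documentation of your release. */
    #include <stdio.h>
    #include <OMX_Core.h>
    #include <OMX_Component.h>

    static OMX_ERRORTYPE on_event(OMX_HANDLETYPE h, OMX_PTR app, OMX_EVENTTYPE ev,
                                  OMX_U32 d1, OMX_U32 d2, OMX_PTR data)
    {
        return OMX_ErrorNone;   /* real code waits for state transitions here */
    }

    static OMX_ERRORTYPE on_empty_done(OMX_HANDLETYPE h, OMX_PTR app,
                                       OMX_BUFFERHEADERTYPE *buf)
    {
        return OMX_ErrorNone;
    }

    static OMX_ERRORTYPE on_fill_done(OMX_HANDLETYPE h, OMX_PTR app,
                                      OMX_BUFFERHEADERTYPE *buf)
    {
        return OMX_ErrorNone;
    }

    int main(void)
    {
        OMX_CALLBACKTYPE cb = { on_event, on_empty_done, on_fill_done };
        OMX_HANDLETYPE vfpc = NULL;

        if (OMX_Init() != OMX_ErrorNone)
            return 1;

        /* Assumed component name for the indirect-path scaler writeback. */
        if (OMX_GetHandle(&vfpc, "OMX.TI.VPSSM3.VFPC.INDTXSCWB", NULL, &cb)
                != OMX_ErrorNone) {
            fprintf(stderr, "could not get VFPC handle\n");
            OMX_Deinit();
            return 1;
        }

        /* Port configuration (input/output resolution, YUV formats) and the
         * Idle/Executing transitions follow; see the EZSDK OMX examples. */
        OMX_FreeHandle(vfpc);
        OMX_Deinit();
        return 0;
    }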

  • Hi Hardik,

    Ok.

    Let me summarize this discussion thread.

    What I understand about color space conversion on the capture and display side is that the video decoder chip (i.e. SII9135) and video encoder chip (i.e. SII9134) take care of RGB/YUV data in/out for capture and physical display purposes only.

    The Netra processor doesn't support internal color space conversion in any way.

    Thanks,

    Sweta

  • It is done by the VENC of Netra, but not at the pipeline level, at the VENC level. So an external chip like the SII9135 or SII9134 can receive RGB or YUV.

     

    Regards,

    Hardik Shah

  • Sweta,

     

    Hardik is right here; even if CSC is available on the display side, the input of the pipe is always YUV. This CSC is available to support different external devices.

    This is what happens when we ask questions like whether CSC is available on display/capture. Please set the context in the question correctly.

     

    Thanks,

    Brijesh Jadav