Hello,
I'd like to render Qt graphics on top of video from GStreamer on an OMAP4460. I've previously done this on earlier OMAPs and DaVinci devices, but am struggling to understand how this is best achieved on OMAP4 with Ubuntu.
I understand that there are three ways of doing this; can you confirm/correct the following:
1) Using X11, in which case PVR/SGX is used by X for rendering, resizing and compositing (rather than the DSS).
2) Without X, in which case the Qt application could render via KMS and a GStreamer pipeline could render via kmssink. I understand that the CRTCs are connected to the DSS overlays, in which case the 'connector' parameter of kmssink could select one overlay and the Qt application could use another. Can the DRM/KMS API be used to help with blending/resizing? Are there any examples of this? (This would use the DSS.)
3) Without omapdrm. In this case a GStreamer pipeline can be used to render video via v4l2sink to video0, and the Qt application can render to fbdev fb0. The existing omapfb/omap_vout ioctls and sysfs controls can be used to scale the DSS video overlay and blend between the two.
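For option 2, something like the following is what I have in mind; this is only a sketch, and the kmssink property names (driver-name, plane-id, connector-id, as in the upstream GStreamer kmssink) plus the id values are assumptions on my part and may differ on a TI kernel:

```shell
# Hypothetical sketch of option 2: video on one DSS overlay via kmssink,
# with Qt on another plane via KMS. The plane-id/connector-id values
# below are illustrative, not real ids from an OMAP4 board.
gst-launch-1.0 videotestsrc ! video/x-raw,width=640,height=480 ! \
    kmssink driver-name=omapdrm plane-id=26 connector-id=30
```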
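For option 3, a minimal sketch of what I'm imagining is below; the device node and sysfs paths are assumptions based on the mainline omapfb/omapdss drivers and will vary by kernel version and board configuration:

```shell
# Hypothetical sketch of option 3. Qt renders to /dev/fb0 (e.g. via the
# linuxfb platform), while video goes to the omap_vout V4L2 output node:
gst-launch videotestsrc ! v4l2sink device=/dev/video0

# Enable alpha blending on the manager driving both pipes, and enable the
# video overlay; paths assume the mainline omapdss sysfs layout:
echo 1 > /sys/devices/platform/omapdss/manager0/alpha_blending_enabled
echo 1 > /sys/devices/platform/omapdss/overlay1/enabled
```

Scaling/positioning of the video overlay would then presumably be done through the omap_vout VIDIOC_S_CROP/VIDIOC_S_FMT ioctls, though I may have the mechanism wrong.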
I'd like to use the third approach. If this is correct, then:
1) Does this work?
2) Are there any examples available that demonstrate this?
3) Are there any performance issues in connecting omx_camera to v4l2sink?
Thanks,
Andrew Murray