To keep things simple, take a media player as an example.
It contains a GUI with several windows, and each window has a live video playing in it.
The GUI is in the graphics pipeline, which is the Qt output.
The video is in the video pipeline, which, in my opinion, is either decoded from a network stream or comes from a video capture input.
How should this be designed with EZSDK/OpenMAX?
PS: In the video conference reference design, the GUI is a static picture, which, in my opinion, is a different case from mine.