
Stream screen image to a client unit - how to get acceleration from HW

We are trying to stream the screen image from a server unit to a client unit. We have a few questions:

1) Is there a composite frame buffer in memory? If yes, how do we (or can we) access it? (By composite frame buffer we mean a contiguous area of memory that is mapped one-to-one with the pixels of the display.)

2) Could we get access to the low-level code that supports the X server function XGetImage()?

3) Please confirm that the only input format for the hardware H.264 encoder is YUV. We would like RGB.

4) Can the hardware H.264 encoder work with an input resolution of 1024x768?

5) Is there a hardware component (OMX) that can convert an RGB buffer to a YUV buffer?  If not, is there an optimized software component?

6) Our goal is to use the H.264 hardware encoder to encode the contents of the composite frame buffer.  We would like to do this using as little of the main CPU as possible.  What is the best route to take?

  • We are trying to stream the screen image from a server unit to a client unit. We have a few questions:

    1) Is there a composite frame buffer in memory? If yes, how do we (or can we) access it? (By composite frame buffer we mean a contiguous area of memory that is mapped one-to-one with the pixels of the display.)

    I assume you are using the EZSDK on the DM8148. You would have to hack the Linux kernel display driver to capture the composited video buffer before it is sent out to the display. You would then pass this buffer pointer (or, more likely, a copy of the buffer) to the "encode" OMX component, or you could use the gst encoder, and you would stream the encoded frames using gst or the RTP OMX component. A user-space capture sketch follows below.

    You can contact Consilient Tech at http://consilient-tech.com/
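
    Here is what such a user-space capture could look like, as a minimal sketch assuming the composited output is exposed through the standard Linux fbdev interface as /dev/fb0. The display pipeline may not actually route the fully composited image through fbdev, which is exactly why the kernel-driver modification described above may still be necessary.

    ```c
    /* Minimal fbdev capture sketch. Assumption: /dev/fb0 carries the
     * fully composited display image, which may not hold on the DM8148. */
    #include <fcntl.h>
    #include <linux/fb.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = open("/dev/fb0", O_RDWR);
        if (fd < 0) { perror("open /dev/fb0"); return 1; }

        struct fb_var_screeninfo var;
        struct fb_fix_screeninfo fix;
        if (ioctl(fd, FBIOGET_VSCREENINFO, &var) < 0 ||
            ioctl(fd, FBIOGET_FSCREENINFO, &fix) < 0) {
            perror("FBIOGET_*SCREENINFO");
            close(fd);
            return 1;
        }

        /* One visible frame: yres lines of line_length bytes each. */
        size_t frame_bytes = (size_t)var.yres * fix.line_length;
        unsigned char *fb = mmap(NULL, frame_bytes, PROT_READ, MAP_SHARED, fd, 0);
        if (fb == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

        /* Copy the pixels out so the encoder sees a stable snapshot
         * rather than a frame the display is still updating. */
        unsigned char *snapshot = malloc(frame_bytes);
        if (snapshot) {
            memcpy(snapshot, fb, frame_bytes);
            printf("captured %ux%u, %u bpp, stride %u bytes\n",
                   var.xres, var.yres, var.bits_per_pixel, fix.line_length);
        }

        free(snapshot);
        munmap(fb, frame_bytes);
        close(fd);
        return 0;
    }
    ```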

    2) Could we get access to the low-level code that supports the X server function XGetImage()?

    We don't support X11 in the EZSDK.

    3) Please confirm that the only input format for the hardware H.264 encoder is YUV. We would like RGB.

    Right now we only support the YUV format for the hardware H.264 encoder, and there are no plans to support RGB input. Third parties (3P) such as Consilient Tech have written color space conversion (CSC) routines that convert the output of H.264 from YUV to RGB. Please contact them.

    4) Can the hardware H.264 encoder work with an input resolution of 1024x768?

    Yes.

    5) Is there a hardware component (OMX) that can convert an RGB buffer to a YUV buffer?  If not, is there an optimized software component?

    We don't have one. You could use GStreamer to do this conversion, though it would run in software and may not be fast enough. One of the third parties mentioned above would be able to accelerate this on the DSP. A plain-C version of the conversion is sketched below to show what is involved.
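
    As a rough illustration of the software path, the sketch below converts packed RGB888 to NV12 (a full-resolution Y plane followed by an interleaved, half-resolution UV plane), which is a common input layout for H.264 encoders, using integer BT.601 coefficients. The RGB byte order, the NV12 layout, and the chroma subsampling (top-left sample of each 2x2 block) are all assumptions to be adjusted to the encoder's actual requirements.

    ```c
    #include <stdint.h>

    /* Software RGB888 -> NV12 conversion using integer BT.601 coefficients.
     * Width and height are assumed even. On the DM8148 this loop runs on
     * the ARM and is unlikely to keep up at video rates, which is why a
     * DSP-accelerated CSC (see above) is preferable. */
    void rgb888_to_nv12(const uint8_t *rgb, uint8_t *y_plane, uint8_t *uv_plane,
                        int width, int height)
    {
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {
                const uint8_t *p = rgb + 3 * (j * width + i);
                int r = p[0], g = p[1], b = p[2];

                /* Luma for every pixel. */
                y_plane[j * width + i] =
                    (uint8_t)(((66 * r + 129 * g + 25 * b + 128) >> 8) + 16);

                /* Chroma once per 2x2 block, U and V interleaved. */
                if ((j % 2 == 0) && (i % 2 == 0)) {
                    uint8_t *uv = uv_plane + (j / 2) * width + i;
                    uv[0] = (uint8_t)(((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128);
                    uv[1] = (uint8_t)(((112 * r - 94 * g - 18 * b + 128) >> 8) + 128);
                }
            }
        }
    }
    ```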

    6) Our goal is to use the H.264 hardware encoder to encode the contents of the composite frame buffer.  We would like to do this using as little of the main CPU as possible.  What is the best route to take?

    You would hack into the kernel display driver to capture the video being sent to the display, as explained in (1) above; a rough per-frame data-flow sketch follows below.
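
    Putting (1) and (5) together, the per-frame data flow would look roughly like the sketch below. Here encode_nv12_frame() is a hypothetical placeholder for whatever wraps the OMX H.264 encoder, not a real EZSDK API. Doing the snapshot copy and the color conversion on the ARM, as shown, works against the low-CPU goal; the sketch only illustrates the order of operations, while the low-CPU route remains kernel-side capture with CSC and encoding offloaded to the DSP and hardware as described above.

    ```c
    #include <stdint.h>
    #include <stdlib.h>

    /* From the RGB -> NV12 sketch in (5). */
    void rgb888_to_nv12(const uint8_t *rgb, uint8_t *y_plane, uint8_t *uv_plane,
                        int width, int height);

    /* Hypothetical stand-in for the OMX H.264 encoder wrapper -- not a real
     * EZSDK/OMX call; replace with your actual encoder interface. */
    int encode_nv12_frame(const uint8_t *y, const uint8_t *uv,
                          int width, int height);

    /* fb points at a stable snapshot of a composited RGB888 frame,
     * captured as in (1). */
    int encode_one_frame(const uint8_t *fb, int width, int height)
    {
        uint8_t *y  = malloc((size_t)width * height);      /* Y plane  */
        uint8_t *uv = malloc((size_t)width * height / 2);  /* UV plane */
        if (!y || !uv) { free(y); free(uv); return -1; }

        rgb888_to_nv12(fb, y, uv, width, height);          /* ARM-side CSC */
        int rc = encode_nv12_frame(y, uv, width, height);  /* hand to HW   */

        free(y);
        free(uv);
        return rc;
    }
    ```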