Dual frame buffer on Blaze tablet

Hi,

I want to achieve a dual frame buffer setup on the Blaze tablet, i.e. I want a video to be played on HDMI while I browse simultaneously on the tablet. Is it enough to just enable docking mode in Android, or do I need to make any driver-specific changes? If not, are there any other properties I should set in Android to achieve this? I am using this release for achieving this: http://www.omappedia.com/wiki/4AI.1.4-P1_OMAP4_Icecream_Sandwich_Release_Notes.

Regards

Haran

  • Hi Haran,

    In docking mode, the ICS HWC first checks all layers whose gralloc EXTERNAL DISPLAY usage bit is set, then picks the layer with the largest resolution and redirects it to the HDMI TV; the other layers are rendered on the primary LCD.

    As long as you satisfy the above criteria, it should meet your requirements.
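
    For illustration only, the selection could look roughly like the self-contained sketch below; the struct, the function name and the usage-bit value are placeholders for this example, not the actual hwc.c symbols:

    #include <stddef.h>

    /* assumed value for illustration; use the real gralloc.h define */
    #define GRALLOC_USAGE_EXTERNAL_DISP 0x00002000u

    struct layer {                      /* stand-in for the real HWC layer/handle */
        unsigned int usage;             /* gralloc usage flags of the buffer */
        unsigned int width, height;     /* buffer resolution */
    };

    /* Return the index of the layer to route to HDMI, or -1 to keep all on LCD. */
    static int pick_docking_layer(const struct layer *layers, size_t n)
    {
        int best = -1;
        unsigned int best_area = 0;

        for (size_t i = 0; i < n; i++) {
            if (!(layers[i].usage & GRALLOC_USAGE_EXTERNAL_DISP))
                continue;                     /* not marked for the external display */
            unsigned int area = layers[i].width * layers[i].height;
            if (area > best_area) {           /* keep the largest-resolution candidate */
                best_area = area;
                best = (int)i;
            }
        }
        return best;
    }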

    Thanks & Best Regards,

    Venkat

  • Hi Venkat,

    Can you please tell me how to set the docking mode? Is the command below enough, or should I also give other commands, such as disabling the cloning mode? And if I only enable docking mode, will the above condition be satisfied?

    setprop persist.hwc.docking.enabled "1"

    Regards

    Haran

  • Hi Haran,

    As per my understanding, the docking mode should be the default mode. In any case, when you do the following steps, it will mirror only video layers to HDMI and all GFX layers will stay on the LCD.

    stop

    setprop persist.hwc.docking.enabled "1"

    start

    I would recommend disabling cloning as well; please try it out and share your observations.

    Thanks & Best Regards,

    Venkat

  • Hi Venkat,

    Can you please give a brief explanation of these properties and what exactly each one does? There is no documentation explaining them, and it would help us get a good hold on them so we can experiment.

    Thanks Haran

  • Hi,

    Can you please point me to a video demo of the Blaze tablet using dual frame buffers? I was not able to find one anywhere. Also, do we need to do anything with the overlays, or is setting these properties enough? As far as I have noticed, there are no significant changes when I set these properties. Can I try with the prebuilts given on omappedia with these properties set? Please confirm.

    Regards

    Haran

  • Hi Venkat,

    I tried what you suggested, i.e. I did this:

    stop

    setprop persist.hwc.docking.enabled "1"

    setprop persist.hwc.cloning.enabled "0"

    start

    and then I tried to play a 1080p video, but the video gets played on both the LCD and HDMI, so I could not achieve what I wanted. For HDMI it actually selects a resolution of 1920x1080, but the video still plays on both the LCD and HDMI, which is not the behavior you described, Venkat. Is there anything else you want me to try out?

    Regards

    Haran

  • Hi Haran,

    Is it possible for you to modify the Hardware Composer layer so that only video layers are redirected to the HDMI TV? All other layers need to stay on the LCD. You can check for the GRALLOC usage bit for EXTERNAL DISPLAY, which needs to be set only for video layers.
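
    Something along these lines, purely as an illustration with stand-in names: NV12 is assumed here to be the format that identifies a video layer, and the real hwc.c types are different.

    /* assumed values for illustration only */
    #define FORMAT_NV12                 0x100u        /* placeholder for the TI NV12 pixel format */
    #define GRALLOC_USAGE_EXTERNAL_DISP 0x00002000u

    struct buffer {
        unsigned int format;   /* pixel format of the gralloc buffer */
        unsigned int usage;    /* gralloc usage flags */
    };

    /* Decide whether a layer should be redirected to the HDMI TV. */
    static int route_to_hdmi(const struct buffer *b)
    {
        if (b->format != FORMAT_NV12)
            return 0;          /* GFX/UI layers stay on the LCD */
        /* only NV12 (video) buffers carrying the external-display bit go to HDMI */
        return (b->usage & GRALLOC_USAGE_EXTERNAL_DISP) != 0;
    }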

    Thanks & Best Regards,

    Venkat

  • Hi Haran,

    The HWC can mirror a visible layer from your primary display to the secondary display, or clone your display.

    The problem with launching the Browser on the first display while watching a video on the second display is that your media player becomes a background task, fully covered by the Browser window, and as such it will be excluded from composition by SurfaceFlinger.

    If you don't want to make heavy changes in the Android framework itself, you may consider playing the video on the second display outside the Android framework.

    To achieve this you may need to disable the Android HWC, TI Stagefright and the kernel DSS composer, and use TI's GStreamer for playback.
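
    As a minimal illustration of driving the second display from outside the framework (assuming HDMI is exposed as /dev/graphics/fb1 on this release; the node name may differ on other builds), a native program can write to the framebuffer directly:

    #include <fcntl.h>
    #include <linux/fb.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = open("/dev/graphics/fb1", O_RDWR);   /* assumed HDMI framebuffer node */
        if (fd < 0) { perror("open fb1"); return 1; }

        struct fb_var_screeninfo var;
        struct fb_fix_screeninfo fix;
        if (ioctl(fd, FBIOGET_VSCREENINFO, &var) < 0 ||
            ioctl(fd, FBIOGET_FSCREENINFO, &fix) < 0) {
            perror("ioctl"); close(fd); return 1;
        }

        size_t size = (size_t)var.yres_virtual * fix.line_length;
        unsigned char *fb = mmap(NULL, size, PROT_READ | PROT_WRITE,
                                 MAP_SHARED, fd, 0);
        if (fb == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

        memset(fb, 0xFF, size);   /* fill the HDMI framebuffer with white as a test */
        printf("fb1: %ux%u, %u bpp\n", var.xres, var.yres, var.bits_per_pixel);

        munmap(fb, size);
        close(fd);
        return 0;
    }

    A real player would of course decode frames and blit them into this mapping instead of filling it with a constant.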

    Br,

    Zoltan

  • Hi Zoltan,

    Thanks for the reply. I have gone through GStreamer, downloaded the source and compiled it as per the instructions given. My queries are: do I have to compile GStreamer specifically for ARM? How can I integrate this stack into Android? Will GStreamer not use the Android framework for playing any content? Are you saying that whenever content is played through GStreamer it will always use HDMI and not the LCD?

    Regards,

    Haran.

    A note on the following: I haven't actually tried this in practice, but it should be possible.

    ICS restricts you to a single VideoView object for video playback, because only one HW surface is left available. However, it is possible to use a GL context and an SGX surface to show the second video, or both of them. In your case, since you are using two screens, it could simply be a matter of redirecting the SGX configuration to the correct display and using this approach.

    This is in theory; I don't know what changes you have already made.

  • Hi Manuel,

    Can you please tell me exactly what I should do? I didn't understand much of your explanation.

    Regards

    Haran

    Sure. When VideoView or MediaPlayer is used, you need to set a surface renderer for the video to play on, and by default the HW surface is selected. The limitation is that there is only one such surface left to be used when playing a video, which limits you to running a single instance at a time.

    When I tried to use the SW renderer, the video was not visible for some reason. I still need to investigate this, probably by playing with some of the properties, but it will not happen in the next few days. However, I heard that by using an SGX surface texture as the renderer, instead of the HW renderer, it is possible to play two videos at the same time.

    With your changes for two displays, I imagine you could create a surface on either of them by passing the target display as a parameter, use it for one player instance, then create another surface on the other display the same way and assign it to the other video player.

  • Hi Manuel,

    Thanks for the reply. Is it possible to change the hwc file in the HAL layer so that only the video layer is routed to HDMI, i.e. the video gets rendered on HDMI while I am able to browse on the LCD? If so, can you please give some more information on how exactly to do it? I just want a video to play on HDMI while I browse on my LCD.

    Regards

    Haran

    From your description, this must be docking mode, where the video is shown on the HDMI output. To disable it:

    stop

    setprop persist.hwc.docking.enabled 0

    start

    or set it to 1 to enable it. This property is used in hardware/ti/.

    Then on HDMI you should see video + UI.

    Similar information was shared before in

    http://e2e.ti.com/support/omap/f/849/p/196555/701171.aspx#701171

    So the answer you are looking for could be to add it to the second framebuffer; what I can think of is to assign it the other surface instead of the video layer.

  • Hi Manuel,

    I have actually gone through the hwc file, and as far as I can see, what Venkat described is already satisfied in docking mode, i.e. only one video layer is given to HDMI and the other layers are retained on the LCD. So what other changes have to be made so that the video gets played on HDMI while I am able to browse on the LCD display?

    Regards

    Haran

  • Hi Zoltan,

    I was unable to port GStreamer to Android. Can you please guide me, or suggest some other way to achieve this?

    Regards

    Haran

  • Hi Haran,

    Could you please let me know: when you open the Gallery to play back a 1080p video clip in docking mode, does it play on the HDMI TV or on the LCD panel? If it starts playing on the HDMI TV, what is the exact behavior when you then launch the browser? Does the browser get redirected to HDMI?

    As far as I understand, in docking mode, when multiple layers have the EXTERNAL_DISP gralloc bit set, the layer with the largest resolution gets routed to the external display.

    Thanks & Best Regards,

    Venkat

  • Hi Venkat,

    When I play a 1080p video, it gets played on both the LCD and HDMI in docking mode. To set docking mode I do the following:

    stop

    setprop persist.hwc.docking.enabled "1"

    setprop persist.hwc.cloning.enabled "0"

    start

    Is this the correct way to enable docking mode, or should I set some other properties as well?

    Regards

    Haran

  • Hi Haran,

    Thanks for clarifying my queries. Let me try to check that out locally as well and get back to you on this with further updates after some experiments.

    Best Regards,

    Venkat

  • Hi Venkat,

    Sure Thanks.:)

    Regards

    Haran

  • Hi Venkat,

    Any update on this?

    Regards

    Haran

  • Hi Haran,

    Here are my observations and suggestions. I tried this scenario and it matches your observations exactly: the video gets played back on both the LCD and HDMI. In fact, not just the Gallery app but any other app gets cloned on both LCD and HDMI while playing.

    To avoid cloning in general, one way is to set the layer properties of the app you want to show only on HDMI, with proper checking in the HWC layer and setting of the GRALLOC_BIT_EXTERNAL_DISPLAY.

    Alternatively, the zorder of overlay1, which is showing the video on the LCD, can be set higher (say 3), whereas the zorder of overlay0 can be set to '0'. This cannot be done using sysfs commands from the command prompt, as the values get overwritten every frame; you need to hardcode it in the hwc layer.
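
    For example (illustration only; the structure below is a stand-in for whatever overlay configuration the hwc code actually fills in, not the real dsscomp types), the idea would be to force the zorders at that point:

    struct ovl_cfg {
        int id;        /* 0 = GFX/overlay0, 1 = VID1/overlay1, ... */
        int zorder;    /* 0 = bottom-most, 3 = top-most */
    };

    /* Force the video overlay on top of the cloned GFX overlay. */
    static void force_docking_zorders(struct ovl_cfg *ovls, int n)
    {
        for (int i = 0; i < n; i++) {
            if (ovls[i].id == 1)
                ovls[i].zorder = 3;   /* overlay1 (video): top-most */
            else if (ovls[i].id == 0)
                ovls[i].zorder = 0;   /* overlay0 (GFX clone): bottom-most */
        }
    }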

    The above modification would be sufficient to avoid the cloning of the app on both LCD and HDMI but not sufficient to fulfill your requirements for Video Playback on TV and Browser on LCD simultaneously.

    As you know, Android does not inherently support running multiple applications in the foreground at the same time. That is, when another application, say Gallery, is launched while the browser is open, the browser gets paused and goes into the background.

    First you need to extend the android framework to enable support for multiple applications at the same time.

    One way of doing this is to extend support for multiple frame buffer devices, /dev/fb0 and /dev/fb1, with applications having the ability to select which frame buffer device to render their content to. This requires major modifications to the kernel, the file system and the GFX library.

    Another way to achieve your requirement is to run the movies from a command line application whose overlay parameters can be controlled to point to the HDMI TV, and continue with the other application, such as the browser, on the LCD. You need to be careful about resource management in this approach in case you are also trying to launch video playback from apps.

    So, in summary, this does not seem to be a simple feature that can be achieved by turning some of the Android layer properties on or off. It is more of a new feature requirement that demands major modifications of the Android framework, as it does not inherently support true multitasking, i.e. simultaneous updating of two different applications.

    Thanks & Best Regards,

    Venkat

  • Hi Venkat,

    Thanks for the reply. We are actually trying to follow the command-line video playback approach. Do you have any suggestions on which application to use to play a video?

    Regards

    Haran

  • Haran,

    You can play back a video from the command line with a command like:

    am start -n com.android.gallery3d/.app.MovieActivity -d /<path>/<movie name>

    (where you have updated the path and movie name accordingly).

    Regards,

    Gina

    There is a command-line utility for Stagefright too:

    ./mydroid/frameworks/base/cmds/stagefright/stagefright.cpp

    static void usage(const char *me) {
        fprintf(stderr, "usage: %s\n", me);
        fprintf(stderr, "       -h(elp)\n");
        fprintf(stderr, "       -a(udio)\n");
        fprintf(stderr, "       -n repetitions\n");
        fprintf(stderr, "       -l(ist) components\n");
        fprintf(stderr, "       -m max-number-of-frames-to-decode in each pass\n");
        fprintf(stderr, "       -b bug to reproduce\n");
        fprintf(stderr, "       -p(rofiles) dump decoder profiles supported\n");
        fprintf(stderr, "       -t(humbnail) extract video thumbnail or album art\n");
        fprintf(stderr, "       -s(oftware) prefer software codec\n");
        fprintf(stderr, "       -r(hardware) force to use hardware codec\n");
        fprintf(stderr, "       -o playback audio\n");
        fprintf(stderr, "       -w(rite) filename (write to .mp4 file)\n");
        fprintf(stderr, "       -k seek test\n");
        fprintf(stderr, "       -x display a histogram of decoding times/fps "
                        "(video only)\n");
        fprintf(stderr, "       -S allocate buffers from a surface\n");
        fprintf(stderr, "       -T allocate buffers from a surface texture\n");
        fprintf(stderr, "       -d(ump) filename (raw stream data to a file)\n");
    }

  • Hi,

    Thanks for your replies, but I want a command-line video player that satisfies this requirement:

    Run the movies from a command line application whose overlay parameters can be controlled to point to HDMI TV and continue with the other application such as browser on the LCD device.

    Regards

    Haran

  • Hi Haran,

    Did you try using sysfs commands to control the overlay parameters as mentioned in the link below? Please check whether these commands help you.

    http://omappedia.org/wiki/Bootargs_for_enabling_display#Set_overlay_properties

    Thanks & Best Regards,

    Venkat

  • Hi Venkat,

    I have played around with it. What I did was disable the overlay1 manager, and when I play a video it gets displayed on HDMI but not on the LCD; however, only a status bar is shown on the LCD. What I wanted was to simultaneously browse on my LCD, so this did not satisfy my requirements. Is there anything else I can do along these lines to achieve them?

    Regards

    Haran

  • Hi Haran,

    Essentially, you should change the manager of the overlay being used by the command line application to the HDMI TV. The GFX overlay that is rendering the Home Screen or Launcher should remain enabled with its manager pointing to the LCD.
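
    For reference, re-pointing an overlay's manager through the omapdss sysfs nodes looks roughly like the sketch below. Whether overlay1 is the overlay your player actually uses is an assumption; please confirm from the sysfs outputs first.

    #include <stdio.h>

    static int sysfs_write(const char *path, const char *val)
    {
        FILE *f = fopen(path, "w");
        if (!f) { perror(path); return -1; }
        fputs(val, f);
        fclose(f);
        return 0;
    }

    int main(void)
    {
        const char *ovl = "/sys/devices/platform/omapdss/overlay1";
        char path[128];

        /* an overlay must be disabled before its manager can be changed */
        snprintf(path, sizeof(path), "%s/enabled", ovl);
        sysfs_write(path, "0");

        snprintf(path, sizeof(path), "%s/manager", ovl);
        sysfs_write(path, "tv");          /* the "tv" manager drives HDMI on OMAP4 DSS */

        snprintf(path, sizeof(path), "%s/enabled", ovl);
        sysfs_write(path, "1");
        return 0;
    }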

    Could you please share with me the following outputs when you are playing the video from command line application?

    cat /sys/devices/platform/omapdss/overlay0/*

    cat /sys/devices/platform/omapdss/overlay1/*

    cat /sys/devices/platform/omapdss/overlay2/*

    cat /sys/devices/platform/omapdss/overlay3/*

    cat /sys/devices/platform/omapdss/display0/*

    cat /sys/devices/platform/omapdss/display1/*

    cat /sys/devices/platform/omapdss/display2/*

    Thanks & Best Regards,

    Venkat

  • Hi Venkat,

    Sorry for the late reply; I have been trying a few things these past days. I was actually able to achieve a dual frame buffer setup on a minimal file system by using mplayer, with two different pieces of content pumped into the two different frame buffers. So I thought of using the same mplayer to pump data directly to the frame buffer, without going through the Android framework. Now I am facing some problems. When I try to use mplayer to pump data onto frame buffer 1 (after some minor changes in the mplayer source code) and play a video after booting Android, instead of drawing the contents to HDMI it renders on the LCD, and even that is not fully rendered: a compressed video is drawn, visible only in the upper part of the screen where the time and status items are, maybe only the first three lines at the top. I suspect this is an overlay issue, and I have some doubts as well.

    Does Android use frame buffer 1 for rendering data to HDMI? If so, why are there no changes on the HDMI output when I run mplayer configured for fb1?

    Regards

    Haran

  • Hi Venkat,

    I am now actually able to play back a video on HDMI using the mplayer command line on Android while I am browsing on the LCD, but I am facing an issue. Whenever the screen gets refreshed, even though the video is playing, the HDMI output switches between the GUI and the video. So I thought of disabling cloning whenever the video is played, but when I disable cloning I am not able to see anything on HDMI at all. Can you please suggest a way for the video to play uninterrupted on HDMI, without switching screens, while I browse on my LCD?

    Regards

    Haran

  • Hi Haran,

    It is good to see that you are now able to play video using mplayer from the command line. The problem seems to be the zorder of the overlays rendering the GUI and the video on HDMI. One way to address this is to ensure that the GFX overlay (overlay0), which is cloning the GUI onto HDMI, gets a lower zorder, whereas the overlay rendering your video gets the highest zorder.

    Without disabling cloning, can you check by fixing the zorders of all overlays, with the highest zorder for the overlay rendering the video? As per my understanding, a zorder value of 3 is the highest and '0' is the lowest.

    Thanks & Best Regards,

    Venkat

  • Hi Venkat,

    Thanks for the reply. I could not figure out where exactly to hardcode the zorder. Can you help me with this so that we can close this issue?

    Regards

    Haran

  • Hi Haran,

    Did you try changing the zorder using the sysfs commands? Can you also check the output of the following sysfs commands when you see the video going to the background?

    cat /sys/devices/platform/omapdss/overlay0/zorder

    cat /sys/devices/platform/omapdss/overlay1/zorder

    cat /sys/devices/platform/omapdss/overlay2/zorder

    cat /sys/devices/platform/omapdss/overlay3/zorder

    To change the zorder of any overlay, you can use the sysfs command as follows:

    echo '3' > /sys/devices/platform/omapdss/overlay3/zorder

    echo '0' > /sys/devices/platform/omapdss/overlay0/zorder

    The overlay with the highest zorder will be the topmost layer, whereas the overlay with the lowest zorder will be the bottommost.

    Please share the above information first so we can verify whether the issue is due to the zorder changing. Then we can try to hardcode the zorder in the kernel code.

    Thanks & Best Regards,

    Venkat

  • Hi Haran,

    Did you try changing the zorder? Did it help in overcoming the problem?

    Please let me know if you need pointers in regards to setting zorder within the kernel code itself.

    Thanks & Best Regards,

    Venkat

  • Hi Venkat,

    Thanks for the reply. Yes, I would like to know where exactly to hardcode it in the kernel. I tried changing it dynamically the way you suggested, but the value does not stay fixed at all, i.e. every time the screen refreshes the zorder is changed.

    Regards

    Haran

  • Hi Haran,

    Yes, that is expected since the hardware composer would impose default values upon screen refresh.

    You have the following two options.

    1. You can either modify the Hardware Composer hwc.c ($MYDROID/hardware/ti/omap4xxx/hwc) to restrict the zorder of the cloned overlay on HDMI to less than that of the video overlay which your command line application is using.

    2. Hardcode the zorder of the overlays, giving the highest zorder to the overlay used by your command line application and lower values to the others. If VID2 is used by your application, its overlay zorder can be set to 3 and the rest should be lower than 3.

    You can hard code these values in the configure_overlay API of the kernel code in the file manager.c (drivers/video/omap2/dss).

    You can modify the following statement with a check for each plane or overlay, using a specific value as per your needs. This call essentially performs the DISPC register writes for each plane/overlay's attributes. Plane 0 => GFX, plane 1 => VID1, plane 2 => VID2, plane 3 => VID3:

    dispc_set_zorder(plane, c->zorder);
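
    As a rough sketch only (assuming your command line player ends up on VID3; adjust the table if it actually uses a different overlay), the call could be replaced with fixed, unique zorders per plane:

    /* In configure_overlay() in drivers/video/omap2/dss/manager.c -- sketch only.
     * Pin each plane to a fixed, unique zorder instead of the per-frame value,
     * so the overlay carrying the command line video always stays on top. */
    static const u8 fixed_zorder[4] = {
            0,      /* plane 0: GFX  -> bottom-most              */
            1,      /* plane 1: VID1                             */
            2,      /* plane 2: VID2                             */
            3,      /* plane 3: VID3 -> top-most (assumed video) */
    };

    if (plane < 4)
            dispc_set_zorder(plane, fixed_zorder[plane]);
    else
            dispc_set_zorder(plane, c->zorder);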

    I hope this should help you resolve your issue.

    Thanks & Best Regards,

    Venkat

  • Hi Venkat,

    Actually it did not help, since Android also uses VID3; even though I increase the zorder of VID3, the screen still flickers and the desired output could not be achieved... :(

    Regards

    Haran

  • Hi Haran,

    Could you please tell me which overlay is being used by mplayer while you are playing video from the command line? And what about Android, which overlays is it using?

    Please provide me the outputs of the following sysfs commands for further suggestions.

    cat /sys/devices/platform/omapdss/overlay0/*

    cat /sys/devices/platform/omapdss/overlay1/*

    cat /sys/devices/platform/omapdss/overlay2/*

    cat /sys/devices/platform/omapdss/overlay3/*

    One other thing you can try is to always disable the overlay used for cloning, directly at the kernel level.

    Thanks & Best Regards,

    Venkat

  • Hi Haran,

    Did you get a chance to collect the above sysfs command outputs?

    How about disabling the overlays in the kernel? Did it help?

    Thanks & Best Regards,

    Venkat

    Alternatively, you could remove the hwc*.so from the device (or disable it at build time) and build the kernel with DSSCOMP disabled. You will then get full control over the overlays using sysfs. This works on my Pandaboard.

    Zoltan

  • Hi Venkat,

    Sorry, I have not tried it out yet. I will try it soon and get back to you.

    Regards

    Haran

  • Hi Haran,

    Sure, please let me know once you have all the outputs collected.

    You could also try Zoltan's suggestion, but I am not sure what happens to the primary display, which is driven by the HWC rather than the command line application; I have not tried it myself.

    Thanks & Best Regards,

    Venkat