This thread has been locked.

If you have a related question, please click the "Ask a related question" button in the top right corner. The newly created question will be automatically linked to this question.

Linux/DRA745: how is video data rendered?

Part Number: DRA745
Other Parts Discussed in Thread: DRA75

Tool/software: Linux

hi,

Our project is based on the 6AM1.2 release.

I am wondering how video data is rendered to the display panel on the DRA75 platform.

Perhaps like this: DDR buffer (video data stored here) --> SurfaceFlinger --> HWC (decides whether to compose via GPU or DSS) --> DSS VIDx pipeline --> DSS WB pipeline --> framebuffer --> display encoders --> panel.

My questions are:

1. How does the HWC (or an application) communicate with the DSS driver, and in which directory is this communication code located? The more detail, the better.

2. I want to bypass SurfaceFlinger and pass video data directly to the DSS to reduce rendering time. Is this possible?

BR,

Sinkeu

  • Hi Sinkeu,

    -1-
    If you are looking for the TI HWC implemenation for DRA7x, you can find it at [1]. The version info can be found in the release manifest [2]. As you noted, HWC will determine whether to compose a layer through GPU or DSS. Not that DSS composition is completely enabled only in 6AM1.3 release. If you wish to use DSS composition, you would need to migrate to 6AM1.3 release.

    -2-
    The standard way to post content onto the display is through the Android activity life cycle, either via Java or the NDK. Bypassing SurfaceFlinger would mean breaking away from the framework APIs, so we cannot recommend that approach.

    For standalone native applications, you can refer to omapdrmtest [3] to see how the DRM APIs are used to post content to the display.
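The call sequence a standalone DRM client follows can be sketched as below. The `drm_*` helpers here are hypothetical stand-ins that only record the order of operations; the real libdrm calls they model are named in the comments.

```python
# Conceptual sketch of the legacy DRM/KMS flow used to post a buffer to
# the display. The drm_* functions are illustrative stand-ins, NOT libdrm.
calls = []

def drm_open(path):
    calls.append("open")          # models open("/dev/dri/card0", O_RDWR)
    return 0                      # pretend file descriptor

def drm_add_fb(fd, buf):
    calls.append("addfb")         # models drmModeAddFB(): wrap the DDR
    return 1                      # buffer in a framebuffer object, get an id

def drm_set_crtc(fd, fb_id):
    calls.append("setcrtc")       # models drmModeSetCrtc(): attach the
                                  # framebuffer to a CRTC for scanout

fd = drm_open("/dev/dri/card0")
fb = drm_add_fb(fd, b"decoded video frame in DDR")
drm_set_crtc(fd, fb)              # buffer now scans out to the panel
print(calls)                      # -> ['open', 'addfb', 'setcrtc']
```

A real client would also enumerate connectors and modes with drmModeGetResources() before the modeset, and would page-flip per frame rather than re-running SetCrtc; omapdrmtest shows the full version.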

    Thanks,
    Gowtham

    [1]: git.omapzoom.org/
    [2]: git.omapzoom.org/
    [3]: git.ti.com/.../master