
Explanation of Video Interface

Hi,

Is there a document available, other than the demos, that explains what needs to be done to implement some streaming capability? The examples use functionality specific to DaVinci, and I would like to know what the function arguments, etc., mean.

Is there somewhere I can take a look at this?

Thanks

 

  • Which platform are you working with (DM355, DM644X, DM6467, DM643X...)? As you can imagine, our DaVinci family includes many parts, some of which are ARM only, DSP only, or a combination of both, and therefore the software has slight variations among platforms.

    That said, if you are referring to the examples in the demos directory, they primarily use two interfaces:

    1) One to talk to the drivers, which usually happens through standardized Linux APIs (e.g. V4L2 and FBDev for video). These standards define common functions such as open, close, read, write... as well as requests (ioctls, since they are issued through the common ioctl function) specific to the area each API serves; a minimal sketch of the V4L2 flow follows this list. The Documentation folder at the root of the Linux kernel source tree covers these APIs. Additionally, since some hardware features such as the resizer have not been standardized into a Linux API yet, we include documentation for features not addressed by the Linux standards under the PSP folder found in the DVSDK root directory (e.g. dvsdk_1_30_00_41/PSP_XX_XX_XX_XX).

    2) The other interface the demos use is the Codec Engine interface, which exercises the multimedia codecs (MPEG2, MPEG4, AAC...); documentation for this interface can be found under the Codec Engine directory (e.g. dvsdk_1_30_00_41/codec_engine_xx_xx_xx_xx). A sketch of the typical call sequence appears at the end of this reply.
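
    To make the driver side in 1) concrete, below is a minimal sketch of opening and configuring a V4L2 capture device. The device node (/dev/video0), resolution, and pixel format are illustrative assumptions and will differ by platform and driver; the demos wrap the same calls in their own helper functions.

        /* Minimal V4L2 capture setup sketch (device node, size and pixel format are assumptions) */
        #include <fcntl.h>
        #include <stdio.h>
        #include <string.h>
        #include <sys/ioctl.h>
        #include <unistd.h>
        #include <linux/videodev2.h>

        int main(void)
        {
            int fd = open("/dev/video0", O_RDWR);           /* capture device node (assumed) */
            if (fd < 0) { perror("open"); return 1; }

            struct v4l2_capability cap;
            if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0) {     /* ask the driver what it supports */
                perror("VIDIOC_QUERYCAP"); close(fd); return 1;
            }

            struct v4l2_format fmt;
            memset(&fmt, 0, sizeof(fmt));
            fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            fmt.fmt.pix.width       = 720;                  /* example D1 size (assumed) */
            fmt.fmt.pix.height      = 480;
            fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_UYVY;    /* example packed YUV format */
            if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) {        /* request the capture format */
                perror("VIDIOC_S_FMT"); close(fd); return 1;
            }

            /* Next steps (not shown): VIDIOC_REQBUFS, mmap the buffers, VIDIOC_STREAMON,
             * then a VIDIOC_QBUF/VIDIOC_DQBUF loop to stream frames. */
            close(fd);
            return 0;
        }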

    I hope this helps.
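
    For the Codec Engine side in 2), the demos roughly follow this call sequence: initialize the runtime, open an engine by name, create a codec instance through a VISA interface (VIDDEC for video decode), run a per-frame process loop, and tear everything down. The engine name "decode" and codec alias "viddec" below are placeholders for whatever your engine configuration (.cfg) defines; treat this as a sketch of the sequence, not the exact demo code.

        /* Codec Engine call-sequence sketch; "decode" and "viddec" are hypothetical
         * names that must match your engine configuration. */
        #include <stdio.h>
        #include <ti/sdo/ce/CERuntime.h>
        #include <ti/sdo/ce/Engine.h>
        #include <ti/sdo/ce/video/viddec.h>

        int main(void)
        {
            Engine_Error ec;

            CERuntime_init();                                   /* one-time runtime initialization */

            Engine_Handle engine = Engine_open("decode", NULL, &ec);
            if (engine == NULL) { printf("Engine_open failed (%d)\n", (int)ec); return 1; }

            /* NULL creation params request the codec's defaults; real code usually
             * fills a VIDDEC_Params structure (max width/height, chroma format...). */
            VIDDEC_Handle dec = VIDDEC_create(engine, "viddec", NULL);
            if (dec == NULL) { printf("VIDDEC_create failed\n"); Engine_close(engine); return 1; }

            /* Per-frame loop (not shown): fill the XDM buffer descriptors and the
             * VIDDEC_InArgs/VIDDEC_OutArgs structures, then call VIDDEC_process(). */

            VIDDEC_delete(dec);
            Engine_close(engine);
            return 0;
        }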

  • I am currently using the DM355 Eval Board. I will take a look at these standards, thanks!