Hi, all!
I am running ICS 4.0.3 on an OMAP4430 and want to use GStreamer with gst-openmax. My command line is:

gst-launch omx_camera mode=1 device=secondary name=cam cam.src ! queue ! surfaceflingersink cam.vidsrc ! queue ! omx_h264enc profile=8 bitrate=1024000 ! filesink location=/mnt/sdcard/ttt --gst-debug=omx:5

I have read the Stagefright code in ICS. Stagefright allocates buffers with OMX_AllocateBuffer and copies data from srcBuffer with a memcpy in OMXCodec::drainInputBuffer(BufferInfo *info). I noticed that it copies only a small amount of data from srcBuffer into the OMX buffer:

memcpy((uint8_t *)info->mData + offset, (const uint8_t *)srcBuffer->data() + srcBuffer->range_offset(), srcBuffer->range_length());

and the copy is at most about 136 bytes. Nevertheless the camera and encoder work correctly, and the camera preview is fine too.
I found OMXNodeInstance::storeMetaDataInBuffers and struct StoreMetaDataInBuffersParams. The comment on the struct explains:
// A pointer to this struct is passed to OMX_SetParameter() when the extension
// index "OMX.google.android.index.storeMetaDataInBuffers"
// is given.
//
// When meta data is stored in the video buffers passed between OMX clients
// and OMX components, interpretation of the buffer data is up to the
// buffer receiver, and the data may or may not be the actual video data, but
// some information helpful for the receiver to locate the actual data.
// The buffer receiver thus needs to know how to interpret what is stored
// in these buffers, with mechanisms pre-determined externally. How to
// interpret the meta data is outside of the scope of this method.
//
// Currently, this is specifically used to pass meta data from video source
// (camera component, for instance) to video encoder to avoid memcpying of
// input video frame data. To do this, bStoreMetaDta is set to OMX_TRUE.
// If bStoreMetaData is set to false, real YUV frame data will be stored
// in the buffers. In addition, if no OMX_SetParameter() call is made
// with the corresponding extension index, real YUV data is stored
// in the buffers.
struct StoreMetaDataInBuffersParams {
    OMX_U32 nSize;
    OMX_VERSIONTYPE nVersion;
    OMX_U32 nPortIndex;
    OMX_BOOL bStoreMetaData;
};
I set this parameter in gst-openmax, in omx_setup() in gstomx_h264enc.c, via a storeMetaDataInBuffers function, and it returned OMX_ErrorNone, so I assume it was set successfully. Now I have some questions:
1) Does setting this parameter affect the camera? After it is set, do the buffers coming from the camera contain the address of the YUV data rather than the data itself?
2) What data does the camera produce after this parameter is set, and how does it differ from the data produced before?
3) If the data is the same before and after setting the parameter, how does the OMX codec find the address of the actual data when only a few bytes are copied?
4) The camera produces data and passes it to the omx_h264enc plugin in GStreamer, and I want this to work the same way as Stagefright. So I call OMX_AllocateBuffer in g_omx_port_allocate_buffers() (gstomx_port.c) after setting StoreMetaDataInBuffersParams, but I do not get the same data as Stagefright. What should I do?
5) In gst-openmax, OMX_AllocateBuffer allocates a buffer, while OMX_UseBuffer uses a buffer passed in by another component. Do the two functions have any other differences?
Thank you very much!
BR
Aaron