Port Settings changed from H.264 Decoder on OMAP4

We are working on a player application developed in C and controlled through a Java application. We provide a Surface (native window) as the source of buffers. When we used the TI H.264 decoder through IOMX, we observed the port settings changed event being raised by the H.264 decoder multiple times. In relation to this, I had a few questions.

1. Is there a way to reduce the number of times the port settings change call is made by the decoder?

2. Also, we are providing buffers obtained through the Surface (native window) to the decoder for outputting the decoded data. Whenever a port settings changed event is seen, we need to release the buffers given to the decoder component and provide a new set of buffers. Is there a way to release these Surface (native window) buffers apart from removing and re-adding the SurfaceView in the Java layer? Is there no way of releasing the buffers from native C code itself?

Any help is highly appreciated.

  • Hi Pavan,

    Could you please let me know which software release (e.g., ICS or GB Android release) and hardware platform (e.g., Blaze or Blaze Tablet) you are using for the development of your player application?

    Thanks & Best Regards,

    Venkat

  • Hi Pavan,

    Please see my answers for your queries below.

    1. Is there a way to reduce the number of times the port settings change call is made by the decoder?

    Answer: Port Settings Changed Event can occur due to the following parameter changes.

    a. Buffer Reallocation. This could be due to a change in resolution, buffer count, or padded width/height.

    b. Change in Stream Resolution

    c. Scale Index - OMX_IndexConfigCommonScale

    d. Crop Params - OMX_IndexConfigCommonOutputCrop

    e. Interlaced - OMX_TI_IndexConfigStreamInterlaceFormats

    2. Also, we are providing buffers obtained through the Surface (native window) to the decoder for outputting the decoded data. Whenever a port settings changed event is seen, we need to release the buffers given to the decoder component and provide a new set of buffers. Is there a way to release these Surface (native window) buffers apart from removing and re-adding the SurfaceView in the Java layer? Is there no way of releasing the buffers from native C code itself?

    Answer: Releasing buffers is usually done using the following sequence of steps:

    a. OMX_FreeBuffer

    b. mNativeWindow->cancelBuffer
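
    A minimal sketch of that two-step release sequence, with the OMX and native-window calls replaced by stubs so it compiles standalone (the real signatures come from OMX_Core.h and the ANativeWindow API; the buffer count here is a hypothetical nBufferCountActual):

    ```c
    #include <stdio.h>

    /* Stand-ins for the real OMX and native-window calls */
    typedef int OMX_ERRORTYPE;
    #define OMX_ErrorNone 0
    #define OUTPUT_PORT 1
    #define NUM_OUT_BUFFERS 3   /* hypothetical nBufferCountActual */

    static OMX_ERRORTYPE OMX_FreeBuffer(void *comp, int port, void *hdr)
    { (void)comp; (void)port; (void)hdr; return OMX_ErrorNone; }
    static int cancelBuffer(void *win, void *gfxBuf)
    { (void)win; (void)gfxBuf; return 0; }

    int main(void)
    {
        void *component = 0, *window = 0;
        void *headers[NUM_OUT_BUFFERS] = {0};
        void *graphicBuffers[NUM_OUT_BUFFERS] = {0};

        /* Step a: free every OMX buffer header on the output port */
        for (int i = 0; i < NUM_OUT_BUFFERS; i++)
            if (OMX_FreeBuffer(component, OUTPUT_PORT, headers[i]) != OMX_ErrorNone)
                printf("OMX_FreeBuffer failed for buffer %d\n", i);

        /* Step b: hand the dequeued graphic buffers back to the native window */
        for (int i = 0; i < NUM_OUT_BUFFERS; i++)
            if (cancelBuffer(window, graphicBuffers[i]) != 0)
                printf("cancelBuffer failed for buffer %d\n", i);

        printf("released %d buffers\n", NUM_OUT_BUFFERS);
        return 0;
    }
    ```

    Note that, as discussed later in this thread, cancelBuffer only returns the buffer to the window's queue; the OMX_FreeBuffer step is what is expected to trigger the native free.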

    Please let me know if you need any further clarification.

    Thanks & Best Regards,

    Venkat

  • Thanks for replying. Please find the details below.

    I am working on ICS L27.IS.2.M1 version of baseport on Blaze Tablet. I am making decoder calls through IOMX interface.

    With respect to port settings changed, please find the details below.

    Initially, I configure the input port of the decoder to a particular resolution. Configuring the output port for the corresponding padded resolution results in an error. However, there is no error if the output port is configured to the same resolution as the input.

    Once the first frame is given for decode, I notice that one of the port settings changed events is for the padded width and height. If the input stream resolution is different from the init resolution, I notice another port settings changed event. The second one seems logical, but not the first. Shouldn't there be a way to avoid the port settings change for the padded height and width? I think that if the output port could be configured for the padded height and width in the first go, this would avoid the first port settings changed event.

    Additionally, when a port settings changed event is received asking for a particular number of output buffers, I am not able to make a SetParameter call to change the number of output buffers to a value greater than what was asked for. The SetParameter call on the output port after a port settings change results in an error.

    With respect to mNativeWindow->cancelBuffer, my assumption is that this does not actually free the buffers. It just results in the buffer not being displayed; the buffer is put back into the display queue, and in future I might get this buffer back again when a dequeue is done. Is there no way of releasing/freeing the buffers natively?

  • Hi Pavan,

    Yes, you are expected to configure the decoder to a particular resolution, and upon receiving the PortSettingsChanged event you are supposed to handle buffer reallocation as per the padded resolution requested by the decoder.

    I think your understanding of cancelBuffer is correct, and I will get back to you on the way of releasing or freeing the buffers natively.

    Please note that handling of the PortSettingsChanged event in ICS requires a particular sequence. Ensure that you follow the guidelines mentioned below.

    In summary, it should have the following sequence of steps:

    1. Flush the buffers

    2. Port Disable

    3. Free OMX buffers

    4. Cancel Native Window Buffers

    5. Port Enable

    6. Allocate Buffers From ANativeWindow

    7. Start resending the buffers using FillThisBuffer

    Port Reconfiguration aiding Buffer Reallocation

    Whenever there is a Port Reconfig with nData2 set to 0, it requires buffer reallocation. This could be due to a change in resolution, buffer count, or padded width/height.

    In case of H.264 decode, the component returns nActualCount for the output port as 1. In the default behavior with ICS, Port Reconfiguration for nActualCount and padded width and height is enabled by DOMX during component init through the index OMX_TI_IndexParamUseEnhancedPortReconfig.


    The following exchange between the client and the component will facilitate buffer reallocation:

    [client Loaded->Idle] 
     Call SetParameter on OMX_IndexParamPortDefinition for output port
     Update new format.video.nFrameHeight, format.video.nFrameWidth, nBufferCountActual.
     Call GetConfig on OMX_IndexConfigCommonOutputCrop to get default crop
    
    [client]In Executing state, Client calls Empty this buffer
    [component]Port reconfig call back
    
    [client] 
     Send Command: Port disable
     Free Any allocated buffers
    [component]Event CB for Port Disable
    
    [client] 
     Call GetParameter on OMX_IndexParamPortDefinition for output port
     Update new format.video.nFrameHeight, format.video.nFrameWidth, nBufferCountActual.
     This will correspond to Actual Buffer requirements 
     Call GetConfig on OMX_IndexConfigCommonOutputCrop to get default crop
    
     Send Command: Port Enable
     Allocate Buffers as per new requirements and call Use/Allocate Buffer
    
    [component]Event CB for Port Enable
    

    In case of H.264 decode for a QCIF [176x144] stream with 1 reference frame, the values of the PortDefinition structure are:

    • Before Port Reconfig
      OMX-OPD::format.video.nFrameWidth 176
      OMX-OPD::format.video.nFrameHeight 144
      OMX-OPD::format.video.nStride 256
      OMX-OPD::nBufferSize 92160
      OMX-OPD::nBufferCountMin 1
      OMX-OPD::nBufferCountActual 1
    

    • After Port Reconfig
      OMX-OPD::format.video.nFrameWidth 256
      OMX-OPD::format.video.nFrameHeight 240
      OMX-OPD::nBufferSize 92160
      OMX-OPD::nBufferCountMin 3
      OMX-OPD::nBufferCountActual 3
    

    Port Reconfiguration: Change in Stream Resolution

    Same as the usual Port Reconfiguration. Component limitation: a change in crop mid-stream will not explicitly trigger a Port Reconfiguration. The client should call GetConfig with OMX_IndexConfigCommonOutputCrop during Port Reconfiguration for buffer allocation.

    Port Reconfiguration: other categories

    When nData2 is not set to 0, it has the following interpretations. The client has to use this information for rendering, scaling, etc.

    1. Scale Index - OMX_IndexConfigCommonScale
    2. Crop Params - OMX_IndexConfigCommonOutputCrop
    3. Interlaced - OMX_TI_IndexConfigStreamInterlaceFormats

    Illustration of a change in crop dimensions due to a padding requirement:

    [component] Port settings change: nData2 = OMX_IndexConfigCommonOutputCrop
    [client] Query GetConfig for OMX_IndexConfigCommonOutputCrop. Use this for configuring dsscomp/file write
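
    As a rough illustration of that client-side step, here is a compile-and-run sketch in which OMX_GetConfig and OMX_CONFIG_RECTTYPE are replaced by minimal stand-ins (the real definitions are in the OpenMAX IL headers, and the crop values below are hypothetical, e.g. a QCIF stream padded out to 256x240):

    ```c
    #include <stdio.h>
    #include <string.h>

    /* Minimal stand-in for OMX_CONFIG_RECTTYPE */
    typedef struct { unsigned nPortIndex; int nLeft, nTop; unsigned nWidth, nHeight; } CropRect;

    /* Stub for GetConfig(OMX_IndexConfigCommonOutputCrop); values are hypothetical */
    static int getOutputCrop(CropRect *rect)
    {
        rect->nLeft = 32; rect->nTop = 24;
        rect->nWidth = 176; rect->nHeight = 144;
        return 0;
    }

    int main(void)
    {
        CropRect crop;
        memset(&crop, 0, sizeof crop);
        crop.nPortIndex = 1;             /* output port */
        if (getOutputCrop(&crop) == 0) {
            /* only the crop region should be handed to dsscomp / file write */
            printf("crop %ux%u at (%d,%d)\n", crop.nWidth, crop.nHeight, crop.nLeft, crop.nTop);
        }
        return 0;
    }
    ```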
    

    Thanks & Best Regards,

    Venkat

    switch (eEvent) {
        case OMX_EventPortSettingsChanged:
          SYSLOG(LOG_DEBUG | LOG_H264D,"Component OMX_EventPortSettingsChanged port:%d index:0x%x %s"
                 ,(int)nData1,(int)nData2,OMX_Index2char((OMX_INDEXTYPE) nData2));

          if (nData1 != 1) {
            SYSLOG(LOG_DEBUG | LOG_H264D,"OMX_EventPortSettingsChanged on none output port:%d",(int)nData1);
            break;
          }

          // nData2 == 0 means buffer reallocation is required; a nonzero
          // nData2 carries the index describing a crop/scale/interlace change
          if (nData2 == OMX_IndexConfigCommonOutputCrop)
          {
            SYSLOG(LOG_DEBUG | LOG_H264D, "OMX_IndexConfigCommonOutputCrop, setting updateCrop");
          }
          else if (nData2 == OMX_IndexConfigCommonScale)
          {
            SYSLOG(LOG_DEBUG | LOG_H264D, "OMX_IndexConfigCommonScale, setting updateScale");
          }
          else if (nData2 == OMX_TI_IndexConfigStreamInterlaceFormats)
          {
            SYSLOG(LOG_DEBUG | LOG_H264D, "OMX_TI_IndexConfigStreamInterlaceFormats, setting updateInterlace");
          }
          else if (nData2 == 0) {

            SYSLOG(LOG_DEBUG | LOG_H264D,"RECONFIGURING on change INDEX 0 -- flushing buffers first");

            processing = 0;
            reconfiguring = 1;

            // step 1 of the reconfiguration sequence: flush the output port;
            // the port disable is issued from the OMX_CommandFlush completion handler
            err = OMX_SendCommand(videoDecOmxComponent, OMX_CommandFlush, 1, NULL);
            if (err != OMX_ErrorNone) {
              SYSLOG(LOG_ERR,"Error in SendCommand()-OMX_CommandFlush");
            }
          }
          else {
            SYSLOG(LOG_ERR | LOG_H264D, "unknown OUTPUT PORT change event index:0x%x",(int)nData2);
          }
          break;
        case OMX_EventResourcesAcquired:
          SYSLOG(LOG_DEBUG | LOG_H264D,"Component OMX_EventResourcesAcquired");
          break;
        case OMX_EventBufferFlag:
          SYSLOG(LOG_DEBUG | LOG_H264D,"Component OMX_EventBufferFlag");
          break;
        case OMX_EventCmdComplete:
          if (nData1 == OMX_CommandPortDisable) {
            if (nData2 == OMX_DirInput) {
              SYSLOG(LOG_DEBUG | LOG_H264D, "Component OMX_EventCmdComplete OMX_CommandPortDisable OMX_DirInput");
            }
            if (nData2 == OMX_DirOutput) {
              SYSLOG(LOG_DEBUG | LOG_H264D, "Component OMX_EventCmdComplete OMX_CommandPortDisable OMX_DirOutput");

              if (reconfiguring == 1) {
                SYSLOG(LOG_ERR,"Output port disabled while in reconfigure mode");

                // initiating re-enable (won't  be done until buffers allocated)
                SYSLOG(LOG_ERR,"calling OMX_SendCommand PortEnable");
                err = OMX_SendCommand(videoDecOmxComponent, OMX_CommandPortEnable, 1, NULL);
                if (err != OMX_ErrorNone) {
                  SYSLOG(LOG_ERR,"Error in SendCommand()-OMX_CommandPortEnable:");
                }

                // allocate new output buffers again
                allocateOutputBuffersFromNativeWindow();
              }
            }
          }
          else if (nData1 == OMX_CommandStateSet) {
            Mutex::Autolock lock(stateChangeMutex);
            mState = (OMX_STATETYPE)nData2;
            if (waitForStateChange) {
              waitForStateChange = 0;
              stateWait.signal();
            }
          }
          else if (nData1 == OMX_CommandFlush) {
            SYSLOG(LOG_DEBUG | LOG_H264D, "Component OMX_EventCmdComplete OMX_CommandFlush port:%d\n",(int)nData2);

            if (reconfiguring == 1) {
              err = OMX_SendCommand(videoDecOmxComponent, OMX_CommandPortDisable, 1, NULL);
              if (err != OMX_ErrorNone) {
                SYSLOG(LOG_ERR,"Error in SendCommand()-OMX_CommandPortDisable:");
              }

              // free omx buffers
              for (int i=0; i <(int)tOutPortDef.nBufferCountActual; i++) {
                SYSLOG(LOG_ERR,"trying to free output port buffer %d %p",i,pOutputBufferHeaders[i]);
                if (pOutputBufferHeaders[i]) {
                  //            err = OMX_FreeBuffer(pHandle,OUTPUT_PORT,pOutputBufferHeaders[i]);
                  err = OMX_FreeBuffer(videoDecOmxComponent,OUTPUT_PORT,pOutputBufferHeaders[i]);
                  if( (err != OMX_ErrorNone)) {
                    SYSLOG(LOG_ERR,"Free Buffer for Output Port buffer:%d failed:%s",i,OMX_Error2Str(err));
                  }
                  else {
                    SYSLOG(LOG_ERR,"Free Buffer for Output Port buffer:%d done",i);
                  }
                }
              }

              // cancel native window buffers
              int retVal;
              for (int i=0; i<(int)tOutPortDef.nBufferCountActual; i++) {
                SYSLOG(LOG_ERR,"trying to cancel output port native buffer %d",i);
                retVal = mNativeWindow->cancelBuffer( mNativeWindow.get(), decoderDataBlock[i].graphicBuffer.get());
                if (retVal != 0) {
                  SYSLOG(LOG_ERR,"cancelBuffer failed w/ error 0x%08x", retVal);
                }
                else {
                  SYSLOG(LOG_ERR,"cancelBuffer for Output Port buffer:%d done",i);
                }
              }
            }
          }
          else if (nData1 == OMX_CommandPortEnable) {
            SYSLOG(LOG_DEBUG | LOG_H264D, "Component OMX_EventCmdComplete OMX_CommandPortEnable port:%d\n",(int)nData2);

            if (reconfiguring == 1) {
              SYSLOG(LOG_DEBUG | LOG_H264D, "PORT ENABLE completed while in reconfigure mode");

              reconfiguring = 0;
              processing = 1;  
              mBufferOffset = 0; // force cropping again

              // probably need to kick start the omx again by resending buffers
              OMX_BUFFERHEADERTYPE* header;
              for (int i=0; i<(int)tOutPortDef.nBufferCountActual; i++) {
                header = pOutputBufferHeaders[i];

                header->nFilledLen = 0;
                header->nOffset = 0;
                header->nFlags = 0;

               if (i < (int)(tOutPortDef.nBufferCountActual - 2)) {
                  err = OMX_FillThisBuffer(videoDecOmxComponent, header);
                  if (err != OMX_ErrorNone) {
                    SYSLOG(LOG_ERR,"OMX_FillThisBuffer failed:%s\n",OMX_Error2Str(err));
                  }
                  else {
                    SYSLOG(LOG_DEBUG | LOG_H264D,"send fill this buffer done:%d %p %p\n",i,header,header->pBuffer);
                  }
                }
                else {
                  Mutex::Autolock autoLock(outputBufferQueueMutex);
                  outputBufferQueue.push_back(header);
                }
              }
            }
          }
          else {
            SYSLOG(LOG_DEBUG | LOG_H264D, "Component OMX_EventCmdComplete command:%d\n",(int)nData1);
          }
          break;

        case OMX_EventError:
          errorType = (OMX_ERRORTYPE) nData1;
          SYSLOG(LOG_DEBUG | LOG_H264D, "Component OMX_EventError error:%s",OMX_Error2Str(errorType));
          break;

        case OMX_EventMax:
          SYSLOG(LOG_DEBUG | LOG_H264D, "Component OMX_EventMax");
          break;

        case OMX_EventMark:
          SYSLOG(LOG_DEBUG | LOG_H264D, "Component OMX_EventMark");
          break;

        default:
          break;
      }

  • Thanks Venkat.

    I am actually following the same sequence of steps to handle port reconfiguration. However I wanted some clarity on these issues:

    Issue 1
    [client Loaded->Idle]

    SET Parameter - OMX_IndexParamPortDefinition for input port(format.video.nFrameHeight, format.video.nFrameWidth, nBufferCountActual)
    Results in Success
    SET Parameter - OMX_IndexParamPortDefinition for output port(format.video.nFrameHeight, format.video.nFrameWidth, nBufferCountActual)
    Results in Success

    where as

    SET Parameter - OMX_IndexParamPortDefinition for input port(format.video.nFrameHeight, format.video.nFrameWidth, nBufferCountActual)
    Results in Success
    SET Parameter - OMX_IndexParamPortDefinition for output port(format.video.nFrameHeight + 96, (format.video.nFrameWidth + (2 * 32)+ 127) & 0xFFFFFF80, nBufferCountActual)
    Results in Error

    Why does configuring the output port with the padded height and width result in an error? If this configuration were accepted, then the need for at least one port reconfiguration call would have been removed, right?
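
    For reference, the padding arithmetic in the SetParameter call above can be checked in isolation. This sketch simply evaluates the two expressions; for QCIF 176x144 they yield 256x240, which matches the post-reconfig PortDefinition values listed earlier in the thread:

    ```c
    #include <stdio.h>

    /* Helpers reproducing the padding formula used in this thread:
     * height is padded by 96 lines, width by 2*32 columns, and the width
     * is then rounded up to the next multiple of 128 (the & 0xFFFFFF80 mask). */
    static unsigned padded_width(unsigned w)  { return (w + (2 * 32) + 127) & 0xFFFFFF80; }
    static unsigned padded_height(unsigned h) { return h + 96; }

    int main(void)
    {
        /* QCIF 176x144, the example used elsewhere in the thread */
        printf("padded: %u x %u\n", padded_width(176), padded_height(144));
        return 0;
    }
    ```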

    Issue 2

    When port-reconfig is in progress.

    GET Parameter - OMX_IndexParamPortDefinition for output port(format.video.nFrameHeight, format.video.nFrameWidth, nBufferCountActual)
    Results in Success

    nBufferCountActual += 4
    SET Parameter - OMX_IndexParamPortDefinition for output port(format.video.nFrameHeight, format.video.nFrameWidth, nBufferCountActual)
    Results in Error

    Actually, I would like to allocate more buffers than the decoder's minimal requirement, but this failure during port reconfiguration is not allowing me to do so. Is there a workaround for this problem? I see that the SetParameter call does not fail if I access the decoder through the OMX interface directly instead of the IOMX interface. This seems peculiar to usage through IOMX.
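
    The failing sequence can be sketched as follows, with the IOMX calls replaced by stubs whose names and rejection behavior are hypothetical, merely mimicking what is reported above (GetParameter succeeds, SetParameter with a raised count is rejected):

    ```c
    #include <stdio.h>

    /* Stand-in for the output-port definition; real type is OMX_PARAM_PORTDEFINITIONTYPE */
    typedef struct { unsigned nBufferCountActual; } PortDef;

    /* Decoder reports it needs 3 output buffers after reconfig */
    static int getParameter(PortDef *def)
    { def->nBufferCountActual = 3; return 0; }

    /* Stub mimicking the reported IOMX-path behavior: a raised count is rejected */
    static int setParameter(const PortDef *def)
    { return def->nBufferCountActual > 3 ? -1 : 0; }

    int main(void)
    {
        PortDef def;
        getParameter(&def);             /* succeeds: nBufferCountActual = 3 */
        def.nBufferCountActual += 4;    /* client wants extra display buffers */
        if (setParameter(&def) != 0)
            printf("SetParameter rejected nBufferCountActual=%u\n", def.nBufferCountActual);
        return 0;
    }
    ```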

  • Hi Pavan,

    Regarding the Issue 1 failure, as per my understanding, we generally set the output port width and height parameters without any padding during the initial decoder configuration, since we do not know exactly how much padding will be required or what the exact resolution of the input encoded stream is.

    Only upon the port reconfiguration event, within the allocate-buffers API, can you query the expected padded width and height using the GetParameter API and set them accordingly using the SetParameter API.

    Coming to Issue 2, this looks like quite strange behavior with the IOMX interface. Let me look further into it and get back to you later.

    Thanks & Best Regards,

    Venkat 

  • Hi Venkat,

    Thanks for the reply. Frankly speaking, I am not that concerned about the number of times the port settings changed event is raised. My main concern is the display buffers getting leaked every time a port settings changed event occurs, as I currently don't know how to release the display buffers (mNativeWindow->cancelBuffer doesn't free the buffer). If I can figure out a way of releasing (freeing) the display buffers being used by the decoder, any number of port settings changed events is fine with us.

    It would be very helpful if you could find out and let us know how the display buffers can be freed up natively.

    By the way, thanks a ton for your support.

    Regards,

    Pavan D

  • Hi Venkat,

    Do you have any updates regarding my previous post?

    Regards,

    Pavan D

  • Hi Pavan,

    I was checking internally regarding the freeing of display buffers natively. As per my understanding, OMX_FreeBuffer calls should internally invoke the native calls to free the display buffers. Are you calling OMX_FreeBuffer during your port reconfiguration handling?

    I will continue to cross check further on this while waiting for your response.

    Thanks & Best Regards,

    Venkat

  • Hi Venkat,

    We are calling the IOMX equivalent of OMX_FreeBuffer (mOMX->freeBuffer, where mOMX is the IOMX object) every time a port settings changed event occurs; the output port disable would not succeed otherwise. In addition, we are also calling cancelBuffer on the native window buffers.

    Also, is there a sure-shot way to confirm whether the display buffers have been freed?

    Regards,

    Pavan D

     

  • Hi Pavan,

    In ICS, the display buffers are Tiler buffers, and if the logs are enabled you should see output such as the following from the graphics binaries in the logcat output for allocation and freeing of buffers. Can you share your logcat output for further analysis?

    V/gralloc-tiler( 1561): reg: state 0
    I/gralloc-tiler( 1558): NV12 alloc: 672 x 528 (vsize = 3244032) offset da7c000
    V/gralloc-tiler( 1558): alloc: state 0

    I/gralloc-tiler( 1558): NV12 free: 672 x 528 (vptr = 0x42337000) offset da7b000
    V/gralloc-tiler( 1558): free: state 0

    Thanks & Best Regards,

    Venkat

  • Hi Venkat,

    I am not seeing these prints on my system. Are these prints always enabled, or should I enable them somewhere? Also, I don't have the source for gralloc.omap4.so.

    Regards,

    Pavan D

  • Hi Pavan,

    These prints might be suppressed as well. The SGX code is generally shared with customers through CDDS; you need to check with your TI representative for access to these sources.

    I will continue to find some clues for your queries and post you an update later.

    Thanks & Best Regards,

    Venkat

  • Hi Pavan,

    Could you mark the verified answer for this post so that it can be closed accordingly, as your other query is currently being pursued in the other E2E post: http://e2e.ti.com/support/omap/f/849/t/220759.aspx

    Thanks & Best Regards,

    Venkat