OMX VENC dynamic resolution change



Hi,

I have read all the OMX docs, including the H264 Encoder User Guide and cannot find a solution to this.

Is there a way to get the OMX VENC component to handle changes to the size of the input video frames dynamically?

The input/output port configuration cannot be changed once the component is in an executing state.
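
For reference, this is roughly the static setup that gets locked in before the component moves to Executing (a sketch using the standard OMX IL port-definition structure; the input-port macro and the 1080p values are just for illustration):

  OMX_PARAM_PORTDEFINITIONTYPE tPortDef;

  OMX_INIT_PARAM (&tPortDef);
  tPortDef.nPortIndex = OMX_VIDENC_INPUT_PORT;   /* encoder input port index (illustrative) */
  OMX_GetParameter (pHandle, OMX_IndexParamPortDefinition, &tPortDef);

  /* These fields cannot simply be rewritten while the component is Executing */
  tPortDef.format.video.nFrameWidth  = 1920;
  tPortDef.format.video.nFrameHeight = 1080;
  tPortDef.format.video.nStride      = 1920;
  OMX_SetParameter (pHandle, OMX_IndexParamPortDefinition, &tPortDef);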

Input video is captured at 1080p, so the VENC buffers and input/output ports are configured for 1080p, and this works fine. However, the input may then be re-scaled to 720p or D1 by the DEI component. When those re-scaled frames are fed to the VENC component, the encoded video comes out corrupted.

I have tried setting the VENC dynamic params as follows, but this does not work:

  tDynParams.videoDynamicParams.h264EncDynamicParams.videnc2DynamicParams.inputWidth = 1280;
  tDynParams.videoDynamicParams.h264EncDynamicParams.videnc2DynamicParams.inputHeight = 720;
  tDynParams.videoDynamicParams.h264EncDynamicParams.videnc2DynamicParams.captureWidth = 1280;

Is there any way to get the VENC to handle size changes? I have access to the overlay code if any changes are needed there.

Thanks,

Steven

  • This is what the image looks like when I change the size.

    Any ideas what the problem is?

    What stride value should I be using? (See the pitch/size sketch at the end of this post.)

    Steven
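
    For reference, here is the pitch and buffer-size arithmetic for an NV12 (YUV420SP) encoder input, which is what the 3/2 factor later in this thread implies. This is only a sketch: the helper function is hypothetical and it assumes no extra row padding (if the SDK aligns rows, the padded width must be used as the pitch instead).

      #include <stdio.h>

      /* Hypothetical helper: luma pitch and total buffer size for an NV12
         frame, assuming the pitch equals the luma width (no row padding). */
      static void print_nv12_geometry (int width, int height)
      {
        int pitch = width;                   /* bytes per luma row         */
        int size  = width * height * 3 / 2;  /* luma + interleaved chroma  */
        printf ("%4dx%-4d  pitch = %4d  size = %d\n", width, height, pitch, size);
      }

      int main (void)
      {
        print_nv12_geometry (1920, 1080);  /* capture resolution          */
        print_nv12_geometry (1280,  720);  /* DEI-rescaled encoder input  */
        print_nv12_geometry ( 720,  576);  /* D1 (PAL)                    */
        return 0;
      }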

  • Hi,

    Can someone from TI please comment? We need urgent support on this.

    Thanks,

    Steven

  • Hi,

    Can someone at TI URGENTLY respond to this, please? I need support on this.

    Thanks,

    Steven

  • Here is the function I am using that does not work:

    OMX_ERRORTYPE IL_ClientUpdateVENCResolution (IL_Client *pAppData, int width, int height)
    {
      OMX_ERRORTYPE eError = OMX_ErrorNone;
      OMX_HANDLETYPE pHandle = pAppData->pEncHandle;
      OMX_VIDEO_CONFIG_DYNAMICPARAMS tDynParams;
      OMX_CONFIG_VIDCHANNEL_RESOLUTION chResolutionNew;

      int captureWidth  = pAppData->nWidth;
      int captureHeight = pAppData->nHeight;

      /* Reconfigure the DEI output channel: Frm0 stays at the capture size,
         Frm1 is set to the new encoder input size. */
      OMX_INIT_PARAM (&chResolutionNew);
      chResolutionNew.Frm0Width = captureWidth;
      chResolutionNew.Frm0Height = captureHeight;
      chResolutionNew.Frm0Pitch = chResolutionNew.Frm0Width * 2;
      chResolutionNew.Frm1Width = width;
      chResolutionNew.Frm1Height = height;
      chResolutionNew.Frm1Pitch = width;
      chResolutionNew.FrmStartX = 0;
      chResolutionNew.FrmStartY = 0;
      chResolutionNew.FrmCropWidth = 0;
      chResolutionNew.FrmCropHeight = 0;
      chResolutionNew.eDir = OMX_DirOutput;
      chResolutionNew.nChId = 0;

      eError = OMX_SetConfig (pAppData->pDeiHandle,
                              (OMX_INDEXTYPE) OMX_TI_IndexConfigVidChResolution,
                              &chResolutionNew);
      if (eError != OMX_ErrorNone)
      {
        ERROR ("FAILED TO RECONFIGURE DEI -> VENC SCALE\n");
      }
      else
      {
        printf ("RECONFIGURED DEI OUTPUT TO VENC\n");
      }

      /* Fetch the current encoder dynamic params, then override the input
         dimensions to match the new DEI output. */
      OMX_INIT_PARAM (&tDynParams);
      tDynParams.nPortIndex = OMX_VIDENC_OUTPUT_PORT;

      eError = OMX_GetParameter (pHandle,
                                 (OMX_INDEXTYPE) OMX_TI_IndexParamVideoDynamicParams,
                                 &tDynParams);
      if (eError != OMX_ErrorNone)
      {
        NetLogf (pLog, "DYNAMIC PARAMS Request Error %i", eError);
      }

      tDynParams.videoDynamicParams.h264EncDynamicParams.videnc2DynamicParams.inputWidth = width;
      tDynParams.videoDynamicParams.h264EncDynamicParams.videnc2DynamicParams.inputHeight = height;
      tDynParams.videoDynamicParams.h264EncDynamicParams.videnc2DynamicParams.captureWidth = width;

      eError = OMX_SetConfig (pHandle,
                              (OMX_INDEXTYPE) OMX_TI_IndexConfigVideoDynamicParams,
                              &tDynParams);
      if (eError != OMX_ErrorNone)
      {
        NetLogf (pLog, "DYNAMIC PARAMS Update Error %i", eError);
      }
      else
      {
        printf ("VENC DYNAMIC PARAMS UPDATED\n");
      }

      return eError;
    }

  • Hi Steven,

    With the same function, I am able to change the encoder resolution and it takes effect.

    Are you using EZSDK 05.05.02.00? I was able to do this with that version.

    Thanks

    Ram

  • Hi Ram,

    Yes, I am using 05.05.02.00.

    Are you using the capture-encode sample?

    Can you upload your code? If not, I can give you an email address to send it to.

    Thanks,

    Steven

    Please share your email address.

    Ram

  • Hi Ram,

    My email address is steven dot harris at digitalbarriers dot com

    Thanks,

    Steven

  • Hi Ram,

    Having run the code you provided, I found the following lines in IL_ClientProcessPipeCmdETB:

      /* For a proper chroma offset setting */
      if(gEncILComp == thisComp)
      {
        pBufferIn->nFilledLen = org_size;
      }

    org_size is being set during initialisation as follows:

      org_size = pAppData->nWidth * pAppData->nHeight * 3 / 2;

    Using this along with my function produces the correct image when the VENC input size is changed; the combination is sketched at the end of this post.

    Thanks, Steven
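
    To summarise the combination that works (a sketch only: IL_ClientProcessPipeCmdETB, gEncILComp, thisComp, pBufferIn, pAppData and org_size are from the EZSDK sample quoted above, and the 1280x720 target is just an example):

      /* 1. Reconfigure the DEI output channel and the encoder dynamic
            params with the function posted earlier in this thread. */
      IL_ClientUpdateVENCResolution (pAppData, 1280, 720);

      /* 2. In IL_ClientProcessPipeCmdETB, buffers going to the encoder keep
            the full capture-size NV12 length (org_size was computed from the
            1080p capture dimensions at init), per the sample's chroma-offset
            handling. */
      if (gEncILComp == thisComp)
      {
        pBufferIn->nFilledLen = org_size;
      }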