
OpenMAX MJPEG decoder

I have been trying to use the DM8168 MJPEG decoder via OpenMAX in EZSDK 5.04.

As shown in the decode_display example, it works fine for decoding a small number of frames. However, I am having trouble getting it to keep going beyond the first few frames.

It appears that it does nothing with additional output buffers passed to it after the initial setup. For example, in my program I pass it two output buffers via FillThisBuffer before starting to decode anything, and then pass more in as it gives those buffers back to me. When I send an input stream into the decoder it returns those two initial buffers filled (with the correct output), but then does not return any more.
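
For reference, the output side of my program follows the usual OpenMAX IL pattern, along the lines of the sketch below (the names here are placeholders rather than my actual code, and in the real client the returned buffer is queued and re-submitted from the client thread rather than directly inside the callback):

#include <OMX_Core.h>
#include <OMX_Component.h>

/* Sketch of the output-buffer cycle only; "hDecoder" and the buffer array
   are placeholders. */

/* Called by the IL core when the decoder has filled an output buffer. */
static OMX_ERRORTYPE FillBufferDone(OMX_HANDLETYPE hComponent,
                                    OMX_PTR pAppData,
                                    OMX_BUFFERHEADERTYPE *pBuffer)
{
  /* ... hand pBuffer->pBuffer / pBuffer->nFilledLen on for display ... */

  /* Give the buffer back to the decoder so it can fill the next frame.
     This is the step that appears to have no effect after the first
     couple of buffers. */
  return OMX_FillThisBuffer(hComponent, pBuffer);
}

/* Initial setup: queue every allocated output buffer once. */
static void prime_output_buffers(OMX_HANDLETYPE hDecoder,
                                 OMX_BUFFERHEADERTYPE **outBufs,
                                 int nOutBufs)
{
  int i;
  for (i = 0; i < nOutBufs; i++)
    OMX_FillThisBuffer(hDecoder, outBufs[i]);
}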

It is also possible to replicate this in the decode_display example program by changing the two lines
#define IL_CLIENT_DECODER_INPUT_BUFFER_COUNT   (4)
#define IL_CLIENT_DECODER_OUTPUT_BUFFER_COUNT  (8)
to
#define IL_CLIENT_DECODER_INPUT_BUFFER_COUNT   (8)
#define IL_CLIENT_DECODER_OUTPUT_BUFFER_COUNT  (4)
at the top of ilclient_utils.h.  This change sends eight buffers into the decoder with only four output buffers in total (which are passed back in once the scaler has finished with them, so the decoder is not running out of buffers).  Only four output frames are produced by the decoder, and the program then hangs waiting for the other four.

Is there some additional setup needed for the MJPEG decoder to work repeatedly which is not present in decode_display?

- Mark


Aside:  DHT blocks (Huffman tables) are required in the JPEG data.  I had to splice the default ones into the stream manually in order to make it work at all with webcam-type output, which does not include them.  This should probably be documented somewhere (all the documentation currently says on the matter is "Supports decoding of custom Huffman tables", which to my mind does not imply that custom Huffman tables are in fact mandatory).
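
For anyone hitting the same thing, the splice is along these lines (a sketch only; default_dht[] is assumed to hold the complete standard DHT segments from Annex K of the JPEG spec, which I have not reproduced here):

#include <string.h>

/* Insert the standard Huffman tables into a baseline JPEG frame that has no
   DHT segment, as webcam MJPEG output usually does not.  default_dht[] is
   assumed to contain ready-made 0xFFC4 segments.  Returns the new frame
   length, or the original length if nothing needed doing or there is no room. */
extern const unsigned char default_dht[];
extern const size_t default_dht_len;

size_t splice_default_dht(unsigned char *frame, size_t len, size_t max_len)
{
  size_t i;

  for (i = 2; i + 3 < len && frame[i] == 0xff; ) {
    unsigned char marker = frame[i + 1];
    size_t seg_len = ((size_t)frame[i + 2] << 8) | frame[i + 3];

    if (marker == 0xc4)               /* DHT already present, nothing to do */
      return len;

    if (marker == 0xda) {             /* SOS: insert the tables just before it */
      if (len + default_dht_len > max_len)
        return len;
      memmove(frame + i + default_dht_len, frame + i, len - i);
      memcpy(frame + i, default_dht, default_dht_len);
      return len + default_dht_len;
    }

    i += 2 + seg_len;                 /* skip this marker segment */
  }
  return len;
}

With the standard tables spliced in just before the SOS marker, the decoder accepts the frames.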

  • Here is a better test.  This adds an MJPEG parser to the decode_display example, so that you can feed a proper MJPEG file into it rather than just using the same JPEG file four times.  Given a file of at least twenty frames or so, it always hangs in my testing.

    - Mark

    Patch to decode_display:

    diff -ur decode_display/src/es_parser.c decode_display.mjpeg/src/es_parser.c
    --- decode_display/src/es_parser.c	2012-04-19 12:22:39.000000000 +0100
    +++ decode_display.mjpeg/src/es_parser.c	2012-08-21 16:05:31.000000000 +0100
    @@ -875,6 +875,60 @@
       return 0;
     
     }
    +
    +void Decode_MJPEGParserInit (MJPEG_ParsingCtx *ctx, void *fp)
    +{
    +  ctx->fp = fp;
    +  ctx->tmp = malloc(CHUNK_TO_READ);
    +  if(!ctx->tmp) {
    +    printf("MJPEG read buffer allocation failed\n");
    +  }
    +  ctx->tmp_len = 0;
    +}
    +
    +unsigned int Decode_GetNextMJPEGFrameSize (MJPEG_ParsingCtx *pc)
    +{
    +  int eof = 0;
    +  unsigned char *tmp = pc->tmp;
    +  size_t len = pc->tmp_len;
    +
    +  while(1) {
    +    if(len < CHUNK_TO_READ) {
    +      size_t rs = fread(tmp + len, 1, CHUNK_TO_READ - len, pc->fp);
    +      if(rs == 0)
    +        eof = 1;
    +      len += rs;
    +    }
    +
    +    int i;
    +    for(i = 0; i < len - 1; i++) {
    +      if(tmp[i] == 0xff && tmp[i+1] == 0xd9) {
    +        // EOI marker.
    +        size_t img_len = i + 2;
    +        memcpy(pc->buff_in + pc->buff_len, tmp, img_len);
    +        if(img_len < len) {
    +          memmove(tmp, tmp + img_len, len - img_len);
    +          len -= img_len;
    +        }
    +        pc->tmp_len = len;
    +        pc->buff_len = 0;
    +        return img_len;
    +      }
    +    }
    +
    +    if(eof)
    +      return 0;
    +
    +    if(len > 1) {
    +      memcpy(pc->buff_in + pc->buff_len, tmp, len - 1);
    +      pc->buff_len += len - 1;
    +      tmp[0] = tmp[len - 1];
    +      len = 1;
    +    }
    +  }
    +}
    +
    +
     /******************************************************************************\
     *      Decode_VDEC_Reset_Parser Function Declaration
     \******************************************************************************/
    diff -ur decode_display/src/es_parser.h decode_display.mjpeg/src/es_parser.h
    --- decode_display/src/es_parser.h	2012-04-19 12:22:39.000000000 +0100
    +++ decode_display.mjpeg/src/es_parser.h	2012-08-21 16:05:31.000000000 +0100
    @@ -174,6 +174,15 @@
       unsigned char *buff_in;
     } MPEG2_ParsingCtx;
     
    +typedef struct
    +{
    +  FILE *fp;
    +  unsigned char *buff_in;
    +  size_t buff_len;
    +  unsigned char *tmp;
    +  size_t tmp_len;
    +} MJPEG_ParsingCtx;
    +
     typedef MPEG4_ParsingCtx H263_ParsingCtx;
     typedef MPEG4_ParsingCtx VC1_ParsingCtx;
     
    @@ -259,6 +268,7 @@
     unsigned int Decode_GetNextMpeg4FrameSize (MPEG4_ParsingCtx * pc);
     unsigned int Decode_GetNextVC1FrameSize (VC1_ParsingCtx * pc);
     unsigned int Decode_GetNextMpeg2FrameSize (MPEG2_ParsingCtx *pc);
    +unsigned int Decode_GetNextMJPEGFrameSize (MJPEG_ParsingCtx *pc);
     
     void Decode_VDEC_Reset_Parser (void *parserPtr);
     void Decode_ParserInit (H264_ParsingCtx * pc, void *fp);
    @@ -266,6 +276,7 @@
     void Decode_Mpeg4ParserInit (MPEG4_ParsingCtx *pc, void *fp);
     void Decode_H263ParserInit (H263_ParsingCtx *pc, void *fp);
     void Decode_VC1ParserInit(VC1_ParsingCtx *ctx, void *fin);
    +void Decode_MJPEGParserInit (MJPEG_ParsingCtx *ctx, void *fp);
     
     #ifdef __cplusplus              /* matches __cplusplus construct above */
     }
    diff -ur decode_display/src/ilclient.c decode_display.mjpeg/src/ilclient.c
    --- decode_display/src/ilclient.c	2012-08-13 11:06:35.000000000 +0100
    +++ decode_display.mjpeg/src/ilclient.c	2012-08-21 16:05:20.000000000 +0100
    @@ -754,7 +754,7 @@
       }  
       else if(pAppData->codingType == OMX_VIDEO_CodingMJPEG)
       {
    -    printf(" first frame of jpeg stream will be decoded \n");
    +    Decode_MJPEGParserInit (&pAppData->pcmjpeg, pAppData->fIn);
       }  
     
       /* Initialize application / IL Client callback functions */
    diff -ur decode_display/src/ilclient.h decode_display.mjpeg/src/ilclient.h
    --- decode_display/src/ilclient.h	2012-04-19 12:22:40.000000000 +0100
    +++ decode_display.mjpeg/src/ilclient.h	2012-08-21 16:05:28.000000000 +0100
    @@ -226,6 +226,7 @@
       H263_ParsingCtx pch263;
       MPEG4_ParsingCtx pcmpeg4;
       MPEG2_ParsingCtx pcmpeg2;
    +  MJPEG_ParsingCtx pcmjpeg;
       OMX_VIDEO_CODINGTYPE  codingType;
       void *fieldBuf;
       IL_CLIENT_COMP_PRIVATE *decILComp;
    diff -ur decode_display/src/ilclient_utils.c decode_display.mjpeg/src/ilclient_utils.c
    --- decode_display/src/ilclient_utils.c	2012-04-19 12:22:40.000000000 +0100
    +++ decode_display.mjpeg/src/ilclient_utils.c	2012-08-21 16:05:23.000000000 +0100
    @@ -740,6 +740,11 @@
       pAppData->pcmpeg2.buff_in = pBuf->pBuffer;
       frameSize = Decode_GetNextMpeg2FrameSize (&pAppData->pcmpeg2);
      }
    + else if(pAppData->codingType == OMX_VIDEO_CodingMJPEG)
    + {
    +  pAppData->pcmjpeg.buff_in = pBuf->pBuffer;
    +  frameSize = Decode_GetNextMJPEGFrameSize (&pAppData->pcmjpeg);
    + }
      else {
        /* for keeping the image on display timeout */
        sleep(2);
    @@ -921,15 +926,9 @@
          }
          else if(pAppdata->codingType == OMX_VIDEO_CodingMJPEG)
          {
    -      /* pass same data to all the buffers; mjpeg decode example does not have 
    -         parser and only single image is decoded and displayed */
    -      fseek(pAppdata->fIn, 0, SEEK_END);
    -      frameSize = ftell(pAppdata->fIn);
    -      fseek(pAppdata->fIn, 0 , SEEK_SET);
    -      printf(" reading input file of size %d bytes into input buffer \n ", frameSize); 
    -      fread (decILComp->inPortParams->pInBuff[i]->pBuffer, 1, frameSize, pAppdata->fIn); 
    -      fseek(pAppdata->fIn, 0 , SEEK_SET);
    -      }
    +         pAppdata->pcmjpeg.buff_in = decILComp->inPortParams->pInBuff[i]->pBuffer;
    +         frameSize = Decode_GetNextMJPEGFrameSize (&pAppdata->pcmjpeg);
    +     }
     
         /* Exit the loop if no data available */
         if (!frameSize)
    

    An example MJPEG file (720p, 64 frames): 8130.test.mjpeg.txt

  • Mark,

    Yes, there is no MJPEG parser in the SDK, hence it is not able to decode MJPEG. Thanks for sharing the parser; we will see if we can include this in the next release.

    Regards

    Vimal

  • This is tracked in SDOCM00095981.

    Note that there was an error in the parser that was uploaded. The corrected version of Decode_GetNextMJPEGFrameSize() from es_parser.c is attached (updated 26/09/12 to handle EOF correctly).
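
    For anyone who cannot get to the attachment: the EOF problem in the posted version is that once fread() returns nothing while len is 0, the "len - 1" loop bound underflows (len is a size_t); it also looks as though a frame spanning more than one CHUNK_TO_READ read would have only the final chunk's size returned rather than the accumulated total. A sketch of one way the routine could be corrected is below (an illustration only, not the attached file):

    /* Sketch of a corrected Decode_GetNextMJPEGFrameSize(): returns the full
       accumulated frame size, including bytes carried over from earlier reads,
       and handles EOF with no pending data without letting the "len - 1" loop
       bound underflow. */
    unsigned int Decode_GetNextMJPEGFrameSize (MJPEG_ParsingCtx *pc)
    {
      int eof = 0;
      unsigned char *tmp = pc->tmp;
      size_t len = pc->tmp_len;

      while (1) {
        if (!eof && len < CHUNK_TO_READ) {
          size_t rs = fread(tmp + len, 1, CHUNK_TO_READ - len, pc->fp);
          if (rs == 0)
            eof = 1;
          len += rs;
        }

        if (len >= 2) {
          size_t i;
          for (i = 0; i < len - 1; i++) {
            if (tmp[i] == 0xff && tmp[i + 1] == 0xd9) {
              /* EOI marker: emit everything accumulated so far plus this chunk. */
              size_t img_len = i + 2;
              size_t frame_len = pc->buff_len + img_len;
              memcpy(pc->buff_in + pc->buff_len, tmp, img_len);
              memmove(tmp, tmp + img_len, len - img_len);
              pc->tmp_len = len - img_len;
              pc->buff_len = 0;
              return frame_len;
            }
          }
        }

        if (eof) {
          /* No complete frame left; drop any trailing partial data. */
          pc->tmp_len = 0;
          pc->buff_len = 0;
          return 0;
        }

        if (len > 1) {
          /* No EOI in this chunk: stash all but the last byte (it may be the
             first half of an EOI marker) and read more. */
          memcpy(pc->buff_in + pc->buff_len, tmp, len - 1);
          pc->buff_len += len - 1;
          tmp[0] = tmp[len - 1];
          len = 1;
        }
      }
    }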

  • I am also not able to play MJPEG video.

    I am using DM8148 EVM board with EZSDK 05_04_00_11.

    Is it the same issue, that the parser is not present?

    How shall I include the parser to enable MJPEG playback?

    Kindest Regards,

    Salim

  • Salim,

    The same change will work on the DM8148. You just need to merge the changes in the attached es_parser.c into the es_parser.c you have in the build.

    Alternatively try the latest 5.05 (http://software-dl.ti.com/dsps/dsps_public_sw/ezsdk/latest/index_FDS.html) which should have the changes included.

    Iain

  • Thanks Iain,

    I searched in EZSDK 05_04; es_parser.c is located under the omx demo folder.

    How can I integrate it in GStreamer, so that playbin2 can recognize the parser? Does GStreamer in EZSDK 05.05 include this modification?

    Thanks and Kindest Regards

  • There are two answers to this question.

    1. The es_parser.c fix is in EZSDK 5.05, but it only works for the OMX examples, as you have seen.

    2. gstreamer in 5.05 has another bug that prevents the gstreamer omx_mjpeg decode component from working. The fix for that is attached to this post and is tracked as defect SDOCM00097118.

    As an example, here are instructions for creating a gstreamer source (a PC) and sink (DM81x8) to do RTP streaming of an MJPEG file in an AVI container, showing how to use the accelerated gstreamer mjpeg decode component.

    The demo uses an MJPEG file encapsulated in an AVI container, which is streamed to the DM814x EVM’s IP address.

    Run the following command line on the PC:

    $ gst-launch-0.10 filesrc location=/home/ibc/tekken_mjpeg_avidemx.avi ! avidemux name=demux demux.video_00 ! rtpjpegpay pt=96 ! udpsink host=137.167.88.34 port=5000 -v

    137.167.88.34 needs to be replaced by the IP address of the DM814x EVM on your network.

    The -v flag logs some extra debug info that may be required to correctly describe the caps (capabilities) of the stream to the decoder. The log line of interest is shown below:

    Pipeline is PREROLLING ...

    /GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)96, ssrc=(uint)2006761866, clock-base=(uint)2211117719, seqnum-base=(uint)46792 

    On the target, run the following command line, with the caps string copied from the output listed by the server:

    # gst-launch udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)96, ssrc=(uint) 2006761866, clock-base=(uint) 2211117719,seqnum-base=(uint) 46792" ! rtpjpegdepay ! queue ! jpegparse ! omx_mjpegdec ! omx_scaler ! omx_ctrl display-mode=OMX_DC_MODE_1080P_60 ! gstperf ! omx_videosink

    sdocm00097118.zip
  • Thanks Iain,

    The patch works with EZSDK 05.05 (it does not work with EZSDK 05.04).

    I tried the following:

    gst-launch filesrc location=<file> ! avidemux ! multiqueue ! omx_mjpegdec ! queue ! ffmpegcolorspace ! omx_scaler ! v4l2sink

    It works.

    However, the decoder is not recognized inside decodebin, which is why the clip cannot be played via the playbin2 approach.

    I need to get it working via the playbin2 approach. Any suggestions?

    Kindest Regards,

    Salim

  • Salim,

    As far as I am aware there is no support for the TI omx plugins in decodebin or playbin2. I've only ever seen pipelines described explicitly, and I'm not aware of any plans to add this.

    So I'm afraid I have no suggestions for you.

    Iain

  • Iain,

    Other omx decoders like omx_h264dec etc. work well with decodebin and playbin2. I have used them in my application.

    Rgds,

    Salim