DM368 - H264Dec TestApp (DVSDK 4_02_00_06, h264dec 02.00.00.13) Question

Hi,


I've been working off and on (more off than on) for some time now on a project to decode H264 video from an IP camera using a DM368, do some work on the frames, and, if required, save a JPEG.

Essentially, I want to feed the decoder a stream of NAL packets which contain slices of a frame, and I'm having trouble finding any examples or sample code to help develop this solution.

I've played around with the example applications in the DVSDK and DMAI, but I can't seem to bend them into shape to produce any sensible output; when they do produce properly decoded output, they only run at around 4 fps. I'm only trying to achieve 15 fps (the camera is set to supply 15 fps).

I have absolutely no problem decoding from a file; the problem is that the data is not coming from a file but from a live streamed source.

So, recently I've been looking at the decoder test app supplied with the codec and have a couple of questions:

1) Do I need to do everything that is done in the test application? Such as set up the resource manager?

2) There seems to be duplication of code within the test app, copied from various parts of the DVSDK - do I need to duplicate this too, or can I just use the DVSDK?

3) Any other ideas? Is there a better example?

  • Hi,

    Martin1980 said:
    1) Do I need to do everything that is done in the test application? Such as set up the resource manager?

    No, you don't need to take the whole of the code. If you are interested in NAL unit decoding, please check the code under the LOW_LATENCY macro.

    Martin1980 said:
    2) There seems to duplication of code within the test app from various parts of the DVSDK - do I need to duplicate this too, or can I just use the DVSDK?

    Can you point out which code is duplicated? From the test app, take only the processing code; creation and the other setup will already be in place in the DVSDK.

    Martin1980 said:
    3) Any other ideas? Is there a better example?

    For NAL unit decoding the test app is the only option; the DVSDK and other SDKs will not have this. Which resolution are you decoding? Can you send the complete params so we can check the 4 fps issue?
  • Thanks for the quick response :)


    1) I started looking at the LOW_LATENCY_FEATURE macro, as directed in the decoder user guide, but in the test app code I see the following, which appears contradictory:

    #ifdef LOW_LATENCY_FEATURE
        params.inputDataMode = IH264VDEC_TI_ENTIREFRAME;
        params.sliceFormat   = IH264VDEC_TI_BYTESTREAM;
    #endif //LOW_LATENCY_FEATURE

    I would expect the opposite: inputDataMode set to IH264VDEC_TI_SLICEMODE and sliceFormat set to IH264VDEC_TI_NALSTREAM.
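In other words, the expected slice-mode configuration would look something like this (a sketch only: the SLICEMODE/NALSTREAM enum names are assumed from the codec's IH264VDEC_TI_* naming scheme and the user guide's description, not verified against the codec headers):

```c
/* Hedged sketch of the expected low-latency settings; enum names assumed. */
#ifdef LOW_LATENCY_FEATURE
    params.inputDataMode = IH264VDEC_TI_SLICEMODE;  /* accept per-slice input */
    params.sliceFormat   = IH264VDEC_TI_NALSTREAM;  /* NAL units, no start codes */
#endif /* LOW_LATENCY_FEATURE */
```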

    2) The duplication, for example:

    h264vdec.{h|c}

    alg.h

    alg_{create|control|malloc}.c

    Also some files like lockmp.c, which contain functionality that I'd expect to find in the DVSDK / OSAL layer.

    I'm currently attempting to decode 480p, but the intention is to be able to decode any resolution up to 1080p.

    When you ask for params - do you mean the decoder params?

    I find the algorithm creation and control code in the test app a bit difficult to follow and understand. Would it be worth looking at how the DMAI Vdec2 implementation creates and controls the codec, or is that still a layer too high?

    Also, for what it's worth, I'm interested in using the codec with interrupts, to enable me to do other things while the codec is running. Is there anything special I have to do in order to use the decoder in interrupt mode (I'm referring to the code in testapp_arm926int.c and testapp_inthandler)? Or is it simply a case of calling HDVICPSYNC_wait()?

    Also, one other question: can you foresee any problems with me driving the H264 decoder in a manual way (allocating resources, etc.) while driving the JPEG encoder via DMAI?

    Cheers,

    Martin

  • I'm also a little confused about the various ways to create and control the codec.

    Should I interface with the codec through the Codec Engine VIDDEC2 interface? Or follow the test app example and create the algorithm directly (ALG_create, etc.)?
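For comparison, going through Codec Engine's VIDDEC2 layer is only a handful of calls. A minimal sketch, assuming an engine named "decode" and a codec registered as "h264dec" (both names are illustrative, taken from typical DVSDK configurations; the parameter values are likewise assumptions, not values from this thread):

```c
#include <ti/sdo/ce/CERuntime.h>
#include <ti/sdo/ce/Engine.h>
#include <ti/sdo/ce/video2/viddec2.h>

/* Sketch only: engine/codec names and parameter values are assumptions. */
void decoder_create_example(void)
{
    Engine_Handle  ce;
    VIDDEC2_Handle dec;
    VIDDEC2_Params params;

    CERuntime_init();                          /* once per process */
    ce = Engine_open("decode", NULL, NULL);    /* engine name from your .cfg */

    params.size              = sizeof(params);
    params.maxWidth          = 1920;           /* up to 1080p, per the post */
    params.maxHeight         = 1088;
    params.maxFrameRate      = 30000;          /* fps * 1000 */
    params.maxBitRate        = 10000000;
    params.dataEndianness    = XDM_BYTE;
    params.forceChromaFormat = XDM_YUV_420SP;  /* NV12, typical on DM368 */

    dec = VIDDEC2_create(ce, "h264dec", &params);
    /* ... VIDDEC2_control / VIDDEC2_process loop would go here, then: */
    VIDDEC2_delete(dec);
    Engine_close(ce);
}
```

This hides the ALG_create/resource-manager plumbing that the test app does by hand, at the cost of less direct control over how buffers and HDVICP resources are allocated.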