
DM8168 Hybrid DVR

Hi all,

We recently purchased a UDWorks DM8168 Hybrid DVR with dvrrdk_v03.50.00.08. It is called hybrid, but I could not find a use case that runs the NVR and DVR functions at the same time. Am I right?

If so, what would you suggest for building, e.g., an 8 ch NVR + 8 ch DVR in one usecase? Should I use the Link API? Or can I combine existing use cases, for example 32D1_Decode_Display and 16D1_DVR?

Best Regards.

  • The usecase used by the UDDVR GUI application is:

    16 ch D1 capture + 16 ch D1 @ 30 fps H264 encode + 16 ch CIF @ 30 fps H264 encode + 16 ch D1 @ 1fps MJPEG encode

    16 ch D1 H264 decode

    2 independent HD displays + 1 SD display.

    As you can see, the usecase already supports up to 16 ch decode.

    The 16 channel decode can be either playback from HDD or playback from network.

    So the usecase itself supports a DVR+NVR configuration, in the sense that you can do simultaneous encode and decode.

    You will have to develop an application that has DVR+NVR functionality, but it can use the same usecase.

  • Dear Badri,

    Thank you for the valuable information.

    In that case, what do you suggest for understanding the whole flow of this usecase? I am planning to go step by step through dvr_rdk/demos/mcfw_api_demos/mcfw_demo to understand the usecases. Another option is to analyze dvr_rdk/dvrapp/app. I think the demo is more fundamental, but the app looks more application specific.

    It seems I should develop my own dvr_rdk/dvrapp/app using mcfw_demo as a reference. Am I right?

    Another question: what do you suggest/use for debugging these A8-side applications? gdb? printfs? Or some other option?

    Best. 

  • The mcfw demo is just a command-line demo. It may not be very useful to go through it, apart from understanding how to get the encoded bitstream and how to send a bitstream for decoding.

    To understand the usecase, you can go through the DM816x_DVR_RDK_UseCaseGuide_16D1_DVR document under the usecase folder.

    You can also go through DVR_RDK_McFW_Link_API_Training to understand the basics and the difference between a usecase and an application.

    You are right, you will have to develop your own dvrapp and nvrapp.

    For debugging you can use either gdb or prints, whichever you find easier.

  • Dear Badri,

    Thanks again for your valuable information.

    I decided to go through the mcfw demo. I ran the demo app (dvr_rdk_demo_mcfw_api.out) with demoId=DEMO_VCAP_VENC_VDIS and the mosaicParams layoutId set to DEMO_LAYOUT_MODE_1CH. I have one camera connected to ch0 and an HDMI screen on the HDMI0 port. I expect a single-channel screen layout. The layout is correct, but I am seeing only the upper-left quarter of the camera picture on the screen.

    To solve this, I had to add the following code after line 231 in demo_vcap_venc_vdis.c:

    OSA_waitMsecs(1000);
    Vdis_setMosaicParams(VDIS_DEV_HDMI, &vdisParams.mosaicParams[0]);

    I am very new to this, but it seems the mosaic parameters were not successfully set in the earlier sections. When I remove the OSA_waitMsecs line, the code fails again in the same way. Could this be related to delays from all the system printfs?

    Best. 

  • Did you change the Demo_swMsGenerateLayout parameter to DEMO_LAYOUT_MODE_1CH to make a 1x1 layout?

    I am not sure what line 231 is, but calling Vdis_setMosaicParams after Vdis_start should work. You can check the command-line option under "Display Settings" to change the SWMS layout.

  • Dear Badri,

    Thank you for your answers and patience. I want to ask two more questions.

    If I understand correctly, with a suitable usecase VDIS accepts two inputs, from VDEC and VCAP. But how can I arrange which channel is shown in which mosaic window? For example, I want a 16-channel mosaic layout on the same screen, 8 from VCAP and 8 from VDEC. How can I do this? I hope the question is clear.

    The other question: there is an rtsp demo in the demo folder. rtsp_demo uses the rtsp_lib/librtsprx.a library, but I cannot find the sources of the library. Is it a standard library or something handmade?

    Best.

  • SWMS supports input channel to window mapping. The API is Vdis_setMosaicParams.

    The input channel numbers corresponding to capture preview and decode channels depend on your usecase data flow.

    Typically channels 0 - 15 are preview channels and 16 - 31 are decode channels.

    If you want to display 8 capture and 8 decode channels, select a 16-window SWMS layout and map ch 0 - 7 and ch 16 - 23 to the windows.
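    As a sketch of the mapping described above, the window-to-channel table can be built with a simple loop. This is a self-contained, hypothetical illustration: the decode-channel base of 16 comes from the description above and actually depends on the usecase data flow, and `chnMap` here is a plain array standing in for the `chnMap` field of the RDK's mosaic-parameter structure.

    ```c
    #include <assert.h>
    #include <stdio.h>

    /* Assumed layout, per the description above: capture preview channels
     * are 0-15 and decode channels start at 16. Both values depend on the
     * actual usecase data flow. */
    #define NUM_WINDOWS    16
    #define DECODE_CH_BASE 16

    int main(void)
    {
        int chnMap[NUM_WINDOWS];
        int w;

        /* Windows 0-7 show capture channels 0-7,
         * windows 8-15 show decode channels 16-23. */
        for (w = 0; w < NUM_WINDOWS; w++)
        {
            if (w < 8)
                chnMap[w] = w;                        /* capture preview */
            else
                chnMap[w] = DECODE_CH_BASE + (w - 8); /* decode */
        }

        assert(chnMap[0] == 0);
        assert(chnMap[7] == 7);
        assert(chnMap[8] == 16);
        assert(chnMap[15] == 23);
        printf("chnMap[8] = %d\n", chnMap[8]);
        return 0;
    }
    ```

    In the real application, such a table would be copied into the mosaic parameters before calling Vdis_setMosaicParams.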

  • Hi Badri,

    Now I am using the VSYS_USECASE_MULTICHN_PROGRESSIVE_VCAP_VDIS_VENC_VDEC usecase and mapped the channels as:

    vdMosaicParam->chnMap[0] = 16;
    vdMosaicParam->chnMap[1] = 0;

    I expect display ch[0] to come from the decoder and ch[1] from capture. The capture side is working fine, but the decode thread is failing with the following error. Do you have any suggestions about the issue?

    [host]
    mg -------------->emptyBufList->numBufs = 64

    [host] Vdec_bitsRdSendFxn: periodic print.. [m3video] Unhandled Exception:
    [m3video] Exception occurred in ThreadType_Task
    [m3video] handle: 0x3cf21ff8.
    [m3video] stack base: 0x3d956780.
    [m3video] stack size: 0x8000.
    [m3video] R0 = 0x22400444 R8 = 0xffffffff
    [m3video] R1 = 0xfda2fc00 R9 = 0x3d51f000
    [m3video] R2 = 0x3d95e548 R10 = 0xbf4c1a60
    [m3video] R3 = 0x00000140 R11 = 0x00000000
    [m3video] R4 = 0x00000000 R12 = 0x9cc3b579
    [m3video] R5 = 0x00000003 SP(R13) = 0x3d95e540
    [m3video] R6 = 0x00000004 LR(R14) = 0x000bc8a4
    [m3video] R7 = 0x000bb6f4 PC(R15) = 0x00312c36
    [m3video] PSR = 0x81000000
    [m3video] ICSR = 0x0440f803
    [m3video] MMFSR = 0x00
    [m3video] BFSR = 0x82
    [m3video] UFSR = 0x0000
    [m3video] HFSR = 0x40000000
    [m3video] DFSR = 0x00000000
    [m3video] MMAR = 0xfdaec4a4
    [m3video] BFAR = 0xfdaec4a4
    [m3video] AFSR = 0x00000000
    [m3video] Terminating Execution...

    [host]
    mg -------------->emptyBufList->numBufs = 64

    [host]
    mg -------------->emptyBufList->numBufs = 0

    [host]
    mg -------------->emptyBufList->numBufs = 0

    The decode section of my code follows:

    VdecVdis_IpcBitsCtrl gVdecVdis_obj;

    void VdecVdis_OpenFileHandles()
    {
        gVdecVdis_obj.fdRdData = open("/opt/VID_CH00.h264", O_RDONLY);
        OSA_assert(gVdecVdis_obj.fdRdData > 0);

        gVdecVdis_obj.fpRdHdr = fopen("/opt/VID_CH00.h264.hdr", "r");
        OSA_assert(gVdecVdis_obj.fpRdHdr != NULL);
    }

    void VdecVdis_bitsRdSendFullBitBufs(VCODEC_BITSBUF_LIST_S *fullBufList)
    {
        if (fullBufList->numBufs)
        {
            Vdec_putBitstreamBuffer(fullBufList);
        }
    }

    void VdecVdis_bitsRdFillEmptyBuf(VCODEC_BITSBUF_S *pEmptyBuf)
    {
        int statHdr, statData;
        int curCh;

        curCh = pEmptyBuf->chnId;

        if (gVdecVdis_obj.fpRdHdr == NULL)
            return;

        /* Read the next frame size from the .hdr file, then that many
         * bytes of bitstream data from the .h264 file */
        statHdr = fscanf(gVdecVdis_obj.fpRdHdr, "%d", &(pEmptyBuf->filledBufSize));
        OSA_assert(pEmptyBuf->filledBufSize <= pEmptyBuf->bufSize);

        statData = read(gVdecVdis_obj.fdRdData, pEmptyBuf->bufVirtAddr,
                        pEmptyBuf->filledBufSize);

        if (feof(gVdecVdis_obj.fpRdHdr) || statData != pEmptyBuf->filledBufSize)
        {
            OSA_printf(" CH%d: Reached the end of file, rewind !!!", curCh);
            clearerr(gVdecVdis_obj.fpRdHdr);

            rewind(gVdecVdis_obj.fpRdHdr);
            lseek(gVdecVdis_obj.fdRdData, 0, SEEK_SET);
            statHdr = fscanf(gVdecVdis_obj.fpRdHdr, "%d", &(pEmptyBuf->filledBufSize));

            OSA_assert(pEmptyBuf->filledBufSize <= pEmptyBuf->bufSize);
            statData = read(gVdecVdis_obj.fdRdData, pEmptyBuf->bufVirtAddr,
                            pEmptyBuf->filledBufSize);
        }
    }

    void VdecVdis_bitsRdReadData(VCODEC_BITSBUF_LIST_S *emptyBufList)
    {
        static int mg_dbg_cntr = 0;
        VCODEC_BITSBUF_S *pEmptyBuf;
        Int i;

        if (mg_dbg_cntr < 5)
        {
            OSA_printf("\nmg -------------->emptyBufList->numBufs = %d\n",
                       emptyBufList->numBufs);
            mg_dbg_cntr++;
        }

        for (i = 0; i < emptyBufList->numBufs; i++)
        {
            pEmptyBuf = &emptyBufList->bitsBuf[i];
            VdecVdis_bitsRdFillEmptyBuf(pEmptyBuf);
        }
    }

    void VdecVdis_bitsRdGetEmptyBitBufs(VCODEC_BITSBUF_LIST_S *emptyBufList)
    {
        VDEC_BUF_REQUEST_S reqInfo;

        emptyBufList->numBufs = 0;

        reqInfo.numBufs = VCODEC_BITSBUF_MAX;
        reqInfo.reqType = VDEC_BUFREQTYPE_BUFSIZE;

        Vdec_requestBitstreamBuffer(&reqInfo, emptyBufList, 0);
    }

    void *Vdec_bitsRdSendFxn(Void *prm)
    {
        VCODEC_BITSBUF_LIST_S emptyBufList;
        static Int printStatsInterval = 0;

        OSA_semWait(&gVdecVdis_obj.thrStartSem, OSA_TIMEOUT_FOREVER);
        while (FALSE == gVdecVdis_obj.thrExit)
        {
            OSA_waitMsecs(MCFW_IPCBITS_SENDFXN_PERIOD_MS);

            VdecVdis_bitsRdGetEmptyBitBufs(&emptyBufList);
            VdecVdis_bitsRdReadData(&emptyBufList);
            VdecVdis_bitsRdSendFullBitBufs(&emptyBufList);

            if ((printStatsInterval % MCFW_IPCBITS_INFO_PRINT_INTERVAL) == 0)
            {
                OSA_printf("Vdec_bitsRdSendFxn: periodic print..");
            }

            printStatsInterval++;
        }

        return NULL;
    }

  • Are you getting the M3 exception only after changing the SWMS window-to-channel mapping? Does it work otherwise?

    Do you get any decode errors reported? Which DVRRDK release are you using?

    Please share the full console logs.

  • Dear Badri,

    Thanks for the reply.

    First, I noticed that the DVR RDK API Guide's VDEC_BUF_REQUEST_S::minBufSize section says: "Minimum size of buffer's needed . This member should be set if ipcBitsOutLink was created with bufPoolPerCh was set to FALSE"

    Then I added the following code to do this:

    #define PB_DEFAULT_WIDTH  (720)
    #define PB_DEFAULT_HEIGHT (576)
    #define PB_GET_BITBUF_SIZE(width,height) (((width) * (height) * (2))/2)

    void VdecVdis_bitsRdGetEmptyBitBufs(VCODEC_BITSBUF_LIST_S *emptyBufList)
    {
        int j;
        UInt32 bitBufSize;
        VDEC_BUF_REQUEST_S reqInfo;

        emptyBufList->numBufs = 0;
        bitBufSize = PB_GET_BITBUF_SIZE(PB_DEFAULT_WIDTH, PB_DEFAULT_HEIGHT);

        reqInfo.numBufs = VCODEC_BITSBUF_MAX;
        reqInfo.reqType = VDEC_BUFREQTYPE_BUFSIZE;

        for (j = 0; j < VCODEC_BITSBUF_MAX; j++)
        {
            reqInfo.u[j].minBufSize = bitBufSize;
        }

        Vdec_requestBitstreamBuffer(&reqInfo, emptyBufList, 0);
    }

    But this was not enough; I also had to set the chnId and timestamp values of the requested buffer as follows:

    #define VDEC_VDIS_FRAME_DURATION_MS (33)

    static Void VdecVdis_setFrameTimeStamp(VCODEC_BITSBUF_S *pEmptyBuf)
    {
        static int frameCnt = 0;
        UInt64 curTimeStamp = frameCnt * VDEC_VDIS_FRAME_DURATION_MS;

        pEmptyBuf->chnId = 0;
        pEmptyBuf->lowerTimeStamp = (UInt32)(curTimeStamp & 0xFFFFFFFF);
        pEmptyBuf->upperTimeStamp = (UInt32)((curTimeStamp >> 32) & 0xFFFFFFFF);

        if (0 == frameCnt)
        {
            UInt32 displayChId;

            Vdec_mapDec2DisplayChId(VDIS_DEV_HDMI, pEmptyBuf->chnId, &displayChId);
            Vdis_setFirstVidPTS(VDIS_DEV_HDMI, displayChId, curTimeStamp);
        }
        frameCnt += 1;
    }

    Now it seems fine. Actually, I dug into the demo code to find these differences. I understand why chnId is required, but I cannot understand why the timestamp is so important.

    Best.
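    A note on the 32-bit split used for lowerTimeStamp/upperTimeStamp above: a 64-bit millisecond timestamp is carried as two 32-bit halves and can be recombined losslessly. The sketch below is a self-contained illustration of that arithmetic only; the frame count is an arbitrary value chosen so the timestamp exceeds 32 bits.

    ```c
    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* Arbitrary large frame count so the product exceeds 32 bits. */
        uint64_t curTimeStamp = (uint64_t)200000000 * 33; /* frameCnt * 33 ms */

        /* Split into two 32-bit halves, as done for the buffer fields. */
        uint32_t lower = (uint32_t)(curTimeStamp & 0xFFFFFFFFu);
        uint32_t upper = (uint32_t)((curTimeStamp >> 32) & 0xFFFFFFFFu);

        /* Recombining the halves recovers the original value exactly. */
        uint64_t rejoined = ((uint64_t)upper << 32) | lower;
        assert(rejoined == curTimeStamp);
        assert(upper == 1);
        printf("ok\n");
        return 0;
    }
    ```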

     

  • The usecase you are using sets bufPoolPerCh to FALSE.

    Once you get an empty buffer, it is mandatory that the app sets chnId and filledBufSize correctly and ensures filledBufSize is less than allocBufSize.

    Timestamp need not be set if avsync is disabled (avsync is disabled by default in the usecase you are using).

    Refer to /dvr_rdk/demos/mcfw_api_demos/mcfw_demo/demo_vcap_venc_vdec_vdis_bits_rdwr.c to see which fields should be set when feeding frames for decode.
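    The requirement above can be sketched as a simple invariant check. This is a hypothetical mirror of the relevant bitstream-buffer fields (VCODEC_BITSBUF_S in the real RDK headers); the struct, helper, and values here are illustrative assumptions, not RDK API:

    ```c
    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical stand-in for the buffer fields the app must set. */
    typedef struct {
        uint32_t chnId;         /* must be a valid decode channel */
        uint32_t filledBufSize; /* must not exceed allocBufSize   */
        uint32_t allocBufSize;
    } DemoBitsBuf;

    static int demoBufIsValid(const DemoBitsBuf *buf, uint32_t numDecCh)
    {
        return buf->chnId < numDecCh &&
               buf->filledBufSize <= buf->allocBufSize;
    }

    int main(void)
    {
        DemoBitsBuf buf = { .chnId = 0,
                            .filledBufSize = 14250,
                            .allocBufSize = 720 * 576 };
        assert(demoBufIsValid(&buf, 16));

        /* An overrun like this on the real buffers could corrupt memory
         * shared with the M3 and trigger an exception there. */
        buf.filledBufSize = buf.allocBufSize + 1;
        assert(!demoBufIsValid(&buf, 16));
        printf("ok\n");
        return 0;
    }
    ```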