Understanding of SCD implementation on DM8148

Hello,

In the current SysLink SCD example, my understanding is as follows:

1. VPSS captures the video frame and copies it into the shared memory.

2. The DSP locks that part of the shared memory until it finishes processing that frame.

3. The DSP unlocks the shared memory, and the VPSS display shows the processed video.

My doubts are as follows:

1. Until capture releases the buffer, will the DSP and display be idle?

2. Similarly, while the DSP is processing, will capture and display be idle?

If so, how can we achieve pipelining here?

  • Your understanding is not correct.

    This is how it actually works:

    1. Multiple buffers are allocated by the captureLink.

    2. The buffers are queued to the capture driver.

    3. The HDVPSS hardware will copy the frame received from the capture source to these empty buffers.

    4. Once an entire frame is captured, it is dequeued and ownership of the buffer is transferred to the next link.

         In your case this is the DSP algorithm.

    5. Capture continues to capture the next frame into the next free buffer while the DSP is processing the previous frame.

     

    There is no locking anywhere and no module will be idle waiting for another module to complete processing.
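
    In other words, capture, the DSP and the display each hold a different buffer from the same pool at any given time. Below is a minimal sketch of that rotation in plain C (hypothetical buffer indices only, not the actual Link API), just to show why no stage waits on another:

    /* Minimal illustration: with NUM_BUFS buffers in the pool, capture, DSP and
     * display each work on a different buffer during the same frame period, so
     * no stage blocks on another. Hypothetical sketch, not DVRRDK code. */
    #include <stdio.h>

    #define NUM_BUFS 4

    int main(void)
    {
        int frame;

        for (frame = 0; frame < 8; frame++)
        {
            int capIdx  = frame % NUM_BUFS;                  /* buffer capture fills now         */
            int dspIdx  = (frame + NUM_BUFS - 1) % NUM_BUFS; /* buffer DSP processes (previous)  */
            int dispIdx = (frame + NUM_BUFS - 2) % NUM_BUFS; /* buffer display shows, then frees */

            if (frame >= 2)
                printf("frame %d: capture->buf%d  dsp->buf%d  display->buf%d\n",
                       frame, capIdx, dspIdx, dispIdx);
            else
                printf("frame %d: capture->buf%d (pipeline still filling)\n", frame, capIdx);
        }
        return 0;
    }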

  • Hi Badri,

    When you say,

    3. The HDVPSS hardware will copy the frame received from the capture source to these empty buffers.

    will it be memcpy or pointer exchange?

    Further to this, I have two more questions:

    1. The important question:

    I want to validate my algorithm with known input video using FILE I/O operations. How can I achieve this in the present example?

    Any example code would be helpful.

    2. What is the difference between IpcFrames In/Out and IpcBits In/Out? Is IpcFrames used to exchange raw frames, whereas IpcBits is used to exchange bitstreams (encoded bitstreams)?

    Can we use IpcBits to exchange raw frames, like IpcFrames?

  • nandish mj said:
    will it be memcpy or pointer exchange?

    - No copy, only pointer exchange.

    nandish mj said:

    I want to validate my algorithm with known input video using FILE I/O operations. How can I achieve this in the present example?

    Any example code would be helpful.

    - You can use ipcFramesOutHost -> ipcFramesInDsp -> algLink to test the algorithm using FILE I/O. There is no ready-made use case; you will have to create one. You can refer to http://e2e.ti.com/support/dsp/davinci_digital_media_processors/f/717/p/292593/1022962.aspx#1022962, which does FileRead -> Encode -> FileWrite, and adapt it to your requirement.
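
    As a rough sketch of that chain on the HLOS side (assuming the ipcFramesOutHost link has already been created, and that one YUV420SP buffer was obtained from a shared region via Vsys_allocBuf(); the file name, width/height and the yuvVirt/yuvPhys variables are illustrative only):

    /* Rough sketch only: feed known YUV420SP frames from a file into
     * ipcFramesOutHost, which forwards them to ipcFramesInDsp -> algLink.
     * Assumed (not shown): UInt8 *yuvVirt and UInt32 yuvPhys from
     * Vsys_allocBuf(), UInt32 width/height for the frame size. */
    VIDFrame_BufList bufList;
    FILE *fin = fopen("known_input.yuv", "rb");     /* placeholder file name */

    while ((fin != NULL) && !feof(fin))
    {
        /* 1. Fill the application-owned buffer with the next frame */
        fread(yuvVirt,                  1, width * height,     fin);   /* Y plane  */
        fread(yuvVirt + width * height, 1, width * height / 2, fin);   /* UV plane */

        /* 2. Describe the buffer and queue it towards the DSP algorithm
         *    (channelNum, fid, framePitch, UV-plane pointers etc. also need
         *    to be filled in for a real use case) */
        bufList.numFrames               = 1;
        bufList.frames[0].addr[0][0]    = yuvVirt;
        bufList.frames[0].phyAddr[0][0] = (Ptr)yuvPhys;
        bufList.frames[0].frameWidth    = width;
        bufList.frames[0].frameHeight   = height;
        IpcFramesOutLink_putFullVideoFrames(SYSTEM_HOST_LINK_ID_IPC_FRAMES_OUT_0,
                                            &bufList);

        /* 3. Wait until the frame is released before reusing the buffer */
        do
        {
            OSA_waitMsecs(16);
            IpcFramesOutLink_getEmptyVideoFrames(SYSTEM_HOST_LINK_ID_IPC_FRAMES_OUT_0,
                                                 &bufList);
        } while (bufList.numFrames == 0);
    }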

     

    nandish mj said:
    2. What is the difference between IpcFrames In/Out and IpcBits In/Out? Is IpcFrames used to exchange raw frames, whereas IpcBits is used to exchange bitstreams (encoded bitstreams)?

    Refer to the links training document that is part of DVRRDK for info. Yes, IpcFrames is used to exchange video frames and IpcBits is used to exchange encoded bitstream data. You can't use IpcBits to exchange video frames.
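
    For reference, the two links carry different descriptor types, as they are consumed elsewhere in this thread (abbreviated; the surrounding buffer lists are assumed to have been fetched already):

    /* IpcFrames carries raw video frames: per-plane pointers plus geometry */
    VIDFrame_Buf *pFrame = &vidBufList.frames[0];      /* from an IpcFrames queue */
    Ptr    yPlane  = pFrame->addr[0][0];               /* Y plane (YUV420SP)      */
    Ptr    uvPlane = pFrame->addr[0][1];               /* UV plane                */
    UInt32 pitch   = pFrame->framePitch[0];

    /* IpcBits carries encoded bitstreams: one compressed frame per entry */
    VCODEC_BITSBUF_S *pBits = &bitsBufList.bitsBuf[0]; /* e.g. from Venc_getBitstreamBuffer() */
    UInt32 numBytes   = pBits->filledBufSize;          /* size of the encoded frame           */
    Bool   isKeyFrame = (pBits->frameType == VCODEC_FRAME_TYPE_I_FRAME);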

  • Hi Badri,

    I was looking at the Ti_vap.c file. I believe this is the file that transfers the captured frame to the DSP. Could you please let me know where exactly this happens, so that I can quickly replace that frame buffer with my known data to test my algorithm?

    I tried modifying Vcap_getFullVideoFrames(), but that didn't help.

    Implementing a file I/O link is taking me a lot of time.

  • ti_vcap doesn't transfer frames to the DSP. If you have captureLink connected to algLink on c674, then the frames are directly transferred from captureLink to algLink. If you want to modify the contents of the buffer, you will have to modify the algLink to replace the input buffer contents with the content you want.
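
    As a rough illustration of that last point (placeholder function; the Y/UV pointers and pitch are assumed to come from the alg link's input frame descriptor), replacing the captured data with known test data inside the DSP-side alg link could look like:

    /* Placeholder sketch: call this in the alg link's frame-processing path,
     * before the algorithm runs, to overwrite the captured frame with known
     * test data. Copy row by row so a pitch larger than the width is handled. */
    #include <string.h>

    static Void AlgLink_replaceInputFrame(UInt8 *dstY, UInt8 *dstUV, UInt32 pitch,
                                          const UInt8 *testY, const UInt8 *testUV,
                                          UInt32 width, UInt32 height)
    {
        UInt32 row;

        for (row = 0; row < height; row++)         /* Y plane                    */
            memcpy(dstY + row * pitch, testY + row * width, width);

        for (row = 0; row < height / 2; row++)     /* UV plane (YUV420SP layout) */
            memcpy(dstUV + row * pitch, testUV + row * width, width);
    }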

  • Hi Badri,


    I modified the code to the design below:

    ipcFramesOutHost -> ipcFramesOutVpss -> ipcFramesInDsp -> dspAlg

    With this, I am getting the error below:

    [c6xdsp ]  107246: Assertion @ Line: 107 in links_common/system/system_ipc_listMP.c: 0 : failed !!!

    I am attaching the code and error log here:

    #include <vpu_demo.h>
    #include "demo_scd_bits_wr.h"
    #include "vpu_get_files.h"
    #include "vpu_status_report.h"
    
    #include "vpu_streamer_cfg.h"
    #include "vpu_utils.h"
    #include "demo_running_buff.h"
    #include <signal.h>
    
    #include <mcfw/interfaces/link_api/vidframe.h>
    //#include <mcfw/interfaces/link_api/ipcLink.h>
    
    #include "ais_fileRd.h"
    
    extern Int32 IpcFramesOutLink_putFullVideoFrames(UInt32 linkId,
                                              VIDFrame_BufList *bufList);
    extern Int32 IpcFramesOutLink_getEmptyVideoFrames(UInt32 linkId,
                                               VIDFrame_BufList *bufList);
    
    
    #define FILERD_TRACE_ENABLE_FXN_ENTRY_EXIT           (0)
    #define FILERD_TRACE_INFO_PRINT_INTERVAL             (8192)
    
    
    #if FILERD_TRACE_ENABLE_FXN_ENTRY_EXIT
    #define FILERD_TRACE_FXN(str,...)                    do {                           \
                                                         static Int printInterval = 0;\
                                                         if ((printInterval % FILERD_TRACE_INFO_PRINT_INTERVAL) == 0) \
                                                         {                                                          \
                                                             OSA_printf("TI_FILERD:%s function:%s",str,__func__);     \
                                                             OSA_printf(__VA_ARGS__);                               \
                                                         }                                                          \
                                                         printInterval++;                                           \
                                                       } while (0)
    #define FILERD_TRACE_FXN_ENTRY(...)                  FILERD_TRACE_FXN("Entered",__VA_ARGS__)
    #define FILERD_TRACE_FXN_EXIT(...)                   FILERD_TRACE_FXN("Leaving",__VA_ARGS__)
    #else
    #define FILERD_TRACE_FXN_ENTRY(...)
    #define FILERD_TRACE_FXN_EXIT(...)
    #endif
    
    ///////////FILE IO//////////////
    
    int ENC_CH = 1;
    AIS_fileRD_IpcFramesBufObj  frmObj[8];
    ReadYUV_Config  gReadYUVConfig;
    
    static Void fileRd_copyVidFrameInfoLink2McFw(VIDEO_FRAMEBUF_S *dstBuf,
                                               VIDFrame_Buf    *srcBuf)
    {
        Int i,j;
        OSA_assert(VIDEO_MAX_FIELDS == VIDFRAME_MAX_FIELDS);
        OSA_assert(VIDEO_MAX_PLANES == VIDFRAME_MAX_PLANES);
    
        for (i = 0; i < VIDEO_MAX_FIELDS; i++)
        {
            for (j = 0; j < VIDEO_MAX_PLANES; j++)
            {
                dstBuf->addr[i][j] = srcBuf->addr[i][j];
                dstBuf->phyAddr[i][j] = srcBuf->phyAddr[i][j];
            }
        }
        dstBuf->channelNum  = srcBuf->channelNum;
        dstBuf->fid         = srcBuf->fid;
        dstBuf->frameWidth  = srcBuf->frameWidth;
        dstBuf->frameHeight = srcBuf->frameHeight;
        dstBuf->linkPrivate = srcBuf->linkPrivate;
        dstBuf->timeStamp   = srcBuf->timeStamp;
        dstBuf->framePitch[0] = srcBuf->framePitch[0];
        dstBuf->framePitch[1] = srcBuf->framePitch[1];
    
        FILERD_TRACE_FXN_EXIT("VidFrameInfo:"
                             "virt[0][0]:%p,"
                             "phy[0][0]:%p,"
                             "channelNum:%d,"
                             "fid:%d,"
                             "frameWidth:%d,"
                             "frameHeight:%d,"
                             "framePitch[0]:%d,"
                             "framePitch[1]:%d,"
                             "timeStamp:%d,",
                             dstBuf->addr[0][0],
                             dstBuf->phyAddr[0][0],
                             dstBuf->channelNum,
                             dstBuf->fid,
                             dstBuf->frameWidth,
                             dstBuf->frameHeight,
                             dstBuf->framePitch[0],
                             dstBuf->framePitch[1],
                             dstBuf->timeStamp);
    }
    
    
    Int32 fileRd_getEmptyVideoFrames(VIDEO_FRAMEBUF_LIST_S *pFrameBufList, UInt32 timeout)
    {
        VIDFrame_BufList  vidBufList;
        VIDFrame_Buf     *pInBuf;
        VIDEO_FRAMEBUF_S *pOutBuf;
        UInt32 i;
    
        FILERD_TRACE_FXN_ENTRY();
        pFrameBufList->numFrames = 0;
        vidBufList.numFrames = 0;
        IpcFramesOutLink_getEmptyVideoFrames(SYSTEM_HOST_LINK_ID_IPC_FRAMES_OUT_0,
                                             &vidBufList);
    
        pFrameBufList->numFrames = vidBufList.numFrames;
        for (i = 0; i < vidBufList.numFrames; i++)
        {
            pOutBuf = &pFrameBufList->frames[i];
            pInBuf = &vidBufList.frames[i];
    
            fileRd_copyVidFrameInfoLink2McFw(pOutBuf,pInBuf);
        }
    
        FILERD_TRACE_FXN_EXIT("NumFrames Received:%d",pFrameBufList->numFrames);
        return 0;
    }
    
    
    static Void fileRd_copyVidFrameInfoMcFw2Link(VIDFrame_Buf *dstBuf,
                                               VIDEO_FRAMEBUF_S    *srcBuf)
    {
        Int i,j;
        OSA_assert(VIDEO_MAX_FIELDS == VIDFRAME_MAX_FIELDS);
        OSA_assert(VIDEO_MAX_PLANES == VIDFRAME_MAX_PLANES);
    
        for (i = 0; i < VIDEO_MAX_FIELDS; i++)
        {
            for (j = 0; j < VIDEO_MAX_PLANES; j++)
            {
                dstBuf->addr[i][j] = srcBuf->addr[i][j];
                dstBuf->phyAddr[i][j] = srcBuf->phyAddr[i][j];
            }
        }
        dstBuf->channelNum  = srcBuf->channelNum;
        dstBuf->fid         = srcBuf->fid;
        dstBuf->frameWidth  = srcBuf->frameWidth;
        dstBuf->frameHeight = srcBuf->frameHeight;
        dstBuf->linkPrivate = srcBuf->linkPrivate;
        dstBuf->timeStamp   = srcBuf->timeStamp;
        dstBuf->framePitch[0] = srcBuf->framePitch[0];
        dstBuf->framePitch[1] = srcBuf->framePitch[1];
    
        FILERD_TRACE_FXN_EXIT("VidFrameInfo:"
                             "virt[0][0]:%p,"
                             "phy[0][0]:%p,"
                             "channelNum:%d,"
                             "fid:%d,"
                             "frameWidth:%d,"
                             "frameHeight:%d,"
                             "framePitch[0]:%d,"
                             "framePitch[1]:%d,"
                             "timeStamp:%d,",
                             dstBuf->addr[0][0],
                             dstBuf->phyAddr[0][0],
                             dstBuf->channelNum,
                             dstBuf->fid,
                             dstBuf->frameWidth,
                             dstBuf->frameHeight,
                             dstBuf->framePitch[0],
                             dstBuf->framePitch[1],
                             dstBuf->timeStamp);
    }
    
    /**
        \brief Send filled video gBuffers to framework for algo to process
    
    */
    Int32 fileRd_putFullVideoFrames(VIDEO_FRAMEBUF_LIST_S *pFrameBufList)
    {
        VIDEO_FRAMEBUF_S *pSrcBuf;
        VIDFrame_Buf     *pDstBuf;
        VIDFrame_BufList  vidBufList;
        UInt32 i;
        Int status = 0;
    
        FILERD_TRACE_FXN_ENTRY("Num bufs put:%d",pFrameBufList->numFrames);
        vidBufList.numFrames = pFrameBufList->numFrames;
        for (i = 0; i < vidBufList.numFrames; i++)
        {
            pSrcBuf = &pFrameBufList->frames[i];
            pDstBuf = &vidBufList.frames[i];
            fileRd_copyVidFrameInfoMcFw2Link(pDstBuf,pSrcBuf);
        }
        if (vidBufList.numFrames)
        {       
    #if 0
            status =
            IpcFramesOutLink_putFullVideoFrames(gVdisModuleContext.ipcFramesOutHostId,
                                                &vidBufList);
    #else
       status =
       IpcFramesOutLink_putFullVideoFrames(SYSTEM_HOST_LINK_ID_IPC_FRAMES_OUT_0,
                                           &vidBufList);
    #endif
        }
        FILERD_TRACE_FXN_ENTRY("VIDFrame release status:%d",status);
        return 0;
    }
    
    static Void IpcFramesFillBufInfo(VIDEO_FRAMEBUF_LIST_S *bufList,
                                     AIS_fileRD_IpcFramesBufObj  frameObj[],
                                     UInt32  numFrames)//frameObj num
    {
        Int i,j;
       //UInt64 curTimeStamp = 0;
        bufList->numFrames = 0;
       for(i = 0; i < numFrames; i++)
       {
          if (frameObj[i].refCnt == 0)
            {   
             for(j=0; j<frameObj[i].numCh; j++)
             {
                // Send the same buffer to all enabled channels
                bufList->frames[bufList->numFrames].addr[0][0] =  frameObj[i].bufVirt;
                bufList->frames[bufList->numFrames].phyAddr[0][0] = (Ptr)frameObj[i].bufPhy;
                
                bufList->frames[bufList->numFrames].addr[0][1] =  frameObj[i].bufVirt + frameObj[i].curWidth * frameObj[i].curHeight;
                bufList->frames[bufList->numFrames].phyAddr[0][1] = (Ptr)frameObj[i].bufPhy + frameObj[i].curWidth * frameObj[i].curHeight;
                
                bufList->frames[bufList->numFrames].frameWidth =  frameObj[i].curWidth;
                bufList->frames[bufList->numFrames].frameHeight =  frameObj[i].curHeight;
                bufList->frames[bufList->numFrames].fid = VIDEO_FID_TYPE_FRAME;
                bufList->frames[bufList->numFrames].channelNum = i;
                bufList->frames[bufList->numFrames].framePitch[0] = frameObj[i].curWidth;
                bufList->frames[bufList->numFrames].framePitch[1] = frameObj[i].curWidth;
                bufList->frames[bufList->numFrames].framePitch[2] = 0;
                bufList->frames[bufList->numFrames].timeStamp  = 0;
                bufList->numFrames++;
             }
             frameObj[i].refCnt = 1;
            }   
       }
       
    }
    
    
    static Void  IpcFramesInitFrame(AIS_fileRD_IpcFramesBufObj *frameObj,
                             Vsys_AllocBufInfo        *bufInfo,
                             UInt32                   maxWidth,
                             UInt32                   maxHeight)
    {
        frameObj->bufPhy  = (UInt32)bufInfo->physAddr;
        frameObj->bufVirt = bufInfo->virtAddr;
        frameObj->maxWidth = maxWidth;
        frameObj->maxHeight = maxHeight;
        frameObj->curWidth = maxWidth;
        frameObj->curHeight = maxHeight;
        frameObj->numCh     = 1;
        //frameObj->numCh     = ENC_CH;
        frameObj->refCnt    = 0;
    }
    
    static Void IpcFramesInitFrameObj(AIS_fileRD_IpcFramesBufObj frameObj[],UInt32 numFrames)
    {
        Int i;
        Vsys_AllocBufInfo bufInfo;
        Int32 status;
        UInt32 frameSize;
        UInt32 minChWidth = FRWIDTH ;
        UInt32 minChHeight = FRHEIGHT;
       
       
        frameSize = MCFW_IPCBITS_GET_YUV420_FRAME_SIZE(minChWidth,minChHeight);
        memset(frameObj,0,sizeof(AIS_fileRD_IpcFramesBufObj) * numFrames);
        for (i = 0; i < numFrames;i++)
        {
            status = Vsys_allocBuf(MCFW_IPCFRAMES_SRID,frameSize,MCFW_IPCFRAMES_FRAME_BUF_ALIGN,&bufInfo);
            if (ERROR_NONE == status)
            {
                IpcFramesInitFrame(&frameObj[i],&bufInfo,minChWidth, minChHeight);
            }
        }
    }
    
    static Void  IpcFramesDeInitFrameObj(AIS_fileRD_IpcFramesBufObj frameObj[],
                                         UInt32 numFrames)
    {
        Int i;
        Int status;
    
        for (i = 0; i < numFrames;i++)
        {
            if (frameObj[i].bufVirt)
            {
                status =
                Vsys_freeBuf(MCFW_IPCFRAMES_SRID,
                             frameObj[i].bufVirt,
                             MCFW_IPCBITS_GET_YUV420_FRAME_SIZE(frameObj[i].maxWidth,
                                                                frameObj[i].maxHeight));
                OSA_assert(status ==0);
            }
        }
    }
    
    static Void   IpcFramesFreeFrameBuf(VIDEO_FRAMEBUF_LIST_S *bufList,
                               AIS_fileRD_IpcFramesBufObj      frameObj[],
                               UInt32  numFrames)
    {
        Int i,j;
    
        for (i = 0; i < bufList->numFrames; i++)
        {
            for (j = 0; j < numFrames; j++)
            {
                if (frameObj[j].bufPhy ==(UInt32)bufList->frames[i].phyAddr[0][0])
                {
                OSA_assert(frameObj[j].refCnt > 0); 
                    frameObj[j].refCnt--;
                }    
            }
        }
    }
    
    
    void *FramesRdSendFxn(Void * prm)
    {
       VIDEO_FRAMEBUF_LIST_S bufList;
       Int status;
       int i = 0;
    
       printf("Start FramesRdSendFxn()\n");
    
       OSA_semWait(&gReadYUVConfig.thrStartSem,OSA_TIMEOUT_FOREVER);
    
       printf("OSA_semWait() \n");
       
       IpcFramesInitFrameObj(frmObj, ENC_CH);//ENC_CH
    
       printf("FramesRdSendFxn() - Test 1\n");
       
       while (FALSE == gReadYUVConfig.thrExit)
       {
          printf("FramesRdSendFxn() - Test 2\n");
    
          IpcFramesFillBufInfo(&bufList,frmObj, ENC_CH);//ENC_CH
          printf("FramesRdSendFxn() - Test 3\n");
    
          if(bufList.numFrames)
          {
              //printf("0rfcnt is %d\n", frmObj[1].refCnt);
              printf("Start File read\n");
              
             for(i=0; i<ENC_CH; i++)
             {
                fread((unsigned char *)(bufList.frames[i].addr[0][0]), 1, gReadYUVConfig.width * gReadYUVConfig.height, gReadYUVConfig.fin[i]);
                fread((unsigned char *)(bufList.frames[i].addr[0][1]), 1, gReadYUVConfig.width * gReadYUVConfig.height * 0.5, gReadYUVConfig.fin[i]);
                if(feof(gReadYUVConfig.fin[i]))
                {
                   fseek(gReadYUVConfig.fin[i], 0, SEEK_SET);
                }
             }
             printf("feed buflist to encode\n");
             status = fileRd_putFullVideoFrames(&bufList);
                OSA_assert(0 == status);
          }
          else
              printf("fill buffer info failed\n");
              
    
          OSA_waitMsecs(33);
            
          status =  fileRd_getEmptyVideoFrames(&bufList,0);
          OSA_assert(0 == status);
          if (bufList.numFrames)
          {
             IpcFramesFreeFrameBuf(&bufList,frmObj, ENC_CH);//ENC_CH
             //printf("1rfcnt is %d\n", frmObj[0].refCnt);
          }
           else
             printf("get empty video frames failed\n");
           
       }
    
       printf("End FramesRdSendFxn()\n");
    
       return NULL;   
    }
    
    
    char *infilename[8] = {"/home/root/egtest01_640x480.yuv", "2sp.yuv", "3sp.yuv", "3sp.yuv", "2sp.yuv", "1sp.yuv", "1sp.yuv", "2sp.yuv"};
    
    Void ReadYUVInit()
    {
       int status;
       int i;
    
       printf("Start ReadYUVInit()\n");   
       
       memset(&gReadYUVConfig,0,sizeof(gReadYUVConfig));
       for(i=0; i<ENC_CH; i++)
       {
          gReadYUVConfig.fin[i] = fopen(infilename[i],"rb");
          if(gReadYUVConfig.fin[i] == NULL)
          {
             printf("Can't open yuv file %d\n", i);
          }
       }
       gReadYUVConfig.fileNum = ENC_CH;
       gReadYUVConfig.height = FRHEIGHT;
       gReadYUVConfig.width = FRWIDTH;
    
       gReadYUVConfig.thrExit = FALSE;
    
       status = OSA_semCreate(&gReadYUVConfig.thrStartSem,10,0);
        OSA_assert(status==OSA_SOK);
    
        status = OSA_thrCreate(&gReadYUVConfig.thrHandle,
                FramesRdSendFxn,
                MCFW_IPCFRAMES_SENDFXN_TSK_PRI,
                MCFW_IPCFRAMES_SENDFXN_TSK_STACK_SIZE,
                &gReadYUVConfig);
    
        OSA_assert(status==OSA_SOK);
    
       printf("End ReadYUVInit()\n");   
        
       return ;
    }
    
    Void ReadYUVdeInit()
    {   
       gReadYUVConfig.thrExit = TRUE;
       OSA_waitMsecs(1000);
       if(gReadYUVConfig.fin != NULL)
          fclose(gReadYUVConfig.fin[0]);
    
       IpcFramesDeInitFrameObj(frmObj, 0);
    
       return ;
    }
    
    ///////////FILE IO end//////////////
    
    
    #include "multich_common.h"
    
     
    static SystemVideo_Ivahd2ChMap_Tbl systemVid_encDecIvaChMapTbl =
    {
        .isPopulated = 1,
        .ivaMap[0] =
        {
            .EncNumCh  = 2,
            .EncChList = {0, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 , 0, 0},
            .DecNumCh  = 0,
            .DecChList = {0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0},
        },
        .ivaMap[1] =
        {
            .EncNumCh  = 2,
            .EncChList = {1, 3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 , 0, 0},
            .DecNumCh  = 0,
            .DecChList = {0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0},
        },
        .ivaMap[2] =
        {
            .EncNumCh  = 8,
            .EncChList = {4, 5, 6, 7, 8, 9, 10, 11, 0, 0, 0, 0, 0, 0 , 0, 0},
            .DecNumCh  = 0,
            .DecChList = {0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0},
        },
    };
    
    
    #define MAX_DEC_OUT_FRAMES_PER_CH                 (5)
    #define FILE_SD_IPCFRAMEEXPORT_FRAME_WIDTH        (320)
    #define FILE_SD_IPCFRAMEEXPORT_FRAME_HEIGHT       (240)
    
    
    static Void SetIpcFramesOutInQueInfo(System_LinkQueInfo *inQueInfo)
    {
        Int i;
    
        inQueInfo->numCh = gVencModuleContext.vencConfig.numPrimaryChn;
        for (i = 0; i < inQueInfo->numCh; i++)
        {
            inQueInfo->chInfo[i].bufType    = SYSTEM_BUF_TYPE_VIDFRAME;
            inQueInfo->chInfo[i].dataFormat = SYSTEM_DF_YUV420SP_UV;
            inQueInfo->chInfo[i].memType    = SYSTEM_MT_NONTILEDMEM;
            inQueInfo->chInfo[i].scanFormat = SYSTEM_SF_PROGRESSIVE;
            inQueInfo->chInfo[i].startX     = 0;
            inQueInfo->chInfo[i].startY     = 0;
            inQueInfo->chInfo[i].width      = FILE_SD_IPCFRAMEEXPORT_FRAME_WIDTH;
            inQueInfo->chInfo[i].height     = FILE_SD_IPCFRAMEEXPORT_FRAME_HEIGHT;
            inQueInfo->chInfo[i].pitch[0]   = FILE_SD_IPCFRAMEEXPORT_FRAME_WIDTH;
            inQueInfo->chInfo[i].pitch[1]   = FILE_SD_IPCFRAMEEXPORT_FRAME_WIDTH;        
            inQueInfo->chInfo[i].pitch[2]   = 0;
        }
    }
    
    Void VPU_Create_AIS_Main_Demo()
    {
       IpcFramesOutLinkHLOS_CreateParams   ipcFramesOutHostPrm;
       IpcFramesOutLinkRTOS_CreateParams   ipcFramesOutVpssPrm[2];
       IpcFramesInLinkRTOS_CreateParams    ipcFramesInDspPrm[2];
       AlgLink_CreateParams                dspAlgPrm[2];
       IpcLink_CreateParams                ipcOutVpssPrm;
       DisplayLink_CreateParams            displayPrm[2];
       
       //Bool   enableOsdAlgLink           = gVsysModuleContext.vsysConfig.enableOsd;
       //Bool   enableTIVA_AlgLink         = gVsysModuleContext.vsysConfig.enableStab;
       //Bool   enableScdAlgLink           = gVsysModuleContext.vsysConfig.enableScd;
       UInt32 noOfDSPAlgo = 1;
       UInt32 i;
    
       MULTICH_INIT_STRUCT(IpcFramesOutLinkHLOS_CreateParams, ipcFramesOutHostPrm);
    
       for (i = 0; i < noOfDSPAlgo;i++)
       {   
          MULTICH_INIT_STRUCT(IpcFramesOutLinkRTOS_CreateParams,ipcFramesOutVpssPrm[i]);
          MULTICH_INIT_STRUCT(IpcFramesInLinkRTOS_CreateParams,ipcFramesInDspPrm[i]);
          MULTICH_INIT_STRUCT(AlgLink_CreateParams, dspAlgPrm[i]);
       }
    
       MULTICH_INIT_STRUCT(IpcLink_CreateParams           ,ipcOutVpssPrm);
    
       for (i = 0; i < VDIS_DEV_MAX; i++)
       {
          MULTICH_INIT_STRUCT(DisplayLink_CreateParams,displayPrm[i]);
       }
        
       MultiCh_detectBoard();
    
       System_linkControl(SYSTEM_LINK_ID_M3VPSS,SYSTEM_M3VPSS_CMD_RESET_VIDEO_DEVICES,NULL,0,TRUE);
    
       System_linkControl(SYSTEM_LINK_ID_M3VIDEO,SYSTEM_COMMON_CMD_SET_CH2IVAHD_MAP_TBL,&systemVid_encDecIvaChMapTbl,
             sizeof(SystemVideo_Ivahd2ChMap_Tbl),TRUE);
       
       gVdisModuleContext.ipcFramesOutHostId     = SYSTEM_HOST_LINK_ID_IPC_FRAMES_OUT_0;
       gVcapModuleContext.ipcFramesOutVpssId[0]  = SYSTEM_VPSS_LINK_ID_IPC_FRAMES_OUT_0;   
       gVcapModuleContext.ipcFramesInDspId[0]    = SYSTEM_DSP_LINK_ID_IPC_FRAMES_IN_0;
       gVcapModuleContext.dspAlgId[0]            = SYSTEM_LINK_ID_ALG_0  ;
       gVdisModuleContext.displayId[2]           = SYSTEM_LINK_ID_DISPLAY_2; 
       
       // 1. Set the IpcFramesOutLinkHLOS create parameters
       ipcFramesOutHostPrm.baseCreateParams.noNotifyMode              = FALSE;
       ipcFramesOutHostPrm.baseCreateParams.notifyNextLink            = TRUE;
       ipcFramesOutHostPrm.baseCreateParams.notifyPrevLink            = FALSE;
       ipcFramesOutHostPrm.baseCreateParams.numOutQue                 = 1;
       ipcFramesOutHostPrm.baseCreateParams.outQueParams[0].nextLink  = gVcapModuleContext.ipcFramesOutVpssId[0];
       SetIpcFramesOutInQueInfo(&ipcFramesOutHostPrm.inQueInfo);
    
       ipcFramesOutVpssPrm[0].baseCreateParams.inQueParams.prevLinkId       = gVdisModuleContext.ipcFramesOutHostId;
       ipcFramesOutVpssPrm[0].baseCreateParams.inQueParams.prevLinkQueId    = 0;
       ipcFramesOutVpssPrm[0].baseCreateParams.notifyPrevLink               = TRUE;
       
       ipcFramesOutVpssPrm[0].baseCreateParams.numOutQue                 = 1;
       ipcFramesOutVpssPrm[0].baseCreateParams.outQueParams[0].nextLink  = gVdisModuleContext.displayId[2];
       ipcFramesOutVpssPrm[0].baseCreateParams.notifyNextLink            = TRUE;
       
       ipcFramesOutVpssPrm[0].baseCreateParams.processLink               = gVcapModuleContext.ipcFramesInDspId[0];
       ipcFramesOutVpssPrm[0].baseCreateParams.notifyProcessLink         = TRUE;
       ipcFramesOutVpssPrm[0].baseCreateParams.noNotifyMode              = FALSE;
    
    
       ipcFramesInDspPrm[0].baseCreateParams.inQueParams.prevLinkId      = gVcapModuleContext.ipcFramesInDspId[0];
       ipcFramesInDspPrm[0].baseCreateParams.inQueParams.prevLinkQueId   = 0;
       ipcFramesInDspPrm[0].baseCreateParams.numOutQue                   = 1;
       ipcFramesInDspPrm[0].baseCreateParams.outQueParams[0].nextLink    = gVcapModuleContext.dspAlgId[0];
       ipcFramesInDspPrm[0].baseCreateParams.notifyPrevLink              = TRUE;
       ipcFramesInDspPrm[0].baseCreateParams.notifyNextLink              = TRUE;
       ipcFramesInDspPrm[0].baseCreateParams.noNotifyMode                = FALSE;
       
       dspAlgPrm[0].inQueParams.prevLinkId       = gVcapModuleContext.ipcFramesInDspId[0];
       dspAlgPrm[0].inQueParams.prevLinkQueId    = 0;
       dspAlgPrm[0].enableRunBuff                = FALSE;
       dspAlgPrm[0].standard                     = SYSTEM_STD_D1;
    
       
       dspAlgPrm[0].tivaCreateParams.maxWidth          = ALG_LINK_TIVA_MAX_FRAME_WIDTH;
       dspAlgPrm[0].tivaCreateParams.maxHeight         = ALG_LINK_TIVA_MAX_FRAME_HEIGHT;
       dspAlgPrm[0].tivaCreateParams.maxStride         = ALG_LINK_TIVA_MAX_FRAME_WIDTH;
       dspAlgPrm[0].tivaCreateParams.useStab           = 0;
       dspAlgPrm[0].tivaCreateParams.stabProcessLevel  = ALG_LINK_TIVA_PROCESS_LEVEL;
       dspAlgPrm[0].tivaCreateParams.motionModelType   = 4;//ALG_LINK_MMODEL_AFFINE;
       dspAlgPrm[0].tivaCreateParams.vs_ver_crop_size  = 0;
       dspAlgPrm[0].tivaCreateParams.vs_hor_crop_size  = 0;
       
       dspAlgPrm[0].tivaCreateParams.useTraker = TRUE;
       dspAlgPrm[0].tivaCreateParams.tRoiX     = 200;
       dspAlgPrm[0].tivaCreateParams.tRoiY     = 200;
       dspAlgPrm[0].tivaCreateParams.tRoiW     = 20;
       dspAlgPrm[0].tivaCreateParams.tRoiH     = 20;
    
       displayPrm[2].numInputQueues                 = 1;
       displayPrm[2].activeQueue                    = 0;
       displayPrm[2].inQueParams[0].prevLinkId      = gVcapModuleContext.ipcFramesOutVpssId[0];
       displayPrm[2].inQueParams[0].prevLinkQueId   = 0;
       displayPrm[2].displayRes                     = VSYS_STD_NTSC;
       
    
       //if(enableOsdAlgLink || enableScdAlgLink || enableTIVA_AlgLink)
       {
       //printf("enableOsdAlgLink : %d\n", gVsysModuleContext.vsysConfig.enableOsd);
          for(i=0;i<noOfDSPAlgo;i++)
          {
             System_linkCreate(gVdisModuleContext.ipcFramesOutHostId, &ipcFramesOutHostPrm, sizeof(ipcFramesOutHostPrm));
             System_linkCreate(gVcapModuleContext.ipcFramesOutVpssId[i], &ipcFramesOutVpssPrm[i], sizeof(ipcFramesOutVpssPrm[i]));
             System_linkCreate(gVcapModuleContext.ipcFramesInDspId[i], &ipcFramesInDspPrm[i], sizeof(ipcFramesInDspPrm[i]));
             System_linkCreate(gVcapModuleContext.dspAlgId[i] , &dspAlgPrm[i], sizeof(dspAlgPrm[i]));
          }
       }
       
       System_linkCreate(gVdisModuleContext.displayId[2], &displayPrm[2], sizeof(displayPrm[2]));
    
       MultiCh_memPrintHeapStatus();    
    }
    
    Void VPU_Delete_AIS_Main_Demo()
    {
       Bool   enableOsdAlgLink = gVsysModuleContext.vsysConfig.enableOsd;
       Bool   enableTIVA_AlgLink = gVsysModuleContext.vsysConfig.enableStab;
       Bool   enableScdAlgLink = gVsysModuleContext.vsysConfig.enableScd;
       UInt32 noOfDSPAlgo = 1;
       Int32 i;
    
       gVdisModuleContext.displayId[2] = SYSTEM_LINK_ID_DISPLAY_2; // OFF CHIP HDMI
    
       if(gVsysModuleContext.vsysConfig.enableOsd)
       {
          System_linkDelete(gVcapModuleContext.dspAlgId[0]);
       }
    
       System_linkDelete(gVdisModuleContext.ipcFramesOutHostId);
    
       if(enableOsdAlgLink || enableScdAlgLink || enableTIVA_AlgLink)
       {
          for(i=0;i<noOfDSPAlgo;i++)
          {
             System_linkDelete(gVcapModuleContext.ipcFramesOutVpssId[i] );
             System_linkDelete(gVcapModuleContext.ipcFramesInDspId[i]);
          }
       }
    
       System_linkDelete(gVdisModuleContext.displayId[2]);
    
       /* Print the HWI, SWI and all tasks load */
       /* Reset the accumulated timer ticks */
       MultiCh_prfLoadCalcEnable(FALSE, TRUE, FALSE);
    }
    
    
    
    
    
    /******************************************************************************
    * Copyright (c) 2013 eInfochips - All rights reserved.
    *
    * This software is authored by eInfochips and is eInfochips' intellectual
    * property, including the copyrights in all countries in the world.
    * This software is provided under a license to use only with all other rights,
    * including ownership rights, being retained by eInfochips.
    *
    * This file may not be distributed, copied, or reproduced in any manner,
    * electronic or otherwise, without the written consent of eInfochips.
    *****************************************************************************/
    
    /******************************************************************************/
    /*! \file   ais_main_demo.c
     *
     *  \brief  This file has functional APIs for initialization, deinitialization and
     *       processing thread \n
     *                             \n
     *          SCD is disabled explicitly because TI's SCD algorithm works with CIF only,
     *          whereas this application supports D1 and HD.
     *          Refer to the SCD demo in order to integrate AIS's SCD algorithm in this application.
     *
     *
     * History: \n
     *          MAR/13/2013, eInfochips,   Created the file \n
     *          APR/08/2013, eInfochips,   Started adding support of meta data with recording. \n
     *          APR/16/2013, eInfochips,   Added support of recording based on camera switching. \n
     *          APR/21/2013, eInfochips,   Added support of keeping some storage as buffer space,
     *                                     due to this, recording will not be stopped unexpectedly.\n
     *          MAY/02/2013, eInfochips,   Solved issue of restarting recording after enough storage is available. \n
     *          MAY/16/2013, eInfochips,   Added restriction of recording; it will only be done on the SD card. \n
     *          \n
     */
    /******************************************************************************/
    #include <vpu_demo.h>
    #include "demo_scd_bits_wr.h"
    #include "vpu_get_files.h"
    #include "vpu_status_report.h"
    
    #include "vpu_streamer_cfg.h"
    #include "vpu_utils.h"
    #include "demo_running_buff.h"
    #include <signal.h>
    
    #include "ais_fileRd.h"
    
    /*! \var gRecStrm_ctrl 
        \brief  Global structure containing recorder streamer and other configurations. 
    */
    VrecVstrm_Ctrl gRecStrm_ctrl;
    
    UInt16 frameStreamCount;
    UInt16 prevFrameCount;
    UInt16 framerate = 0;
    
    /*! \var gRunBufInfo
        \brief A global variable for running buffer support.
    */
    Running_Buffer_Info gRunBufInfo = {
        .pPhysicalAddr  = NULL,
        .pVirtualAddr   = NULL,
        .totalSize      = 0
    };
    
    /*******************************************************************************/
    /*! \fn     Int32 StreamingFPS (U8 *fps)
     *
     *   \brief This function get VPU camera capture FPS.
     *
     *   \param  *fps - writes FPS of current streaming camera.
     *
     * \return  I32, VPU_OK on success, otherwise error code will be returned(as in vpu_values.h)..
     *   
     */
    /*******************************************************************************/
    Int32 StreamingFPS (U8 *fps)
    {
       if(fps != NULL)
          *fps = (U8)framerate;
       
       return 0;
    }
    
    
    /*******************************************************************************/
    /*! \fn     Int32 AIS_Main_Demo_resetStats
     *
     *   \brief Resets statistics of demo
    
     *   \param None
     *
     *   \return VPU_OK
     */
    /*******************************************************************************/
    Int32 AIS_Main_Demo_resetStats()
    {
        UInt32 chId;
        VcapVenc_ChInfo *pChInfo;
    
        for(chId=0; chId<VENC_CHN_MAX; chId++)
        {
            pChInfo = &gRecStrm_ctrl.chInfo[chId];
    
            pChInfo->totalDataSize = 0;
            pChInfo->numKeyFrames = 0;
            pChInfo->numFrames = 0;
            pChInfo->maxWidth = 0;
            pChInfo->minWidth = 0xFFF;
            pChInfo->maxHeight= 0;
            pChInfo->minHeight= 0xFFF;
            pChInfo->maxLatency= 0;
            pChInfo->minLatency= 0xFFF;
    
        }
    
        gRecStrm_ctrl.statsStartTime = OSA_getCurTimeInMsec();
        return VPU_OK;
    }
    
    /*******************************************************************************/
    /*! \fn     Void *AIS_Monitor_Thread (Void *pPrm)
     *
     *   \brief This is the thread to monitor the available storage space.
     *
     *   \param Void *pPrm: Unused
     *
     *   \return NULL when the thread is exited 
     */
    /*******************************************************************************/
    Void *AIS_Monitor_Thread (Void *pPrm)
    {
       I32   status = VPU_OK;
       U64 spaceLeft = 0;
       Bool restartRecIfSpacAvail = FALSE;
       UINT32 tmpMinSizeInBytes = 10485760;  /* 10MB */
       I32 recStoped = 0;
    
       printf ("AIS_Monitor_Thread created\n");
    
       while (!gRecStrm_ctrl.exitRecStrmTd)
       {
          /* Get free space from SD card */
    
          if ( (VPU_DOWNLOAD_RUNNING == RecordingDownloadStatus(-1)) || 
              (VPU_CLEAR_RUNNING == RecordingClearStatus(-1)) || 
              ((gRecStrm_ctrl.autoRecFlag != VPU_REC_AUTO_START) && 
                (gRecStrm_ctrl.prevRecCmdCfg.recording != VPU_START_REC) ) )
          {
             sleep(1);
             continue;
          }
       
          status = GetFreeSpace(VPU_SD_MOUNT_PT,&spaceLeft);
          if (status != VPU_OK)
          {
             printf ("Error [%d] in getting free space at '%s' location \n", status,
                      VPU_SD_MOUNT_PT);
    
             restartRecIfSpacAvail = FALSE;
             if(!recStoped)
             {
                if (gRecStrm_ctrl.recordTelemetryOnly == TRUE)
                {
                   AP_VPU_RecordTelemetryOnly (FALSE);
                   gRecStrm_ctrl.recordTelemetryOnly = TRUE;
                }
                else
                {
                    /* If configured for both channel recording,then stop both channels */
                   if (gRecStrm_ctrl.recordTelemetryOnly == FALSE)
                   {
                      AP_VPU_StartStopRecording (RECR_A, FALSE);
                        AP_VPU_StartStopRecording (RECR_B, FALSE);
                   }                                
                    /* If configured for selected channel recording,then stop selected channel */
                    else
                   {
                      /* If selected channel is B, then stop recording of B */
                        if (gRecStrm_ctrl.activeChn == CHANNEL_B)
                        {
                           AP_VPU_StartStopRecording (RECR_B, FALSE);
                       }
                       /* If selected channel is A, then stop recording of A */
                      else
                       {       
                            AP_VPU_StartStopRecording (RECR_A, FALSE);
                        }            
                    }
                }
                recStoped = 1;    
             }
               printf ("[VPU INFO] SD card not mounted or inserted, Recording can not be started without SD storage\n");
              sleep(3);
             continue;
          }
          else
            {
             if(recStoped)
                 restartRecIfSpacAvail = TRUE;
             recStoped = 0;
          }
    
          /* If enough space is not available for storage of MPEG2TS file */
          if (spaceLeft <= tmpMinSizeInBytes)
          {
             printf ("Free space %llu : Margin: %d \n", spaceLeft, tmpMinSizeInBytes);
             restartRecIfSpacAvail = TRUE; 
    
             /* If video_overflow setting is to stop recording if storage space is full */
             if (gRecStrm_ctrl.prevRecStrmCfg.vid_ovrflw == VPU_STOP_RECORDING)
             {
                printf ("==> Stop recording until space is available .. \n");
    
                if (gRecStrm_ctrl.recordTelemetryOnly == TRUE)
                    {
                        AP_VPU_RecordTelemetryOnly (FALSE);
                   // Keep this preserved because whenever space is available start record telemetry only
                   gRecStrm_ctrl.recordTelemetryOnly = TRUE;
                    }
                else
                {
                   /* If configured for both channel recording,then stop both channels */
                   if (gRecStrm_ctrl.selectedRec == VPU_REC_BOTH)
                   {
                      AP_VPU_StartStopRecording (RECR_A, FALSE);
                      AP_VPU_StartStopRecording (RECR_B, FALSE);
                   }
                   /* If configured for selected channel recording,then stop selected channel */
                   else
                   {
                      /* If selected channel is B, then stop recording of B */
                      if (gRecStrm_ctrl.activeChn == CHANNEL_B)
                      {
                            AP_VPU_StartStopRecording (RECR_B, FALSE);
                      }
                      /* If selected channel is A, then stop recording of A */
                      else
                      {
                             AP_VPU_StartStopRecording (RECR_A, FALSE);
                      }
                   }
                }
             }
             /* If video_overflow setting is to overwrite oldest recording if storage space is full */
             else
             {
                printf ("==> Overwrite files until space is available .. \n");
    
                /* Stop existing recording */
                if (gRecStrm_ctrl.recordTelemetryOnly == TRUE)
                    {
                        AP_VPU_RecordTelemetryOnly (FALSE);
                    }
                else
                {
                   /* If configured for both channel recording,then stop both channels */
                   if (gRecStrm_ctrl.selectedRec == VPU_REC_BOTH)
                     {
                           AP_VPU_StartStopRecording (RECR_A, FALSE);
                          AP_VPU_StartStopRecording (RECR_B, FALSE);
                     }
                   /* If configured for selected channel recording,then stop selected channel */
                      else
                     {
                      /* If selected channel is B, then stop recording of B */
                          if (gRecStrm_ctrl.activeChn == CHANNEL_B)
                         {
                               AP_VPU_StartStopRecording (RECR_B, FALSE);
                          }
                      /* If selected channel is A, then stop recording of A */
                           else
                          {
                                AP_VPU_StartStopRecording (RECR_A, FALSE);
                          }
                       }
                }  
    
                /* Remove oldest file */
                if (DeleteOldestFile (VPU_SD_MOUNT_PT) != VPU_OK)
                {
                   printf ("Error in getting oldest file name \n");
                }
                else
                {
                   /* Since oldest file is deleted, restart recording immediately */
                   continue;
                }
             }
          }
          /* If enough space is available for storage of MPEG2TS file */
          else
          {
             /* If space is available after the storage space was full */
             if (restartRecIfSpacAvail == TRUE)
             {
                printf ("Space is now available so restarting file writing \n");
    
                    if( TRUE == gRecStrm_ctrl.autoRecFlag )
                    {
                        gRecStrm_ctrl.prevRecCmdCfg.recording = VPU_START_REC;
                    }
    
                if (gRecStrm_ctrl.recordTelemetryOnly == TRUE)
                    {
                        AP_VPU_RecordTelemetryOnly (TRUE);
                    }
                else
                {
                   /* If both channel recording was enabled, then start both the channels */
                   if (gRecStrm_ctrl.selectedRec == VPU_REC_BOTH)
                   {
                      AP_VPU_StartStopRecording (RECR_A, TRUE);
                      AP_VPU_StartStopRecording (RECR_B, TRUE);
                   }
                   /* If selected channel recording was enabled, then start selected channel */
                   else
                   {
                      /* If selected channel is B, then start recording of B */
                      if (gRecStrm_ctrl.activeChn == CHANNEL_B)
                      {
                         AP_VPU_StartStopRecording (RECR_B, TRUE);
                      }
                      /* If selected channel is A, then start recording of A */
                      else
                      {
                         AP_VPU_StartStopRecording (RECR_A, TRUE);
                      }
                   }
                }
    
                restartRecIfSpacAvail = FALSE;
             }
          }
          sleep (2);
       }
    
       printf ("Exiting '%s' thread \n", __FUNCTION__);
       return NULL;
    }
    
    /*******************************************************************************/
    /*! \fn     Void AIS_Main_Demo_CbFxn
     *
     *   \brief Registers call back for encoder
    
     *   \param None 
     *
     *   \return None
     */
    /*******************************************************************************/
    Void AIS_Main_Demo_CbFxn()
    {
        OSA_semSignal(&gRecStrm_ctrl.recStrmSem);
    }
    
    VcapVenc_ChInfo *pChInfo;
    
    void AIS_Monitor_Framerate(int signum)
    {
    
       if(prevFrameCount > frameStreamCount)
       {
          framerate = frameStreamCount+65536 - prevFrameCount;
       }
       else
          framerate = frameStreamCount - prevFrameCount;
       
       framerate /= 10;  
       //printf("FrameRate %d\n",framerate);
       prevFrameCount = frameStreamCount;
    
       signal(SIGALRM, AIS_Monitor_Framerate);
       alarm(10);
       return ;
    }
    
    /*******************************************************************************/
    /*! \fn     Void *AIS_Main_Demo_Main(Void *pPrm)
     *
     *   \brief This is the main thread for application, which gets encoded data from
     *       encoder module and sends the frames to recorder and streamer module
    
     *   \param Void *pPrm: Unused
     *
     *   \return NULL when the thread is exited 
     */
    /*******************************************************************************/
    Void *AIS_Main_Demo_Main(Void *pPrm)
    {
        Int32 status, frameId;
        VCODEC_BITSBUF_LIST_S bitsBuf;
        VCODEC_BITSBUF_S *pBuf = NULL;
        VcapVenc_ChInfo *pChInfo;
        UInt32 latency;
       Int32 retVal;
       Int32 tmpChId;
    
       gRecStrm_ctrl.isRecStrmTdStopDone = FALSE;
    
       while(!gRecStrm_ctrl.exitRecStrmTd)
       {
          /* Wait for encoder event to ensure that encoded data is ready */
          status = OSA_semWait(&gRecStrm_ctrl.recStrmSem, OSA_TIMEOUT_FOREVER);
          if(status!=OSA_SOK)
          {
             break;   
          }
    
          /* Get the encoder buffer */  
          status = Venc_getBitstreamBuffer(&bitsBuf, TIMEOUT_NO_WAIT);
    
          if(status==ERROR_NONE && bitsBuf.numBufs)
          {
             for(frameId=0; frameId<bitsBuf.numBufs; frameId++)
             {
                pBuf = &bitsBuf.bitsBuf[frameId];
                if(pBuf->chnId<VENC_CHN_MAX)
                {
                   pChInfo = &gRecStrm_ctrl.chInfo[pBuf->chnId];
    
                   pChInfo->totalDataSize += pBuf->filledBufSize;
                   pChInfo->numFrames++;
                   if(pBuf->frameType==VCODEC_FRAME_TYPE_I_FRAME)
                   {
                      pChInfo->numKeyFrames++;
                   }
    
                   latency = pBuf->encodeTimestamp - pBuf->timestamp;
    
                   if(latency > pChInfo->maxLatency)
                      pChInfo->maxLatency = latency;
    
                   if(latency < pChInfo->minLatency)
                      pChInfo->minLatency = latency;
    
                   if(pBuf->frameWidth > pChInfo->maxWidth)
                      pChInfo->maxWidth = pBuf->frameWidth;
    
                   if(pBuf->frameWidth < pChInfo->minWidth)
                      pChInfo->minWidth = pBuf->frameWidth;
    
                   if(pBuf->frameHeight > pChInfo->maxHeight)
                      pChInfo->maxHeight = pBuf->frameHeight;
    
                   if(pBuf->frameHeight < pChInfo->minHeight)
                      pChInfo->minHeight = pBuf->frameHeight;
                }
    
                /**************************************************
                   RECORDER
                ***************************************************/
                /* If recording is enabled */
                    if (gRecStrm_ctrl.enableFileWrite == TRUE)
                    {
                   /* Based on current encoded event, 
                      identify for which instance it is meant */
                        if (pBuf->chnId == REC_CHNA)
                        {
                            tmpChId = RECR_A;
                        }
                        else if (pBuf->chnId == REC_CHNB)
                        {
                            tmpChId = RECR_B;
                        }
                        else
                        {
                            tmpChId = -1;
                        }
    
                        if (tmpChId != -1)
                        {
                      /* If recording for the particular channel is selected, 
                         then send data to recorder module */
                            if ((gRecStrm_ctrl.fileWriteEnable [tmpChId] == TRUE) || 
                         (gRecStrm_ctrl.recordTelemetryOnly == TRUE))
                            {
                                retVal = SendToRecorder (&gRecStrm_ctrl, pBuf);
                                if (retVal != VPU_OK)
                                {
                            /* Ignore the error and move ahead */
                                }
                            }
                        }
                    }
    
                /****************************************
                   STREAMER
                *****************************************/
                /* If streaming is enabled */
    
                if ((gRecStrm_ctrl.enableStreaming == TRUE) || (gRecStrm_ctrl.hdChnStrm == TRUE))
                {
                   /* If streaming for the particular channel is selected,
                      then send data to streamer module */
                   if (gRecStrm_ctrl.streamEnable[pBuf->chnId] == TRUE)
                   {
                      if( (pBuf->chnId == CHANNEL_B) || (pBuf->chnId == CHANNEL_A) ) 
                      {
                         frameStreamCount++;
                      }
                      retVal = SendToStreamer (&gRecStrm_ctrl, pBuf);
                      if (retVal != VPU_OK)
                      {
                         /* Ignore the error and move ahead */
                      }
                   }
                }
                
             }
             /* Release the encoder buffer */
             Venc_releaseBitstreamBuffer(&bitsBuf);
          }
       }
       gRecStrm_ctrl.isRecStrmTdStopDone = TRUE;
    
       return NULL;
    }
    
    /*******************************************************************************/
    /*! \fn     Void InitDefaults ()
     *
     *   \brief Updates the global handle with necessary parameters for recorder and 
     *       streamer module. This is done to preserve previous configuration, whenever
     *       new configuration comes, it will be compared with older value, if both
     *       are not same, then only perform operation for recorder/streamer.  
     *
     *   \param None
     *
     *   \return Void
     */
    /*******************************************************************************/
    Void InitDefaults ()
    {
       OSA_mutexLock (&gRecStrm_ctrl.recStrmMtxHnd);
       usleep (1000);
    
       gRecStrm_ctrl.deinterlaceFlag = gRecStrm_ctrl.vpuCfgs.trackerCfg.deinterlace; 
       gRecStrm_ctrl.prevTrackerCfg.deinterlace = gRecStrm_ctrl.vpuCfgs.trackerCfg.deinterlace;
    
       memcpy (&(gRecStrm_ctrl.prevRecStrmCfg), &(gRecStrm_ctrl.vpuCfgs.recStrmCfg),          
             sizeof (VPU_RecordStreamConfig));   
    
       if (gRecStrm_ctrl.prevRecStrmCfg.auto_rec == VPU_REC_AUTO_START)
       {
          gRecStrm_ctrl.prevRecCmdCfg.recording = VPU_START_REC; 
       }
       else
       {
          gRecStrm_ctrl.prevRecCmdCfg.recording = VPU_UNKNOWN_START_STOP_REC; 
       }
    
       gRecStrm_ctrl.prevRecCmdCfg.recTelemetry_only = gRecStrm_ctrl.vpuCfgs.recCmdCfg.recTelemetry_only;
    
       usleep (1000);
       memcpy (&(gRecStrm_ctrl.prevSwcCam), &(gRecStrm_ctrl.vpuCfgs.switchCameraCmdCfg), 
             sizeof (VPU_SwitchCamera));   
    
        gRecStrm_ctrl.autoRecFlag = gRecStrm_ctrl.prevRecStrmCfg.auto_rec;
        gRecStrm_ctrl.selectedRec = gRecStrm_ctrl.prevRecStrmCfg.smlt_rec;
    
        gRecStrm_ctrl.autoStrmFlag = (gRecStrm_ctrl.prevRecStrmCfg.auto_stream 
                               == VPU_STRM_AUTO_START)? TRUE: FALSE;
    
       gRecStrm_ctrl.overwriteFlag = (gRecStrm_ctrl.prevRecStrmCfg.vid_ovrflw 
                               == VPU_OVER_WRITE)? TRUE: FALSE;
    
       gRecStrm_ctrl.recordTelemetryOnly = (gRecStrm_ctrl.prevRecCmdCfg.recTelemetry_only 
                               == VPU_REC_TEL_ONLY)? TRUE: FALSE;
    
       gRecStrm_ctrl.activeChn = (gRecStrm_ctrl.prevSwcCam.camera 
                               == VPU_CAMERA_A)? CHANNEL_A: CHANNEL_B;
    
       
       gRecStrm_ctrl.prevStrmCmdCfg.streaming = VPU_UNKNOWN_START_STOP_STREAM;
       gRecStrm_ctrl.prevStrmCmdCfg.hd_streaming = VPU_UNKNOWN_START_STOP_STREAM;
    
       gRecStrm_ctrl.vpuCfgs.strmCmdCfg.streaming = gRecStrm_ctrl.prevStrmCmdCfg.streaming;
       gRecStrm_ctrl.vpuCfgs.strmCmdCfg.hd_streaming = gRecStrm_ctrl.prevStrmCmdCfg.hd_streaming;
    
       gRecStrm_ctrl.vpuCfgs.recCmdCfg.recording = gRecStrm_ctrl.prevRecCmdCfg.recording;
       gRecStrm_ctrl.vpuCfgs.recCmdCfg.recTelemetry_only = gRecStrm_ctrl.prevRecCmdCfg.recTelemetry_only;
     
       usleep (1000);
       OSA_mutexUnlock (&gRecStrm_ctrl.recStrmMtxHnd);
    }
    
    /*******************************************************************************/
    /*! \fn     Int32 Strm_Rec_Create()
     *
     *   \brief Creates recoder(s) and streamer(s) based on default/latest configuration
     *       saved by VPU
     *
     *   \param None
     *
     *   \return RET_ERROR, if error. VPU_OK, if successful
     */
    /*******************************************************************************/
    Int32 Strm_Rec_Create()
    {
       VENC_CALLBACK_S callback;
       Int32 status, errorCode, i;
    
       /* Get IP address of eth0 interface, this is needed for creating RTP/RTSP session */
       if (GetIP ("eth0", (char*)gRecStrm_ctrl.serverIpAddr) < 0)
       {
          printf ("Error in getting IP address of Server \n");
          return RET_ERROR;
       }
    
       OSA_mutexLock (&gRecStrm_ctrl.recStrmMtxHnd);
    
    
       /* If Selected channel is B, then start streaming for camera B */
       if (gRecStrm_ctrl.activeChn == CHANNEL_B)
       {
            gRecStrm_ctrl.chnToStrm = STRM_CHNB;
       }
       /* If Selected channel is A, then start streaming for camera A */
        else if (gRecStrm_ctrl.activeChn == CHANNEL_A)
       {
            gRecStrm_ctrl.chnToStrm = STRM_CHNA;
       }
       else
       {
          printf ("Error : Invalid value of Active channel is received \n");
       }
       
       printf ("====> gRecStrm_ctrl.chnToStrm: %d \n", gRecStrm_ctrl.chnToStrm);
    
       /* If Auto recording is configured */  
       if (gRecStrm_ctrl.autoRecFlag == VPU_REC_AUTO_START)
       {
          /* Get the channel for recording based on selected primary channel */
          if (gRecStrm_ctrl.selectedRec == VPU_REC_SELECTED)
          {
             gRecStrm_ctrl.chnToRec = (gRecStrm_ctrl.activeChn == CHANNEL_B)?
                                              REC_CHNB:REC_CHNA;   
             printf ("Starting with 'Selected Automatic Recording'\n"); 
          }
          /* Both channel recording is enabled */
          else 
          {
             gRecStrm_ctrl.chnToRec = BOTH_RECR;
             gRecStrm_ctrl.fileWriteEnable[RECR_A] = TRUE;
             gRecStrm_ctrl.fileWriteEnable[RECR_B] = TRUE;
             printf ("Starting with 'Simultaneously Automatic Recording' \n"); 
          }
          gRecStrm_ctrl.enableFileWrite = TRUE;
       }
       /* Recording should be started based on user start/stop command */
       else 
       {
          gRecStrm_ctrl.enableFileWrite = FALSE; 
          printf ("Starting with 'No Recording' and recording will be started on user command \n"); 
          /* Check and update channels to be recorded based on selected channel or both channel configuration */
          /* Actual recording will be applicable on this configuration, whenever start command is sent by AP */
          if (gRecStrm_ctrl.selectedRec == VPU_REC_SELECTED)
          {
             gRecStrm_ctrl.chnToRec = (gRecStrm_ctrl.activeChn == CHANNEL_B)?REC_CHNB:REC_CHNA;
          }
          else 
          {
             gRecStrm_ctrl.chnToRec = BOTH_RECR;
          }  
       }
    
    
    
       /* If recording is not started yet, then start recording now */   
       if (gRecStrm_ctrl.recInitDone == FALSE)
       {
          OSA_mutexUnlock (&gRecStrm_ctrl.recStrmMtxHnd);
          /* Start recorder module, which will start MPEG2TS container */
          status = RecInit ();
          OSA_assert(status==OSA_SOK);
    
          OSA_mutexLock (&gRecStrm_ctrl.recStrmMtxHnd);
    
          gRecStrm_ctrl.recInitDone = TRUE;
       }
    
       printf (" ========> gRecStrm_ctrl.chnToRec : %d \n", gRecStrm_ctrl.chnToRec);
    
       /* Create RTP/RTSP session strings for UAV for cameraA and cameraB, UAVH (optional HD streaming) */
    
       // Name of CameraA and CameraB is kept same in order to switch them on same URL
       sprintf ((char*)((gRecStrm_ctrl.sessionName)[UAV_SESSION]), "%s", SESSION_NAME);
    
       /* HD streaming of camera B*/
       sprintf ((char*)((gRecStrm_ctrl.sessionName)[UAV_HD_SESSION]), "%s%c", SESSION_NAME, 'H');
    
       gRecStrm_ctrl.serverPort = DFT_RTSP_SRV_PORT;
       gRecStrm_ctrl.hdChnStrm = FALSE;
       gRecStrm_ctrl.exitRecStrmTd = FALSE;
       gRecStrm_ctrl.isRecStrmTdStopDone = TRUE; 
    
       printf ("Creating server with IP:%s, Port: %d \n", gRecStrm_ctrl.serverIpAddr, gRecStrm_ctrl.serverPort);
    
       /* Call API to start the RTSP Server */
       gRecStrm_ctrl.serverHandle = RTSPServerStart (gRecStrm_ctrl.serverIpAddr, gRecStrm_ctrl.serverPort, 0, &errorCode);
       if (gRecStrm_ctrl.serverHandle == NULL)
       {
          printf ("ERROR: Can not start the RTSP server \n");
          OSA_mutexUnlock (&gRecStrm_ctrl.recStrmMtxHnd);
          return RET_ERROR;
       }
       errorCode = 0;
       printf ("Server Started with handle '%x' \n", (Int32) gRecStrm_ctrl.serverHandle);
    
       /* If auto streaming is started, then enable streaming */
       if (gRecStrm_ctrl.autoStrmFlag == TRUE)
       {
          gRecStrm_ctrl.enableStreaming = TRUE;
       }
       /* don't start streaming, as automatic streaming is not enabled */
       else
       {
          gRecStrm_ctrl.enableStreaming = FALSE;
       }
    
       /* If SD streaming needs to be started, then create RTSP session */
       if (gRecStrm_ctrl.enableStreaming == TRUE)
       {
          i = UAV_SESSION;
          printf ("Creating session with session name : %s \n", ((char*)(gRecStrm_ctrl.sessionName[i])));
    
          /* Call API to create RTSP session in Server */
          gRecStrm_ctrl.sessionHandle[i] = RTSPSessionCreate (gRecStrm_ctrl.serverHandle,
             ((BYTE *)(gRecStrm_ctrl.sessionName[i])), (BYTE *)NULL, (SERVER_CALLBACK_FUNC)NULL,
             0, 0, USER_NAME, PASSWORD, 0, &errorCode);
          if (gRecStrm_ctrl.sessionHandle[i] == NULL)
          {
             printf ("ERROR: Can not create the RTSP session \n");
             OSA_mutexUnlock (&gRecStrm_ctrl.recStrmMtxHnd);
             return RET_ERROR;
          }
          errorCode = 0;
    
          printf ("Session '%s' created with handle '%x'\n", 
             ((char*)(gRecStrm_ctrl.sessionName)[i]), (Int32)gRecStrm_ctrl.sessionHandle[i]);
    
          gRecStrm_ctrl.streamAdded[i] = FALSE;
          gRecStrm_ctrl.strmStatus[i] = STRM_INIT;
          gRecStrm_ctrl.streamEnable[gRecStrm_ctrl.chnToStrm] = TRUE;
       }
    
       callback.newDataAvailableCb = AIS_Main_Demo_CbFxn;
    
       /* Register callback with encoder */
       Venc_registerCallback(&callback,
                   (Ptr)&gRecStrm_ctrl);
    
       status = OSA_semCreate(&gRecStrm_ctrl.recStrmSem, 1, 0);
       OSA_assert(status==OSA_SOK);
    
       /* Create thread for handling encoder events and sending data 
          to recorder and streamer module */
       status = OSA_thrCreate(
                   &gRecStrm_ctrl.recStrmTdHnd,
                   AIS_Main_Demo_Main,
                   OSA_THR_PRI_DEFAULT,
                   0,
                   &gRecStrm_ctrl
                   );
    
       OSA_assert(status==OSA_SOK);
    
       /* Create thread for monitoring storage space availability */
       status = OSA_thrCreate(
                   &gRecStrm_ctrl.monitorThread,
                   AIS_Monitor_Thread,
                   OSA_THR_PRI_DEFAULT,
                   0,
                   &gRecStrm_ctrl
                   );
    
       OSA_assert(status==OSA_SOK);
    
       OSA_mutexUnlock (&gRecStrm_ctrl.recStrmMtxHnd);
    
       return VPU_OK;
    }
    
    /*******************************************************************************/
    /*! \fn     Int32 Strm_Rec_Delete()
     *
      *   \brief Stops and deletes recorder and streamer instances
    
     *   \param None
     *
     *   \return VPU_OK on success
     */
    /*******************************************************************************/
    Int32 Strm_Rec_Delete()
    {
       Int32 i;
    
        gRecStrm_ctrl.exitRecStrmTd = TRUE;
        OSA_semSignal(&gRecStrm_ctrl.recStrmSem);
    
       /* Wait till the recorder/streamer thread is stopped completely */
       while(!gRecStrm_ctrl.isRecStrmTdStopDone)
        {
          printf ("Waiting for AIS_Main_Demo_Main thread to get stopped \n");
            OSA_waitMsecs(10);
        }
    
       /* If recorder module is initialized, then stop it */
       if (gRecStrm_ctrl.recInitDone == TRUE)
       {
          for (i = 0; i < MAX_REC_CH; i++)
          {
             /* If recorder instances are in running state */
             if (gRecStrm_ctrl.recStatus[i] != REC_STOPPED)
             {
                gRecStrm_ctrl.spsFound[i] = FALSE;
                gRecStrm_ctrl.fileWriteEnable[i] = FALSE;
                RecStop(&gRecStrm_ctrl.ctrlHandle[i]);
             }
          }
    
          /* Uninitialize recorder */
          RecDeInit();
          gRecStrm_ctrl.recInitDone = FALSE;
       }
    
       /* Uninitialize all the variables and put them in default state */
       for (i = 0; i< NO_OF_SESSIONS; i++)
       {
          gRecStrm_ctrl.strmStatus[i] = STRM_STOPPED;
    
          if (gRecStrm_ctrl.sessionHandle[i] != NULL)
          {
             RTSPSessionDestroy(gRecStrm_ctrl.sessionHandle[i]);
             gRecStrm_ctrl.sessionHandle[i] = NULL;
          }
       }
    
       /* Stop RTSP server */
       if (gRecStrm_ctrl.serverHandle != NULL)
       {
          RTSPServerShutdown(gRecStrm_ctrl.serverHandle); 
          gRecStrm_ctrl.serverHandle = NULL;
       }
    
        OSA_thrDelete(&gRecStrm_ctrl.recStrmTdHnd);
    
        OSA_semDelete(&gRecStrm_ctrl.recStrmSem);
    
        return VPU_OK;
    }
    
    /*******************************************************************************/
    /*! \fn     Void AIS_Main_Demo_start
     *
      *   \brief This is the entry function of the AIS demo. It loads the last good/default
      *       configuration from the file system, loads the necessary configuration into the
      *       global instance of the demo, initializes all the modules accordingly and passes
      *       control to other functions to perform VPU operations based on AP commands.
     *
     *   \param
     *   \param
     *
     *   \return  
     */
    /*******************************************************************************/
    Void AIS_Main_Demo_start ()
    {
       VSYS_PARAMS_S vsysParams;
       VDIS_PARAMS_S vdisParams;
       //UInt16  osdFormat[ALG_LINK_OSD_MAX_CH];
       //Int32 i;
    
       if(0 != access(VPU_SD_MOUNT_PT,F_OK|W_OK))
        {
          printf (VPU_SD_MOUNT_PT " Path does not exist \n");
          system ("mkdir -m 0755 "VPU_SD_MOUNT_PT);
        }
       
       if (0 != access (VPU_EXT_MOUNT_PT,F_OK|W_OK))
       {
       printf (VPU_EXT_MOUNT_PT " Path does not exist \n");
          system ("mkdir -m 0755 "VPU_EXT_MOUNT_PT);
       }
    
    
       /* Initialize all the RDK modules */
        Vsys_params_init(&vsysParams);
        Vdis_params_init(&vdisParams);
    
       /* initialize the parameters needed for our customized usecase */ 
       gDemo_info.maxVdisChannels = 1;
       gDemo_info.VsysNumChs      = 1;
    
       vdisParams.numChannels = 1; 
    
       ReadYUVInit();
    
       vsysParams.systemUseCase        = VSYS_USECASE_AIS_MAIN_DEMO;
       vsysParams.enableCapture        = FALSE;
       vsysParams.enableNsf            = FALSE;
       vsysParams.enableEncode         = FALSE;
       vsysParams.enableDecode         = FALSE;
       vsysParams.enableNullSrc        = FALSE;
       vsysParams.numDeis              = 0;
       vsysParams.deinterlaceFlag      = 0;
    
       vsysParams.numSwMs              = 0;
       vsysParams.numDisplays          = 1;
       vsysParams.enableSecondaryOut   = FALSE;
       vsysParams.enableMjpegEnc       = FALSE;
       vsysParams.enableSclr           = FALSE;
       vsysParams.enableOsd            = TRUE;
       vsysParams.enableScd            = FALSE;
       
       vsysParams.enableRunBuff        = FALSE;   
       /* TRUE: enable saving 100 frames from the DSP in shared memory
          FALSE: disable saving 100 frames from the DSP in shared memory */
    
       printf ("--------------- CHANNEL DETAILS-------------\n");
       printf ("Disp Channels => %d\n", vdisParams.numChannels);
       printf ("-------------------------------------------\n");    
    
       Vsys_init(&vsysParams);
       Vdis_init(&vdisParams);
    
       /* Configure display */
       Vsys_configureDisplay();
    
       /* Create Link instances and connect component blocks */
       Vsys_create();
    
       /* Register event handler */
       Vsys_registerEventHandler(Demo_eventHandler, NULL);
    
    
       if(vsysParams.enableRunBuff == TRUE )
       {
          if ( ERROR_NONE != Vsys_runbuf_init(&gRunBufInfo) )
          {
             OSA_printf(" %s:%s: ERROR : init running buffer failed\n",__FILE__,__func__);
          }
          else
          {
             OSA_printf("Enabled: Running buffer feature.\n");
          }
       }
       else
       {
          OSA_printf("Not Enabled: Running buffer feature.\n");
       }
    
    
       UpdateCameraName((U8)CHANNEL_A);
       
       
        /* Start components in reverse order */
        Vdis_start();
    
       AP_VPU_Switch_Display_Channel((Int32)DISP_CHNA);  
    
       /* Reset statistics */
       //AIS_Main_Demo_resetStats ();
    
       signal(SIGALRM, AIS_Monitor_Framerate);
       alarm(10);
    }
    
    /*******************************************************************************/
    /*! \fn     Void AIS_Main_Demo_stop ()
     *
     *   \brief Stops all the modules and uninitializes respective modules and variables
    
     *   \param None
     *
     *   \return Void
     */
    /*******************************************************************************/
    Void AIS_Main_Demo_stop ()
    {
        VSYS_PARAMS_S contextInf;
        Vsys_getContext(&contextInf);
    
       ReadYUVdeInit();
    
       /* Stop RDK modules */
        Vcap_stop();
        Venc_stop();
    
        Vdis_stop();
    
       if (contextInf.enableOsd == TRUE)
       {
          Demo_osdDeinit(); 
       }
    
       /* Free Shared memory, allocated for running buffer*/
       if(contextInf.enableRunBuff == TRUE )
       {
          if( ERROR_NONE != Vsys_runbuf_deInit(&gRunBufInfo))
          {
             OSA_printf(" %s:%s: ERROR : failed to free running buffer memory\n" ,__FILE__,__func__);
          }
       }
    
       
       /* Uninitialize and stop streamer and recorder modules */
       Strm_Rec_Delete (); 
       OSA_mutexDelete (&gRecStrm_ctrl.recStrmMtxHnd);
    
        Vsys_delete();
    
        /* De-initialize components */
        Vcap_exit();
        Venc_exit();
        Vsys_exit();
    
       /* Uninitialize RS232 receiver */
        RS232_DeInit();
    }
    
    
    /*******************************************************************************/
    /*! \fn      I32 GetRecordingSinceTime(U32 *recSince)
     *  
      *   \brief   This provides the time elapsed since recording started.
        
      *   \param     *recSince - Time since recording started
     *                                                                              
      *   \return  I32, VPU_OK on success; otherwise an error code is returned (as in vpu_values.h).
     */ 
    /*******************************************************************************/
    I32 GetRecordingSinceTime(U32 *recSince)
    {
       time_t curTime;
       I32 status = VPU_OK;
       
       if(-1 == time(&curTime))
       {
          perror("Failed to get time of day \n");   
          VPU_ERR_BIT_MASK(status, VPU_REC_START_TIME_GET_FAIL);
       }
    
       *recSince = (U32)curTime - (U32)gRecStrm_ctrl.startTime;
    
       return status;
    }
    
    
    /*******************************************************************************/
    /*! \fn      I32 GetRecordingStatus(U8 *recStatus)
     *  
      *   \brief   This provides the current recording status.
        
      *   \param     *recStatus - Recording status
     *                                                                              
      *   \return  I32, VPU_OK on success; otherwise an error code is returned (as in vpu_values.h).
     */ 
    /*******************************************************************************/
    I32 GetRecordingStatus(U8 *recStatus)
    {
       I32 status = VPU_OK;
    
       if(gRecStrm_ctrl.recording_fail)
       {  
          *recStatus = VPU_RECORDING_FAIL;
       }
       else
       {  
          if(TRUE == gRecStrm_ctrl.enableFileWrite)
             *recStatus = VPU_RECORDING_ACTV;
          else
             *recStatus = VPU_RECORDING_NOT_ACTV;
       }
    
       return status;
    }
    
    
    /******************************************************************************
    * Copyright (c) 2013 eInfochips - All rights reserved.
    *
    * This software is authored by eInfochips and is eInfochips' intellectual
    * property, including the copyrights in all countries in the world.
    * This software is provided under a license to use only with all other rights,
    * including ownership rights, being retained by eInfochips.
    *
    * This file may not be distributed, copied, or reproduced in any manner,
    * electronic or otherwise, without the written consent of eInfochips.
    *****************************************************************************/
    
    **********************************************************************
    **************Starting AIS VPU demo application ***************
             Version: 4.0, Date: Feb 21 2014, Time: 14:11:27          
    **********************************************************************
    Start ReadYUVInit()
    Start FramesRdSendFxn()
    End ReadYUVInit()
    --------------- CHANNEL DETAILS-------------
    Disp Channels => 1
    -------------------------------------------
     0: SYSTEM: System Common Init in progress !!!
     0: SYSTEM: IPC init in progress !!!
     21: SYSTEM: CPU [DSP] syslink proc ID is [0] !!!
     21: SYSTEM: CPU [VIDEO-M3] syslink proc ID is [1] !!!
     21: SYSTEM: CPU [VPSS-M3] syslink proc ID is [2] !!!
     21: SYSTEM: CPU [HOST] syslink proc ID is [3] !!!
     21: SYSTEM: Creating MsgQ Heap [IPC_MSGQ_MSG_HEAP_3] ...
     24: SYSTEM: Creating MsgQ [HOST_MSGQ] ...
     25: SYSTEM: Creating MsgQ [HOST_ACK_MSGQ] ...
     27: SYSTEM: Opening MsgQ [DSP_MSGQ] ...
     28: SYSTEM: Opening MsgQ [VIDEO-M3_MSGQ] ...
     28: SYSTEM: Opening MsgQ [VPSS-M3_MSGQ] ...
     29: SYSTEM: Notify register to [DSP] line 0, event 15 ... 
     30: SYSTEM: Notify register to [VIDEO-M3] line 0, event 15 ... 
     30: SYSTEM: Notify register to [VPSS-M3] line 0, event 15 ... 
     31: SYSTEM: IPC init DONE !!!
     32: SYSTEM: Creating ListMP [HOST_IPC_OUT_24] in region 0 ...
     34: SYSTEM: Creating ListMP [HOST_IPC_IN_24] in region 0 ...
     36: SYSTEM: ListElem Shared Addr = 0x41499d00
     37: SYSTEM: Creating ListMP [HOST_IPC_OUT_25] in region 0 ...
     39: SYSTEM: Creating ListMP [HOST_IPC_IN_25] in region 0 ...
     41: SYSTEM: ListElem Shared Addr = 0x414b7800
     42: SYSTEM: Creating ListMP [HOST_IPC_OUT_19] in region 0 ...
     44: SYSTEM: Creating ListMP [HOST_IPC_IN_19] in region 0 ...
     46: SYSTEM: ListElem Shared Addr = 0x414d5300
     47: SYSTEM: Creating ListMP [HOST_IPC_OUT_20] in region 0 ...
     49: SYSTEM: Creating ListMP [HOST_IPC_IN_20] in region 0 ...
     51: SYSTEM: ListElem Shared Addr = 0x414f8b80
     52: SYSTEM: Creating ListMP [HOST_IPC_OUT_21] in region 0 ...
     54: SYSTEM: Creating ListMP [HOST_IPC_IN_21] in region 0 ...
     56: SYSTEM: ListElem Shared Addr = 0x4151c400
     78: SYSTEM: System Common Init Done !!!
     144: MCFW  : CPU Revision [ES2.1] !!! 
     144: MCFW  : Detected [UNKNOWN] Board !!! 
     144: MCFW  : Base Board Revision [REV A] !!! 
    
     [host]  145: IPC_FRAMES_OUT   : Create in progress !!!
    
     [host]  147: IPC_FRAMES_OUT   : Create Done !!!
     [c6xdsp ]  97246: IPC_FRAMES_IN   : Create in progress !!!
     [m3vpss ]  99887: IPC_FRAMES_OUT   : Create in progress !!!
     [c6xdsp ]  97246: SYSTEM: Opening ListMP [DSP_IPC_OUT_22] ...
     [m3vpss ]  99889: SYSTEM: Opening MsgQ [HOST_MSGQ] ...
     [m3vpss ]  99892: IPC_FRAMES_OUT   : Create Done !!!
     [c6xdsp ]  98246: SYSTEM: Opening ListMP [DSP_IPC_OUT_22] ...
     [c6xdsp ]  99246: SYSTEM: Opening ListMP [DSP_IPC_OUT_22] ...
     [c6xdsp ]  100246: SYSTEM: Opening ListMP [DSP_IPC_OUT_22] ...
     [c6xdsp ]  101246: SYSTEM: Opening ListMP [DSP_IPC_OUT_22] ...
     [c6xdsp ]  102246: SYSTEM: Opening ListMP [DSP_IPC_OUT_22] ...
     [c6xdsp ]  103246: SYSTEM: Opening ListMP [DSP_IPC_OUT_22] ...
     [c6xdsp ]  104246: SYSTEM: Opening ListMP [DSP_IPC_OUT_22] ...
     [c6xdsp ]  105246: SYSTEM: Opening ListMP [DSP_IPC_OUT_22] ...
     [c6xdsp ]  106246: SYSTEM: Opening ListMP [DSP_IPC_OUT_22] ...
     [c6xdsp ]  107246: Assertion @ Line: 107 in links_common/system/system_ipc_listMP.c: 0 : failed !!!
    
    
    

  • Your data flow is wrong. The link connection should be ipcFramesOutHost->ipcFramesInDSP->algLink.

    You can't connect ipcFramesOutHost to ipcFramesOutVpss; see the wiring sketch below.
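
    For reference, a minimal sketch of that chain, built from the create-parameter fields and link IDs that appear in the usecase code attached later in this thread (the variable names here are shortened stand-ins for the gVdisModuleContext/gVcapModuleContext fields used in that code; this is an illustration of the wiring rule, not the shipped usecase):

    /* ipcFramesOutHost -> ipcFramesInDsp -> algLink */
    ipcFramesOutHostId = SYSTEM_HOST_LINK_ID_IPC_FRAMES_OUT_0;
    ipcFramesInDspId   = SYSTEM_DSP_LINK_ID_IPC_FRAMES_IN_0;
    dspAlgId           = SYSTEM_LINK_ID_ALG_0;

    /* The host frames-out link feeds the DSP frames-in link */
    ipcFramesOutHostPrm.baseCreateParams.numOutQue                 = 1;
    ipcFramesOutHostPrm.baseCreateParams.outQueParams[0].nextLink  = ipcFramesInDspId;

    /* The DSP frames-in link consumes from the host link and feeds the alg link */
    ipcFramesInDspPrm.baseCreateParams.inQueParams.prevLinkId      = ipcFramesOutHostId;
    ipcFramesInDspPrm.baseCreateParams.inQueParams.prevLinkQueId   = 0;
    ipcFramesInDspPrm.baseCreateParams.numOutQue                   = 1;
    ipcFramesInDspPrm.baseCreateParams.outQueParams[0].nextLink    = dspAlgId;

    /* The alg link consumes from the DSP frames-in link */
    dspAlgPrm.inQueParams.prevLinkId    = ipcFramesInDspId;
    dspAlgPrm.inQueParams.prevLinkQueId = 0;

    The rule of thumb: each link's inQueParams.prevLinkId must name the upstream link whose outQueParams[0].nextLink points back at it, and System_linkCreate() is then called in data-flow order (frames-out, frames-in, alg), as the attached usecase code does.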

  • Redirected the link as suggested; I am still getting the same error.

    Attached is the updated code.

    #include "multich_common.h"
    
     
    static SystemVideo_Ivahd2ChMap_Tbl systemVid_encDecIvaChMapTbl =
    {
        .isPopulated = 1,
        .ivaMap[0] =
        {
            .EncNumCh  = 2,
            .EncChList = {0, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 , 0, 0},
            .DecNumCh  = 0,
            .DecChList = {0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0},
        },
        .ivaMap[1] =
        {
            .EncNumCh  = 2,
            .EncChList = {1, 3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 , 0, 0},
            .DecNumCh  = 0,
            .DecChList = {0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0},
        },
        .ivaMap[2] =
        {
            .EncNumCh  = 8,
            .EncChList = {4, 5, 6, 7, 8, 9, 10, 11, 0, 0, 0, 0, 0, 0 , 0, 0},
            .DecNumCh  = 0,
            .DecChList = {0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0},
        },
    };
    
    
    #define MAX_DEC_OUT_FRAMES_PER_CH                 (5)
    #define FILE_SD_IPCFRAMEEXPORT_FRAME_WIDTH        (320)
    #define FILE_SD_IPCFRAMEEXPORT_FRAME_HEIGHT       (240)
    
    
    static Void SetIpcFramesOutInQueInfo(System_LinkQueInfo *inQueInfo)
    {
        Int i;
    
        inQueInfo->numCh = gVencModuleContext.vencConfig.numPrimaryChn;
        for (i = 0; i < inQueInfo->numCh; i++)
        {
            inQueInfo->chInfo[i].bufType    = SYSTEM_BUF_TYPE_VIDFRAME;
            inQueInfo->chInfo[i].dataFormat = SYSTEM_DF_YUV420SP_UV;
            inQueInfo->chInfo[i].memType    = SYSTEM_MT_NONTILEDMEM;
            inQueInfo->chInfo[i].scanFormat = SYSTEM_SF_PROGRESSIVE;
            inQueInfo->chInfo[i].startX     = 0;
            inQueInfo->chInfo[i].startY     = 0;
            inQueInfo->chInfo[i].width      = FILE_SD_IPCFRAMEEXPORT_FRAME_WIDTH;
            inQueInfo->chInfo[i].height     = FILE_SD_IPCFRAMEEXPORT_FRAME_HEIGHT;
            inQueInfo->chInfo[i].pitch[0]   = FILE_SD_IPCFRAMEEXPORT_FRAME_WIDTH;
            inQueInfo->chInfo[i].pitch[1]   = FILE_SD_IPCFRAMEEXPORT_FRAME_WIDTH;        
            inQueInfo->chInfo[i].pitch[2]   = 0;
        }
    }
    
    Void VPU_Create_AIS_Main_Demo()
    {
       IpcFramesOutLinkHLOS_CreateParams   ipcFramesOutHostPrm;
       //IpcFramesOutLinkRTOS_CreateParams   ipcFramesOutVpssPrm[2];
       IpcFramesInLinkRTOS_CreateParams    ipcFramesInDspPrm[2];
       AlgLink_CreateParams                dspAlgPrm[2];
       IpcLink_CreateParams                ipcOutVpssPrm;
       //DisplayLink_CreateParams            displayPrm[2];
       
       //Bool   enableOsdAlgLink           = gVsysModuleContext.vsysConfig.enableOsd;
       //Bool   enableTIVA_AlgLink         = gVsysModuleContext.vsysConfig.enableStab;
       //Bool   enableScdAlgLink           = gVsysModuleContext.vsysConfig.enableScd;
       UInt32 noOfDSPAlgo = 1;
       UInt32 i;
    
       MULTICH_INIT_STRUCT(IpcFramesOutLinkHLOS_CreateParams, ipcFramesOutHostPrm);
    
       for (i = 0; i < noOfDSPAlgo;i++)
       {   
          //MULTICH_INIT_STRUCT(IpcFramesOutLinkRTOS_CreateParams,ipcFramesOutVpssPrm[i]);
          MULTICH_INIT_STRUCT(IpcFramesInLinkRTOS_CreateParams,ipcFramesInDspPrm[i]);
          MULTICH_INIT_STRUCT(AlgLink_CreateParams, dspAlgPrm[i]);
       }
    
       MULTICH_INIT_STRUCT(IpcLink_CreateParams           ,ipcOutVpssPrm);
    /*
       for (i = 0; i < VDIS_DEV_MAX; i++)
       {
          MULTICH_INIT_STRUCT(DisplayLink_CreateParams,displayPrm[i]);
       }
    */    
       MultiCh_detectBoard();
    
       System_linkControl(SYSTEM_LINK_ID_M3VPSS,SYSTEM_M3VPSS_CMD_RESET_VIDEO_DEVICES,NULL,0,TRUE);
    
       System_linkControl(SYSTEM_LINK_ID_M3VIDEO,SYSTEM_COMMON_CMD_SET_CH2IVAHD_MAP_TBL,&systemVid_encDecIvaChMapTbl,
             sizeof(SystemVideo_Ivahd2ChMap_Tbl),TRUE);
       
       gVdisModuleContext.ipcFramesOutHostId     = SYSTEM_HOST_LINK_ID_IPC_FRAMES_OUT_0;
       //gVcapModuleContext.ipcFramesOutVpssId[0]  = SYSTEM_VPSS_LINK_ID_IPC_FRAMES_OUT_0;   
       gVcapModuleContext.ipcFramesInDspId[0]    = SYSTEM_DSP_LINK_ID_IPC_FRAMES_IN_0;
       gVcapModuleContext.dspAlgId[0]            = SYSTEM_LINK_ID_ALG_0  ;
       //gVdisModuleContext.displayId[2]           = SYSTEM_LINK_ID_DISPLAY_2; 
       
       // 1. Configure the IpcFramesOutLinkHLOS create parameters
       ipcFramesOutHostPrm.baseCreateParams.noNotifyMode              = FALSE;
       ipcFramesOutHostPrm.baseCreateParams.notifyNextLink            = TRUE;
       ipcFramesOutHostPrm.baseCreateParams.notifyPrevLink            = FALSE;
       ipcFramesOutHostPrm.baseCreateParams.numOutQue                 = 1;
       ipcFramesOutHostPrm.baseCreateParams.outQueParams[0].nextLink  = gVcapModuleContext.ipcFramesInDspId[0];
       SetIpcFramesOutInQueInfo(&ipcFramesOutHostPrm.inQueInfo);
    
    /*   ipcFramesOutVpssPrm[0].baseCreateParams.inQueParams.prevLinkId       = gVdisModuleContext.ipcFramesOutHostId;
       ipcFramesOutVpssPrm[0].baseCreateParams.inQueParams.prevLinkQueId    = 0;
       ipcFramesOutVpssPrm[0].baseCreateParams.notifyPrevLink               = TRUE;
       
       ipcFramesOutVpssPrm[0].baseCreateParams.numOutQue                 = 1;
       ipcFramesOutVpssPrm[0].baseCreateParams.outQueParams[0].nextLink  = gVdisModuleContext.displayId[2];
       ipcFramesOutVpssPrm[0].baseCreateParams.notifyNextLink            = TRUE;
       
       ipcFramesOutVpssPrm[0].baseCreateParams.processLink               = gVcapModuleContext.ipcFramesInDspId[0];
       ipcFramesOutVpssPrm[0].baseCreateParams.notifyProcessLink         = TRUE;
       ipcFramesOutVpssPrm[0].baseCreateParams.noNotifyMode              = FALSE;
    */
    
       ipcFramesInDspPrm[0].baseCreateParams.inQueParams.prevLinkId      = gVcapModuleContext.ipcFramesInDspId[0];
       ipcFramesInDspPrm[0].baseCreateParams.inQueParams.prevLinkQueId   = 0;
       ipcFramesInDspPrm[0].baseCreateParams.numOutQue                   = 1;
       ipcFramesInDspPrm[0].baseCreateParams.outQueParams[0].nextLink    = gVcapModuleContext.dspAlgId[0];
       ipcFramesInDspPrm[0].baseCreateParams.notifyPrevLink              = TRUE;
       ipcFramesInDspPrm[0].baseCreateParams.notifyNextLink              = TRUE;
       ipcFramesInDspPrm[0].baseCreateParams.noNotifyMode                = FALSE;
       
       dspAlgPrm[0].inQueParams.prevLinkId       = gVcapModuleContext.ipcFramesInDspId[0];
       dspAlgPrm[0].inQueParams.prevLinkQueId    = 0;
       dspAlgPrm[0].enableRunBuff                = FALSE;
       dspAlgPrm[0].standard                     = SYSTEM_STD_D1;
    
       
       dspAlgPrm[0].tivaCreateParams.maxWidth          = ALG_LINK_TIVA_MAX_FRAME_WIDTH;
       dspAlgPrm[0].tivaCreateParams.maxHeight         = ALG_LINK_TIVA_MAX_FRAME_HEIGHT;
       dspAlgPrm[0].tivaCreateParams.maxStride         = ALG_LINK_TIVA_MAX_FRAME_WIDTH;
       dspAlgPrm[0].tivaCreateParams.useStab           = 0;
       dspAlgPrm[0].tivaCreateParams.stabProcessLevel  = ALG_LINK_TIVA_PROCESS_LEVEL;
       dspAlgPrm[0].tivaCreateParams.motionModelType   = 4;//ALG_LINK_MMODEL_AFFINE;
       dspAlgPrm[0].tivaCreateParams.vs_ver_crop_size  = 0;
       dspAlgPrm[0].tivaCreateParams.vs_hor_crop_size  = 0;
       
       dspAlgPrm[0].tivaCreateParams.useTraker = TRUE;
       dspAlgPrm[0].tivaCreateParams.tRoiX     = 200;
       dspAlgPrm[0].tivaCreateParams.tRoiY     = 200;
       dspAlgPrm[0].tivaCreateParams.tRoiW     = 20;
       dspAlgPrm[0].tivaCreateParams.tRoiH     = 20;
    /*
       displayPrm[2].numInputQueues                 = 1;
       displayPrm[2].activeQueue                    = 0;
       displayPrm[2].inQueParams[0].prevLinkId      = gVcapModuleContext.ipcFramesOutVpssId[0];
       displayPrm[2].inQueParams[0].prevLinkQueId   = 0;
       displayPrm[2].displayRes                     = VSYS_STD_NTSC;
    */   
    
       //if(enableOsdAlgLink || enableScdAlgLink || enableTIVA_AlgLink)
       {
       //printf("enableOsdAlgLink : %d\n", gVsysModuleContext.vsysConfig.enableOsd);
          for(i=0;i<noOfDSPAlgo;i++)
          {
             System_linkCreate(gVdisModuleContext.ipcFramesOutHostId, &ipcFramesOutHostPrm, sizeof(ipcFramesOutHostPrm));
             //System_linkCreate(gVcapModuleContext.ipcFramesOutVpssId[i], &ipcFramesOutVpssPrm[i], sizeof(ipcFramesOutVpssPrm[i]));
             System_linkCreate(gVcapModuleContext.ipcFramesInDspId[i], &ipcFramesInDspPrm[i], sizeof(ipcFramesInDspPrm[i]));
             System_linkCreate(gVcapModuleContext.dspAlgId[i] , &dspAlgPrm[i], sizeof(dspAlgPrm[i]));
          }
       }
       
       //System_linkCreate(gVdisModuleContext.displayId[2], &displayPrm[2], sizeof(displayPrm[2]));
    
       MultiCh_memPrintHeapStatus();    
    }
    
    Void VPU_Delete_AIS_Main_Demo()
    {
       Bool   enableOsdAlgLink = gVsysModuleContext.vsysConfig.enableOsd;
       Bool   enableTIVA_AlgLink = gVsysModuleContext.vsysConfig.enableStab;
       Bool   enableScdAlgLink = gVsysModuleContext.vsysConfig.enableScd;
       UInt32 noOfDSPAlgo = 1;
       Int32 i;
    
       gVdisModuleContext.displayId[2] = SYSTEM_LINK_ID_DISPLAY_2; // OFF CHIP HDMI
    
       if(gVsysModuleContext.vsysConfig.enableOsd)
       {
          System_linkDelete(gVcapModuleContext.dspAlgId[0]);
       }
    
       System_linkDelete(gVdisModuleContext.ipcFramesOutHostId);
    
       if(enableOsdAlgLink || enableScdAlgLink || enableTIVA_AlgLink)
       {
          for(i=0;i<noOfDSPAlgo;i++)
          {
             System_linkDelete(gVcapModuleContext.ipcFramesOutVpssId[i] );
             System_linkDelete(gVcapModuleContext.ipcFramesInDspId[i]);
          }
       }
    
       System_linkDelete(gVdisModuleContext.displayId[2]);
    
       /* Print the HWI, SWI and all tasks load */
       /* Reset the accumulated timer ticks */
       MultiCh_prfLoadCalcEnable(FALSE, TRUE, FALSE);
    }
    
    
    
    
    

  • This code is wrong:

      ipcFramesInDspPrm[0].baseCreateParams.inQueParams.prevLinkId      = gVcapModuleContext.ipcFramesInDspId[0];
    
    It should be
    
      ipcFramesInDspPrm[0].baseCreateParams.inQueParams.prevLinkId      = gVdisModuleContext.ipcFramesOutHostId;

  • Hi Badri,

    I modified the code; now I am not getting any runtime errors as before. The code just hangs at

    OSA_semWait(&gReadYUVConfig.thrStartSem,OSA_TIMEOUT_FOREVER);

    in void *FramesRdSendFxn(Void * prm)

    PFA the log file and source code.
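
    For reference on the hang point: the wait in question is the start gate created in ReadYUVInit() in the file-read code attached below, via OSA_semCreate(&gReadYUVConfig.thrStartSem, 10, 0), i.e. an initial count of 0 assuming the usual OSA_semCreate(handle, maxCount, initVal) signature. No OSA_semSignal on thrStartSem appears anywhere in the code posted in this thread, so FramesRdSendFxn() stays blocked on that OSA_semWait until something releases it. Below is a minimal sketch of the start-gate pattern, assuming the intent is that the application signals the semaphore once the usecase links have been created (the signal call and its placement are assumptions for illustration, not part of the attached code):

    /* In the reader thread, exactly as in the attached FramesRdSendFxn():
       block until the application says the links are ready. */
    OSA_semWait(&gReadYUVConfig.thrStartSem, OSA_TIMEOUT_FOREVER);

    /* Hypothetical call site: somewhere after VPU_Create_AIS_Main_Demo() has
       created the ipcFramesOutHost/ipcFramesInDsp/alg links, release the gate. */
    OSA_semSignal(&gReadYUVConfig.thrStartSem);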

    #include "multich_common.h"
    
     
    static SystemVideo_Ivahd2ChMap_Tbl systemVid_encDecIvaChMapTbl =
    {
        .isPopulated = 1,
        .ivaMap[0] =
        {
            .EncNumCh  = 2,
            .EncChList = {0, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 , 0, 0},
            .DecNumCh  = 0,
            .DecChList = {0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0},
        },
        .ivaMap[1] =
        {
            .EncNumCh  = 2,
            .EncChList = {1, 3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 , 0, 0},
            .DecNumCh  = 0,
            .DecChList = {0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0},
        },
        .ivaMap[2] =
        {
            .EncNumCh  = 8,
            .EncChList = {4, 5, 6, 7, 8, 9, 10, 11, 0, 0, 0, 0, 0, 0 , 0, 0},
            .DecNumCh  = 0,
            .DecChList = {0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0},
        },
    };
    
    
    #define MAX_DEC_OUT_FRAMES_PER_CH                 (5)
    #define FILE_SD_IPCFRAMEEXPORT_FRAME_WIDTH        (320)
    #define FILE_SD_IPCFRAMEEXPORT_FRAME_HEIGHT       (240)
    
    
    static Void SetIpcFramesOutInQueInfo(System_LinkQueInfo *inQueInfo)
    {
        Int i;
    
        inQueInfo->numCh = gVencModuleContext.vencConfig.numPrimaryChn;
        for (i = 0; i < inQueInfo->numCh; i++)
        {
            inQueInfo->chInfo[i].bufType    = SYSTEM_BUF_TYPE_VIDFRAME;
            inQueInfo->chInfo[i].dataFormat = SYSTEM_DF_YUV420SP_UV;
            inQueInfo->chInfo[i].memType    = SYSTEM_MT_NONTILEDMEM;
            inQueInfo->chInfo[i].scanFormat = SYSTEM_SF_PROGRESSIVE;
            inQueInfo->chInfo[i].startX     = 0;
            inQueInfo->chInfo[i].startY     = 0;
            inQueInfo->chInfo[i].width      = FILE_SD_IPCFRAMEEXPORT_FRAME_WIDTH;
            inQueInfo->chInfo[i].height     = FILE_SD_IPCFRAMEEXPORT_FRAME_HEIGHT;
            inQueInfo->chInfo[i].pitch[0]   = FILE_SD_IPCFRAMEEXPORT_FRAME_WIDTH;
            inQueInfo->chInfo[i].pitch[1]   = FILE_SD_IPCFRAMEEXPORT_FRAME_WIDTH;        
            inQueInfo->chInfo[i].pitch[2]   = 0;
        }
    }
    
    Void VPU_Create_AIS_Main_Demo()
    {
       IpcFramesOutLinkHLOS_CreateParams   ipcFramesOutHostPrm;
       //IpcFramesOutLinkRTOS_CreateParams   ipcFramesOutVpssPrm[2];
       IpcFramesInLinkRTOS_CreateParams    ipcFramesInDspPrm[2];
       AlgLink_CreateParams                dspAlgPrm[2];
       IpcLink_CreateParams                ipcOutVpssPrm;
       //DisplayLink_CreateParams            displayPrm[2];
       
       //Bool   enableOsdAlgLink           = gVsysModuleContext.vsysConfig.enableOsd;
       //Bool   enableTIVA_AlgLink         = gVsysModuleContext.vsysConfig.enableStab;
       //Bool   enableScdAlgLink           = gVsysModuleContext.vsysConfig.enableScd;
       UInt32 noOfDSPAlgo = 1;
       UInt32 i;
    
       MULTICH_INIT_STRUCT(IpcFramesOutLinkHLOS_CreateParams, ipcFramesOutHostPrm);
    
       for (i = 0; i < noOfDSPAlgo;i++)
       {   
          //MULTICH_INIT_STRUCT(IpcFramesOutLinkRTOS_CreateParams,ipcFramesOutVpssPrm[i]);
          MULTICH_INIT_STRUCT(IpcFramesInLinkRTOS_CreateParams,ipcFramesInDspPrm[i]);
          MULTICH_INIT_STRUCT(AlgLink_CreateParams, dspAlgPrm[i]);
       }
    
       MULTICH_INIT_STRUCT(IpcLink_CreateParams           ,ipcOutVpssPrm);
    /*
       for (i = 0; i < VDIS_DEV_MAX; i++)
       {
          MULTICH_INIT_STRUCT(DisplayLink_CreateParams,displayPrm[i]);
       }
    */    
       MultiCh_detectBoard();
    
       System_linkControl(SYSTEM_LINK_ID_M3VPSS,SYSTEM_M3VPSS_CMD_RESET_VIDEO_DEVICES,NULL,0,TRUE);
    
       System_linkControl(SYSTEM_LINK_ID_M3VIDEO,SYSTEM_COMMON_CMD_SET_CH2IVAHD_MAP_TBL,&systemVid_encDecIvaChMapTbl,
             sizeof(SystemVideo_Ivahd2ChMap_Tbl),TRUE);
       
       gVdisModuleContext.ipcFramesOutHostId     = SYSTEM_HOST_LINK_ID_IPC_FRAMES_OUT_0;
       //gVcapModuleContext.ipcFramesOutVpssId[0]  = SYSTEM_VPSS_LINK_ID_IPC_FRAMES_OUT_0;   
       gVcapModuleContext.ipcFramesInDspId[0]    = SYSTEM_DSP_LINK_ID_IPC_FRAMES_IN_0;
       gVcapModuleContext.dspAlgId[0]            = SYSTEM_LINK_ID_ALG_0  ;
       //gVdisModuleContext.displayId[2]           = SYSTEM_LINK_ID_DISPLAY_2; 
       
       // 1. Configure the IpcFramesOutLinkHLOS create parameters
       ipcFramesOutHostPrm.baseCreateParams.noNotifyMode              = FALSE;
       ipcFramesOutHostPrm.baseCreateParams.notifyNextLink            = TRUE;
       ipcFramesOutHostPrm.baseCreateParams.notifyPrevLink            = FALSE;
       ipcFramesOutHostPrm.baseCreateParams.numOutQue                 = 1;
       ipcFramesOutHostPrm.baseCreateParams.outQueParams[0].nextLink  = gVcapModuleContext.ipcFramesInDspId[0];
       SetIpcFramesOutInQueInfo(&ipcFramesOutHostPrm.inQueInfo);
    
    /*   ipcFramesOutVpssPrm[0].baseCreateParams.inQueParams.prevLinkId       = gVdisModuleContext.ipcFramesOutHostId;
       ipcFramesOutVpssPrm[0].baseCreateParams.inQueParams.prevLinkQueId    = 0;
       ipcFramesOutVpssPrm[0].baseCreateParams.notifyPrevLink               = TRUE;
       
       ipcFramesOutVpssPrm[0].baseCreateParams.numOutQue                 = 1;
       ipcFramesOutVpssPrm[0].baseCreateParams.outQueParams[0].nextLink  = gVdisModuleContext.displayId[2];
       ipcFramesOutVpssPrm[0].baseCreateParams.notifyNextLink            = TRUE;
       
       ipcFramesOutVpssPrm[0].baseCreateParams.processLink               = gVcapModuleContext.ipcFramesInDspId[0];
       ipcFramesOutVpssPrm[0].baseCreateParams.notifyProcessLink         = TRUE;
       ipcFramesOutVpssPrm[0].baseCreateParams.noNotifyMode              = FALSE;
    */
       ipcFramesInDspPrm[0].baseCreateParams.inQueParams.prevLinkId      = gVdisModuleContext.ipcFramesOutHostId;
       ipcFramesInDspPrm[0].baseCreateParams.inQueParams.prevLinkQueId   = 0;
       ipcFramesInDspPrm[0].baseCreateParams.numOutQue                   = 1;
       ipcFramesInDspPrm[0].baseCreateParams.outQueParams[0].nextLink    = gVcapModuleContext.dspAlgId[0];
       ipcFramesInDspPrm[0].baseCreateParams.notifyPrevLink              = TRUE;
       ipcFramesInDspPrm[0].baseCreateParams.notifyNextLink              = TRUE;
       ipcFramesInDspPrm[0].baseCreateParams.noNotifyMode                = FALSE;
       
       dspAlgPrm[0].inQueParams.prevLinkId       = gVcapModuleContext.ipcFramesInDspId[0];
       dspAlgPrm[0].inQueParams.prevLinkQueId    = 0;
       dspAlgPrm[0].enableRunBuff                = FALSE;
       dspAlgPrm[0].standard                     = SYSTEM_STD_D1;
    
       
       dspAlgPrm[0].tivaCreateParams.maxWidth          = ALG_LINK_TIVA_MAX_FRAME_WIDTH;
       dspAlgPrm[0].tivaCreateParams.maxHeight         = ALG_LINK_TIVA_MAX_FRAME_HEIGHT;
       dspAlgPrm[0].tivaCreateParams.maxStride         = ALG_LINK_TIVA_MAX_FRAME_WIDTH;
       dspAlgPrm[0].tivaCreateParams.useStab           = 0;
       dspAlgPrm[0].tivaCreateParams.stabProcessLevel  = ALG_LINK_TIVA_PROCESS_LEVEL;
       dspAlgPrm[0].tivaCreateParams.motionModelType   = 4;//ALG_LINK_MMODEL_AFFINE;
       dspAlgPrm[0].tivaCreateParams.vs_ver_crop_size  = 0;
       dspAlgPrm[0].tivaCreateParams.vs_hor_crop_size  = 0;
       
       dspAlgPrm[0].tivaCreateParams.useTraker = TRUE;
       dspAlgPrm[0].tivaCreateParams.tRoiX     = 200;
       dspAlgPrm[0].tivaCreateParams.tRoiY     = 200;
       dspAlgPrm[0].tivaCreateParams.tRoiW     = 20;
       dspAlgPrm[0].tivaCreateParams.tRoiH     = 20;
    /*
       displayPrm[2].numInputQueues                 = 1;
       displayPrm[2].activeQueue                    = 0;
       displayPrm[2].inQueParams[0].prevLinkId      = gVcapModuleContext.ipcFramesOutVpssId[0];
       displayPrm[2].inQueParams[0].prevLinkQueId   = 0;
       displayPrm[2].displayRes                     = VSYS_STD_NTSC;
    */   
    
       //if(enableOsdAlgLink || enableScdAlgLink || enableTIVA_AlgLink)
       {
       //printf("enableOsdAlgLink : %d\n", gVsysModuleContext.vsysConfig.enableOsd);
          for(i=0;i<noOfDSPAlgo;i++)
          {
             System_linkCreate(gVdisModuleContext.ipcFramesOutHostId, &ipcFramesOutHostPrm, sizeof(ipcFramesOutHostPrm));
             //System_linkCreate(gVcapModuleContext.ipcFramesOutVpssId[i], &ipcFramesOutVpssPrm[i], sizeof(ipcFramesOutVpssPrm[i]));
             System_linkCreate(gVcapModuleContext.ipcFramesInDspId[i], &ipcFramesInDspPrm[i], sizeof(ipcFramesInDspPrm[i]));
             System_linkCreate(gVcapModuleContext.dspAlgId[i] , &dspAlgPrm[i], sizeof(dspAlgPrm[i]));
          }
       }
       
       //System_linkCreate(gVdisModuleContext.displayId[2], &displayPrm[2], sizeof(displayPrm[2]));
    
       MultiCh_memPrintHeapStatus();    
    }
    
    Void VPU_Delete_AIS_Main_Demo()
    {
       Bool   enableOsdAlgLink = gVsysModuleContext.vsysConfig.enableOsd;
       Bool   enableTIVA_AlgLink = gVsysModuleContext.vsysConfig.enableStab;
       Bool   enableScdAlgLink = gVsysModuleContext.vsysConfig.enableScd;
       UInt32 noOfDSPAlgo = 1;
       Int32 i;
    
       gVdisModuleContext.displayId[2] = SYSTEM_LINK_ID_DISPLAY_2; // OFF CHIP HDMI
    
       if(gVsysModuleContext.vsysConfig.enableOsd)
       {
          System_linkDelete(gVcapModuleContext.dspAlgId[0]);
       }
    
       System_linkDelete(gVdisModuleContext.ipcFramesOutHostId);
    
       if(enableOsdAlgLink || enableScdAlgLink || enableTIVA_AlgLink)
       {
          for(i=0;i<noOfDSPAlgo;i++)
          {
             System_linkDelete(gVcapModuleContext.ipcFramesOutVpssId[i] );
             System_linkDelete(gVcapModuleContext.ipcFramesInDspId[i]);
          }
       }
    
       System_linkDelete(gVdisModuleContext.displayId[2]);
    
       /* Print the HWI, SWI and all tasks load */
       /* Reset the accumulated timer ticks */
       MultiCh_prfLoadCalcEnable(FALSE, TRUE, FALSE);
    }
    
    
    
    
    

    #include <vpu_demo.h>
    #include "demo_scd_bits_wr.h"
    #include "vpu_get_files.h"
    #include "vpu_status_report.h"
    
    #include "vpu_streamer_cfg.h"
    #include "vpu_utils.h"
    #include "demo_running_buff.h"
    #include <signal.h>
    
    #include <mcfw/interfaces/link_api/vidframe.h>
    //#include <mcfw/interfaces/link_api/ipcLink.h>
    
    #include "ais_fileRd.h"
    
    extern Int32 IpcFramesOutLink_putFullVideoFrames(UInt32 linkId,
                                              VIDFrame_BufList *bufList);
    extern Int32 IpcFramesOutLink_getEmptyVideoFrames(UInt32 linkId,
                                               VIDFrame_BufList *bufList);
    
    
    #define FILERD_TRACE_ENABLE_FXN_ENTRY_EXIT           (0)
    #define FILERD_TRACE_INFO_PRINT_INTERVAL             (8192)
    
    
    #if FILERD_TRACE_ENABLE_FXN_ENTRY_EXIT
    #define FILERD_TRACE_FXN(str,...)                    do {                           \
                                                         static Int printInterval = 0;\
                                                         if ((printInterval % FILERD_TRACE_INFO_PRINT_INTERVAL) == 0) \
                                                         {                                                          \
                                                             OSA_printf("TI_FILERD:%s function:%s",str,__func__);     \
                                                             OSA_printf(__VA_ARGS__);                               \
                                                         }                                                          \
                                                         printInterval++;                                           \
                                                       } while (0)
    #define FILERD_TRACE_FXN_ENTRY(...)                  FILERD_TRACE_FXN("Entered",__VA_ARGS__)
    #define FILERD_TRACE_FXN_EXIT(...)                   FILERD_TRACE_FXN("Leaving",__VA_ARGS__)
    #else
    #define FILERD_TRACE_FXN_ENTRY(...)
    #define FILERD_TRACE_FXN_EXIT(...)
    #endif
    
    ///////////FILE IO//////////////
    
    int ENC_CH = 1;
    AIS_fileRD_IpcFramesBufObj  frmObj[8];
    ReadYUV_Config  gReadYUVConfig;
    
    static Void fileRd_copyVidFrameInfoLink2McFw(VIDEO_FRAMEBUF_S *dstBuf,
                                               VIDFrame_Buf    *srcBuf)
    {
        Int i,j;
        OSA_assert(VIDEO_MAX_FIELDS == VIDFRAME_MAX_FIELDS);
        OSA_assert(VIDEO_MAX_PLANES == VIDFRAME_MAX_PLANES);
    
        for (i = 0; i < VIDEO_MAX_FIELDS; i++)
        {
            for (j = 0; j < VIDEO_MAX_PLANES; j++)
            {
                dstBuf->addr[i][j] = srcBuf->addr[i][j];
                dstBuf->phyAddr[i][j] = srcBuf->phyAddr[i][j];
            }
        }
        dstBuf->channelNum  = srcBuf->channelNum;
        dstBuf->fid         = srcBuf->fid;
        dstBuf->frameWidth  = srcBuf->frameWidth;
        dstBuf->frameHeight = srcBuf->frameHeight;
        dstBuf->linkPrivate = srcBuf->linkPrivate;
        dstBuf->timeStamp   = srcBuf->timeStamp;
        dstBuf->framePitch[0] = srcBuf->framePitch[0];
        dstBuf->framePitch[1] = srcBuf->framePitch[1];
    
        FILERD_TRACE_FXN_EXIT("VidFrameInfo:"
                             "virt[0][0]:%p,"
                             "phy[0][0]:%p,"
                             "channelNum:%d,"
                             "fid:%d,"
                             "frameWidth:%d,"
                             "frameHeight:%d,"
                             "framePitch[0]:%d,"
                             "framePitch[1]:%d,"
                             "timeStamp:%d,",
                             dstBuf->addr[0][0],
                             dstBuf->phyAddr[0][0],
                             dstBuf->channelNum,
                             dstBuf->fid,
                             dstBuf->frameWidth,
                             dstBuf->frameHeight,
                             dstBuf->framePitch[0],
                             dstBuf->framePitch[1],
                             dstBuf->timeStamp);
    }
    
    
    Int32 fileRd_getEmptyVideoFrames(VIDEO_FRAMEBUF_LIST_S *pFrameBufList, UInt32 timeout)
    {
        VIDFrame_BufList  vidBufList;
        VIDFrame_Buf     *pInBuf;
        VIDEO_FRAMEBUF_S *pOutBuf;
        UInt32 i;
    
        FILERD_TRACE_FXN_ENTRY();
        pFrameBufList->numFrames = 0;
        vidBufList.numFrames = 0;
        IpcFramesOutLink_getEmptyVideoFrames(SYSTEM_HOST_LINK_ID_IPC_FRAMES_OUT_0,
                                             &vidBufList);
    
        pFrameBufList->numFrames = vidBufList.numFrames;
        for (i = 0; i < vidBufList.numFrames; i++)
        {
            pOutBuf = &pFrameBufList->frames[i];
            pInBuf = &vidBufList.frames[i];
    
            fileRd_copyVidFrameInfoLink2McFw(pOutBuf,pInBuf);
        }
    
        FILERD_TRACE_FXN_EXIT("NumFrames Received:%d",pFrameBufList->numFrames);
        return 0;
    }
    
    
    static Void fileRd_copyVidFrameInfoMcFw2Link(VIDFrame_Buf *dstBuf,
                                               VIDEO_FRAMEBUF_S    *srcBuf)
    {
        Int i,j;
        OSA_assert(VIDEO_MAX_FIELDS == VIDFRAME_MAX_FIELDS);
        OSA_assert(VIDEO_MAX_PLANES == VIDFRAME_MAX_PLANES);
    
        for (i = 0; i < VIDEO_MAX_FIELDS; i++)
        {
            for (j = 0; j < VIDEO_MAX_PLANES; j++)
            {
                dstBuf->addr[i][j] = srcBuf->addr[i][j];
                dstBuf->phyAddr[i][j] = srcBuf->phyAddr[i][j];
            }
        }
        dstBuf->channelNum  = srcBuf->channelNum;
        dstBuf->fid         = srcBuf->fid;
        dstBuf->frameWidth  = srcBuf->frameWidth;
        dstBuf->frameHeight = srcBuf->frameHeight;
        dstBuf->linkPrivate = srcBuf->linkPrivate;
        dstBuf->timeStamp   = srcBuf->timeStamp;
        dstBuf->framePitch[0] = srcBuf->framePitch[0];
        dstBuf->framePitch[1] = srcBuf->framePitch[1];
    
        FILERD_TRACE_FXN_EXIT("VidFrameInfo:"
                             "virt[0][0]:%p,"
                             "phy[0][0]:%p,"
                             "channelNum:%d,"
                             "fid:%d,"
                             "frameWidth:%d,"
                             "frameHeight:%d,"
                             "framePitch[0]:%d,"
                             "framePitch[1]:%d,"
                             "timeStamp:%d,",
                             dstBuf->addr[0][0],
                             dstBuf->phyAddr[0][0],
                             dstBuf->channelNum,
                             dstBuf->fid,
                             dstBuf->frameWidth,
                             dstBuf->frameHeight,
                             dstBuf->framePitch[0],
                             dstBuf->framePitch[1],
                             dstBuf->timeStamp);
    }
    
    /**
        \brief Send filled video gBuffers to framework for algo to process
    
    */
    Int32 fileRd_putFullVideoFrames(VIDEO_FRAMEBUF_LIST_S *pFrameBufList)
    {
        VIDEO_FRAMEBUF_S *pSrcBuf;
        VIDFrame_Buf     *pDstBuf;
        VIDFrame_BufList  vidBufList;
        UInt32 i;
        Int status = 0;
    
        FILERD_TRACE_FXN_ENTRY("Num bufs put:%d",pFrameBufList->numFrames);
        vidBufList.numFrames = pFrameBufList->numFrames;
        for (i = 0; i < vidBufList.numFrames; i++)
        {
            pSrcBuf = &pFrameBufList->frames[i];
            pDstBuf = &vidBufList.frames[i];
            fileRd_copyVidFrameInfoMcFw2Link(pDstBuf,pSrcBuf);
        }
        if (vidBufList.numFrames)
        {       
    #if 0
            status =
            IpcFramesOutLink_putFullVideoFrames(gVdisModuleContext.ipcFramesOutHostId,
                                                &vidBufList);
    #else
       status =
       IpcFramesOutLink_putFullVideoFrames(SYSTEM_HOST_LINK_ID_IPC_FRAMES_OUT_0,
                                           &vidBufList);
    #endif
        }
        FILERD_TRACE_FXN_ENTRY("VIDFrame release status:%d",status);
        return 0;
    }
    
    static Void IpcFramesFillBufInfo(VIDEO_FRAMEBUF_LIST_S *bufList,
                                     AIS_fileRD_IpcFramesBufObj  frameObj[],
                                     UInt32  numFrames)//frameObj num
    {
        Int i,j;
       //UInt64 curTimeStamp = 0;
        bufList->numFrames = 0;
       for(i = 0; i < numFrames; i++)
       {
          if (frameObj[i].refCnt == 0)
            {   
             for(j=0; j<frameObj[i].numCh; j++)
             {
                // Send the same buffer to all enabled channels
                bufList->frames[bufList->numFrames].addr[0][0] =  frameObj[i].bufVirt;
                bufList->frames[bufList->numFrames].phyAddr[0][0] = (Ptr)frameObj[i].bufPhy;
                
                bufList->frames[bufList->numFrames].addr[0][1] =  frameObj[i].bufVirt + frameObj[i].curWidth * frameObj[i].curHeight;
                bufList->frames[bufList->numFrames].phyAddr[0][1] = (Ptr)frameObj[i].bufPhy + frameObj[i].curWidth * frameObj[i].curHeight;
                
                bufList->frames[bufList->numFrames].frameWidth =  frameObj[i].curWidth;
                bufList->frames[bufList->numFrames].frameHeight =  frameObj[i].curHeight;
                bufList->frames[bufList->numFrames].fid = VIDEO_FID_TYPE_FRAME;
                bufList->frames[bufList->numFrames].channelNum = i;
                bufList->frames[bufList->numFrames].framePitch[0] = frameObj[i].curWidth;
                bufList->frames[bufList->numFrames].framePitch[1] = frameObj[i].curWidth;
                bufList->frames[bufList->numFrames].framePitch[2] = 0;
                bufList->frames[bufList->numFrames].timeStamp  = 0;
                bufList->numFrames++;
             }
             frameObj[i].refCnt = 1;
            }   
       }
       
    }
    
    
    static Void  IpcFramesInitFrame(AIS_fileRD_IpcFramesBufObj *frameObj,
                             Vsys_AllocBufInfo        *bufInfo,
                             UInt32                   maxWidth,
                             UInt32                   maxHeight)
    {
        frameObj->bufPhy  = (UInt32)bufInfo->physAddr;
        frameObj->bufVirt = bufInfo->virtAddr;
        frameObj->maxWidth = maxWidth;
        frameObj->maxHeight = maxHeight;
        frameObj->curWidth = maxWidth;
        frameObj->curHeight = maxHeight;
        frameObj->numCh     = 1;
        //frameObj->numCh     = ENC_CH;
        frameObj->refCnt    = 0;
    }
    
    static Void IpcFramesInitFrameObj(AIS_fileRD_IpcFramesBufObj frameObj[],UInt32 numFrames)
    {
        Int i;
        Vsys_AllocBufInfo bufInfo;
        Int32 status;
        UInt32 frameSize;
        UInt32 minChWidth = FRWIDTH ;
        UInt32 minChHeight = FRHEIGHT;
       
       
        frameSize = MCFW_IPCBITS_GET_YUV420_FRAME_SIZE(minChWidth,minChHeight);
        memset(frameObj,0,sizeof(AIS_fileRD_IpcFramesBufObj) * numFrames);
        for (i = 0; i < numFrames;i++)
        {
            status = Vsys_allocBuf(MCFW_IPCFRAMES_SRID,frameSize,MCFW_IPCFRAMES_FRAME_BUF_ALIGN,&bufInfo);
            if (ERROR_NONE == status)
            {
                IpcFramesInitFrame(&frameObj[i],&bufInfo,minChWidth, minChHeight);
            }
        }
    }
    
    static Void  IpcFramesDeInitFrameObj(AIS_fileRD_IpcFramesBufObj frameObj[],
                                         UInt32 numFrames)
    {
        Int i;
        Int status;
    
        for (i = 0; i < numFrames;i++)
        {
            if (frameObj[i].bufVirt)
            {
                status =
                Vsys_freeBuf(MCFW_IPCFRAMES_SRID,
                             frameObj[i].bufVirt,
                             MCFW_IPCBITS_GET_YUV420_FRAME_SIZE(frameObj[i].maxWidth,
                                                                frameObj[i].maxHeight));
                OSA_assert(status ==0);
            }
        }
    }
    
    static Void   IpcFramesFreeFrameBuf(VIDEO_FRAMEBUF_LIST_S *bufList,
                               AIS_fileRD_IpcFramesBufObj      frameObj[],
                               UInt32  numFrames)
    {
        Int i,j;
    
        for (i = 0; i < bufList->numFrames; i++)
        {
            for (j = 0; j < numFrames; j++)
            {
                if (frameObj[j].bufPhy ==(UInt32)bufList->frames[i].phyAddr[0][0])
                {
                OSA_assert(frameObj[j].refCnt > 0); 
                    frameObj[j].refCnt--;
                }    
            }
        }
    }
    
    
    void *FramesRdSendFxn(Void * prm)
    {
       VIDEO_FRAMEBUF_LIST_S bufList;
       Int status;
       int i = 0;
    
       printf("Start FramesRdSendFxn()\n");
    
       OSA_semWait(&gReadYUVConfig.thrStartSem,OSA_TIMEOUT_FOREVER);
    
       printf("OSA_semWait() \n");
       
       IpcFramesInitFrameObj(frmObj, ENC_CH);//ENC_CH
    
       printf("FramesRdSendFxn() - Test 1\n");
       
       while (FALSE == gReadYUVConfig.thrExit)
       {
          printf("FramesRdSendFxn() - Test 2\n");
    
          IpcFramesFillBufInfo(&bufList,frmObj, ENC_CH);//ENC_CH
          printf("FramesRdSendFxn() - Test 3\n");
    
          if(bufList.numFrames)
          {
              //printf("0rfcnt is %d\n", frmObj[1].refCnt);
              printf("Start File read\n");
              
             for(i=0; i<ENC_CH; i++)
             {
                fread((unsigned char *)(bufList.frames[i].addr[0][0]), 1, gReadYUVConfig.width * gReadYUVConfig.height, gReadYUVConfig.fin[i]);
                fread((unsigned char *)(bufList.frames[i].addr[0][1]), 1, gReadYUVConfig.width * gReadYUVConfig.height * 0.5, gReadYUVConfig.fin[i]);
                if(feof(gReadYUVConfig.fin[i]))
                {
                   fseek(gReadYUVConfig.fin[i], 0, SEEK_SET);
                }
             }
             printf("feed buflist to encode\n");
             status = fileRd_putFullVideoFrames(&bufList);
                OSA_assert(0 == status);
          }
          else
              printf("fill buffer info failed\n");
              
    
          OSA_waitMsecs(33);
            
          status =  fileRd_getEmptyVideoFrames(&bufList,0);
          OSA_assert(0 == status);
          if (bufList.numFrames)
          {
             IpcFramesFreeFrameBuf(&bufList,frmObj, ENC_CH);//ENC_CH
             //printf("1rfcnt is %d\n", frmObj[0].refCnt);
          }
           else
             printf("get empty video frames failed\n");
           
       }
    
       printf("End FramesRdSendFxn()\n");
    
       return NULL;   
    }
    
    
    char *infilename[8] = {"/home/root/egtest01_640x480.yuv", "2sp.yuv", "3sp.yuv", "3sp.yuv", "2sp.yuv", "1sp.yuv", "1sp.yuv", "2sp.yuv"};
    
    Void ReadYUVInit()
    {
       int status;
       int i;
    
       printf("Start ReadYUVInit()\n");   
       
       memset(&gReadYUVConfig,0,sizeof(gReadYUVConfig));
       for(i=0; i<ENC_CH; i++)
       {
          gReadYUVConfig.fin[i] = fopen(infilename[i],"rb");
          if(gReadYUVConfig.fin[i] == NULL)
          {
             printf("Can't open yuv file %d\n", i);
          }
       }
       gReadYUVConfig.fileNum = ENC_CH;
       gReadYUVConfig.height = FRHEIGHT;
       gReadYUVConfig.width = FRWIDTH;
    
       gReadYUVConfig.thrExit = FALSE;
    
       status = OSA_semCreate(&gReadYUVConfig.thrStartSem,10,0);
       OSA_assert(status==OSA_SOK);
    
       status = OSA_thrCreate(&gReadYUVConfig.thrHandle,
                FramesRdSendFxn,
                MCFW_IPCFRAMES_SENDFXN_TSK_PRI,
                MCFW_IPCFRAMES_SENDFXN_TSK_STACK_SIZE,
                &gReadYUVConfig);
    
       OSA_assert(status==OSA_SOK);
    
       printf("End ReadYUVInit()\n");   
        
       return ;
    }
    
     Void ReadYUVdeInit()
     {   
        int i;

        gReadYUVConfig.thrExit = TRUE;
        OSA_waitMsecs(1000);

        /* Close every YUV input file that was opened in ReadYUVInit() */
        for(i=0; i<ENC_CH; i++)
        {
           if(gReadYUVConfig.fin[i] != NULL)
              fclose(gReadYUVConfig.fin[i]);
        }

        /* Free the frame buffers allocated in IpcFramesInitFrameObj() */
        IpcFramesDeInitFrameObj(frmObj, ENC_CH);

        return ;
     }
    
    ///////////FILE IO end//////////////
    
    

    /******************************************************************************
    * Copyright (c) 2013 eInfochips - All rights reserved.
    *
    * This software is authored by eInfochips and is eInfochips' intellectual
    * property, including the copyrights in all countries in the world.
    * This software is provided under a license to use only with all other rights,
    * including ownership rights, being retained by eInfochips.
    *
    * This file may not be distributed, copied, or reproduced in any manner,
    * electronic or otherwise, without the written consent of eInfochips.
    *****************************************************************************/
    
    /******************************************************************************/
    /*! \file   ais_main_demo.c
     *
     *  \brief  This file has functional APIs for initialization, deinitialization and
     *       processing thread \n
     *                             \n
      *          SCD is disabled explicitly because TI's SCD algorithm works with CIF only,
      *          whereas this application supports D1 and HD.
      *          Refer to the SCD demo in order to integrate AIS's SCD algorithm in this application.
     *
     *
     * History: \n
     *          MAR/13/2013, eInfochips,   Created the file \n
     *          APR/08/2013, eInfochips,   Started adding support of meta data with recording. \n
     *          APR/16/2013, eInfochips,   Added support of recording based on camera switching. \n
      *          APR/21/2013, eInfochips,   Added support for keeping some storage as buffer space,
      *                                     so that recording will not be stopped unexpectedly.\n
     *          MAY/02/2013, eInfochips,   Solved issue of restarting recording after enough storage is available. \n
      *          MAY/16/2013, eInfochips,   Added restriction that recording is only done on the SD card. \n
     *          \n
     */
    /******************************************************************************/
    #include <vpu_demo.h>
    #include "demo_scd_bits_wr.h"
    #include "vpu_get_files.h"
    #include "vpu_status_report.h"
    
    #include "vpu_streamer_cfg.h"
    #include "vpu_utils.h"
    #include "demo_running_buff.h"
    #include <signal.h>
    
    #include "ais_fileRd.h"
    
    /*! \var gRecStrm_ctrl 
        \brief  Global structure containing recorder streamer and other configurations. 
    */
    VrecVstrm_Ctrl gRecStrm_ctrl;
    
    UInt16 frameStreamCount;
    UInt16 prevFrameCount;
    UInt16 framerate = 0;
    
    /*! \var gRunBufInfo
        \brief A global variable for running buffer support.
    */
    Running_Buffer_Info gRunBufInfo = {
        .pPhysicalAddr  = NULL,
        .pVirtualAddr   = NULL,
        .totalSize      = 0
    };
    
    /*******************************************************************************/
    /*! \fn     Int32 StreamingFPS (U8 *fps)
     *
      *   \brief This function gets the VPU camera capture FPS.
      *
      *   \param  *fps - FPS of the currently streaming camera is written here.
      *
      * \return  I32, VPU_OK on success; otherwise an error code is returned (as in vpu_values.h).
     *   
     */
    /*******************************************************************************/
    Int32 StreamingFPS (U8 *fps)
    {
       if(fps != NULL)
          *fps = (U8)framerate;
       
       return 0;
    }
    
    
    /*******************************************************************************/
    /*! \fn     Int32 AIS_Main_Demo_resetStats
     *
     *   \brief Resets the demo statistics
    
     *   \param None
     *
     *   \return VPU_OK
     */
    /*******************************************************************************/
    Int32 AIS_Main_Demo_resetStats()
    {
        UInt32 chId;
        VcapVenc_ChInfo *pChInfo;
    
        for(chId=0; chId<VENC_CHN_MAX; chId++)
        {
            pChInfo = &gRecStrm_ctrl.chInfo[chId];
    
            pChInfo->totalDataSize = 0;
            pChInfo->numKeyFrames = 0;
            pChInfo->numFrames = 0;
            pChInfo->maxWidth = 0;
            pChInfo->minWidth = 0xFFF;
            pChInfo->maxHeight= 0;
            pChInfo->minHeight= 0xFFF;
            pChInfo->maxLatency= 0;
            pChInfo->minLatency= 0xFFF;
    
        }
    
        gRecStrm_ctrl.statsStartTime = OSA_getCurTimeInMsec();
        return VPU_OK;
    }
    
    /*******************************************************************************/
    /*! \fn     Void *AIS_Monitor_Thread (Void *pPrm)
     *
     *   \brief This is the thread to monitor the available storage space.
     *
     *   \param Void *pPrm: Unused
     *
     *   \return NULL when the thread is exited 
     */
    /*******************************************************************************/
    Void *AIS_Monitor_Thread (Void *pPrm)
    {
       I32   status = VPU_OK;
       U64 spaceLeft = 0;
       Bool restartRecIfSpacAvail = FALSE;
       UINT32 tmpMinSizeInBytes = 10485760;  /* 10MB */
       I32 recStoped = 0;
    
       printf ("AIS_Monitor_Thread created\n");
    
       while (!gRecStrm_ctrl.exitRecStrmTd)
       {
          /* Get free space from SD card */
    
          if ( (VPU_DOWNLOAD_RUNNING == RecordingDownloadStatus(-1)) || 
              (VPU_CLEAR_RUNNING == RecordingClearStatus(-1)) || 
              ((gRecStrm_ctrl.autoRecFlag != VPU_REC_AUTO_START) && 
                (gRecStrm_ctrl.prevRecCmdCfg.recording != VPU_START_REC) ) )
          {
             sleep(1);
             continue;
          }
       
          status = GetFreeSpace(VPU_SD_MOUNT_PT,&spaceLeft);
          if (status != VPU_OK)
          {
             printf ("Error [%d] in getting free space at '%s' location \n", status,
                      VPU_SD_MOUNT_PT);
    
             restartRecIfSpacAvail = FALSE;
             if(!recStoped)
             {
                if (gRecStrm_ctrl.recordTelemetryOnly == TRUE)
                {
                   AP_VPU_RecordTelemetryOnly (FALSE);
                   /* Keep this flag preserved so telemetry-only recording
                      restarts once storage becomes available again */
                   gRecStrm_ctrl.recordTelemetryOnly = TRUE;
                }
                else
                {
                   /* If configured for both channel recording, then stop both channels */
                   if (gRecStrm_ctrl.selectedRec == VPU_REC_BOTH)
                   {
                      AP_VPU_StartStopRecording (RECR_A, FALSE);
                      AP_VPU_StartStopRecording (RECR_B, FALSE);
                   }
                   /* If configured for selected channel recording, then stop selected channel */
                   else
                   {
                      /* If selected channel is B, then stop recording of B */
                      if (gRecStrm_ctrl.activeChn == CHANNEL_B)
                      {
                         AP_VPU_StartStopRecording (RECR_B, FALSE);
                      }
                      /* If selected channel is A, then stop recording of A */
                      else
                      {
                         AP_VPU_StartStopRecording (RECR_A, FALSE);
                      }
                   }
                }
                recStoped = 1;
             }
               printf ("[VPU INFO] SD card not mounted or inserted, Recording can not be started without SD storage\n");
              sleep(3);
             continue;
          }
          else
            {
             if(recStoped)
                 restartRecIfSpacAvail = TRUE;
             recStoped = 0;
          }
    
          /* If enough space is not available for storage of MPEG2TS file */
          if (spaceLeft <= tmpMinSizeInBytes)
          {
             printf ("Free space %llu : Margin: %d \n", spaceLeft, tmpMinSizeInBytes);
             restartRecIfSpacAvail = TRUE; 
    
             /* If video_overflow setting is to stop recording if storage space is full */
             if (gRecStrm_ctrl.prevRecStrmCfg.vid_ovrflw == VPU_STOP_RECORDING)
             {
                printf ("==> Stop recording until space is available .. \n");
    
                if (gRecStrm_ctrl.recordTelemetryOnly == TRUE)
                {
                   AP_VPU_RecordTelemetryOnly (FALSE);
                   /* Keep this flag preserved so telemetry-only recording
                      restarts whenever space becomes available again */
                   gRecStrm_ctrl.recordTelemetryOnly = TRUE;
                }
                else
                {
                   /* If configured for both channel recording,then stop both channels */
                   if (gRecStrm_ctrl.selectedRec == VPU_REC_BOTH)
                   {
                      AP_VPU_StartStopRecording (RECR_A, FALSE);
                      AP_VPU_StartStopRecording (RECR_B, FALSE);
                   }
                   /* If configured for selected channel recording,then stop selected channel */
                   else
                   {
                      /* If selected channel is B, then stop recording of B */
                      if (gRecStrm_ctrl.activeChn == CHANNEL_B)
                      {
                            AP_VPU_StartStopRecording (RECR_B, FALSE);
                      }
                      /* If selected channel is A, then stop recording of A */
                      else
                      {
                             AP_VPU_StartStopRecording (RECR_A, FALSE);
                      }
                   }
                }
             }
             /* If video_overflow setting is to overwrite oldest recording if storage space is full */
             else
             {
                printf ("==> Overwrite files until space is available .. \n");
    
                /* Stop existing recording */
                if (gRecStrm_ctrl.recordTelemetryOnly == TRUE)
                {
                   AP_VPU_RecordTelemetryOnly (FALSE);
                }
                else
                {
                   /* If configured for both channel recording, then stop both channels */
                   if (gRecStrm_ctrl.selectedRec == VPU_REC_BOTH)
                   {
                      AP_VPU_StartStopRecording (RECR_A, FALSE);
                      AP_VPU_StartStopRecording (RECR_B, FALSE);
                   }
                   /* If configured for selected channel recording, then stop selected channel */
                   else
                   {
                      /* If selected channel is B, then stop recording of B */
                      if (gRecStrm_ctrl.activeChn == CHANNEL_B)
                      {
                         AP_VPU_StartStopRecording (RECR_B, FALSE);
                      }
                      /* If selected channel is A, then stop recording of A */
                      else
                      {
                         AP_VPU_StartStopRecording (RECR_A, FALSE);
                      }
                   }
                }
    
                /* Remove oldest file */
                if (DeleteOldestFile (VPU_SD_MOUNT_PT) != VPU_OK)
                {
                   printf ("Error in getting oldest file name \n");
                }
                else
                {
                   /* Since oldest file is deleted, restart recording immediately */
                   continue;
                }
             }
          }
          /* If enough space is available for storage of MPEG2TS file */
          else
          {
             /* If space is available after the storage space was full */
             if (restartRecIfSpacAvail == TRUE)
             {
                printf ("Space is now available so restarting file writing \n");
    
                if( TRUE == gRecStrm_ctrl.autoRecFlag )
                {
                   gRecStrm_ctrl.prevRecCmdCfg.recording = VPU_START_REC;
                }

                if (gRecStrm_ctrl.recordTelemetryOnly == TRUE)
                {
                   AP_VPU_RecordTelemetryOnly (TRUE);
                }
                else
                {
                   /* If both channel recording was enabled, then start both the channels */
                   if (gRecStrm_ctrl.selectedRec == VPU_REC_BOTH)
                   {
                      AP_VPU_StartStopRecording (RECR_A, TRUE);
                      AP_VPU_StartStopRecording (RECR_B, TRUE);
                   }
                   /* If selected channel recording was enabled, then start selected channel */
                   else
                   {
                      /* If selected channel is B, then start recording of B */
                      if (gRecStrm_ctrl.activeChn == CHANNEL_B)
                      {
                         AP_VPU_StartStopRecording (RECR_B, TRUE);
                      }
                      /* If selected channel is A, then start recording of A */
                      else
                      {
                         AP_VPU_StartStopRecording (RECR_A, TRUE);
                      }
                   }
                }
    
                restartRecIfSpacAvail = FALSE;
             }
          }
          sleep (2);
       }
    
       printf ("Exiting '%s' thread \n", __FUNCTION__);
       return NULL;
    }
    
    /*******************************************************************************/
    /*! \fn     Void AIS_Main_Demo_CbFxn
     *
     *   \brief Encoder callback, invoked when new encoded data is available; it signals the processing thread's semaphore
    
     *   \param None 
     *
     *   \return None
     */
    /*******************************************************************************/
    Void AIS_Main_Demo_CbFxn()
    {
        OSA_semSignal(&gRecStrm_ctrl.recStrmSem);
    }
    
    VcapVenc_ChInfo *pChInfo;
    
    void AIS_Monitor_Framerate(int signum)
    {
       /* frameStreamCount is a free-running UInt16 counter incremented for every
          streamed frame; handle 16-bit wrap-around when taking the difference */
       if(prevFrameCount > frameStreamCount)
       {
          framerate = frameStreamCount + 65536 - prevFrameCount;
       }
       else
          framerate = frameStreamCount - prevFrameCount;

       /* Frames counted over the 10 second alarm period -> frames per second */
       framerate /= 10;
       //printf("FrameRate %d\n",framerate);
       prevFrameCount = frameStreamCount;

       /* Re-arm the SIGALRM handler for the next 10 second interval */
       signal(SIGALRM, AIS_Monitor_Framerate);
       alarm(10);
       return ;
    }
    
    /*******************************************************************************/
    /*! \fn     Void *AIS_Main_Demo_Main(Void *pPrm)
     *
     *   \brief This is the main thread of the application; it gets encoded data from the
     *       encoder module and sends the frames to the recorder and streamer modules
    
     *   \param Void *pPrm: Unused
     *
     *   \return NULL when the thread is exited 
     */
    /*******************************************************************************/
    Void *AIS_Main_Demo_Main(Void *pPrm)
    {
        Int32 status, frameId;
        VCODEC_BITSBUF_LIST_S bitsBuf;
        VCODEC_BITSBUF_S *pBuf = NULL;
        VcapVenc_ChInfo *pChInfo;
        UInt32 latency;
       Int32 retVal;
       Int32 tmpChId;
    
       gRecStrm_ctrl.isRecStrmTdStopDone = FALSE;
    
       while(!gRecStrm_ctrl.exitRecStrmTd)
       {
          /* Wait for encoder event to ensure that encoded data is ready */
          status = OSA_semWait(&gRecStrm_ctrl.recStrmSem, OSA_TIMEOUT_FOREVER);
          if(status!=OSA_SOK)
          {
             break;   
          }
    
          /* Get the encoder buffer */  
          status = Venc_getBitstreamBuffer(&bitsBuf, TIMEOUT_NO_WAIT);
    
          if(status==ERROR_NONE && bitsBuf.numBufs)
          {
             for(frameId=0; frameId<bitsBuf.numBufs; frameId++)
             {
                pBuf = &bitsBuf.bitsBuf[frameId];
                if(pBuf->chnId<VENC_CHN_MAX)
                {
                   pChInfo = &gRecStrm_ctrl.chInfo[pBuf->chnId];
    
                   pChInfo->totalDataSize += pBuf->filledBufSize;
                   pChInfo->numFrames++;
                   if(pBuf->frameType==VCODEC_FRAME_TYPE_I_FRAME)
                   {
                      pChInfo->numKeyFrames++;
                   }
    
                   latency = pBuf->encodeTimestamp - pBuf->timestamp;
    
                   if(latency > pChInfo->maxLatency)
                      pChInfo->maxLatency = latency;
    
                   if(latency < pChInfo->minLatency)
                      pChInfo->minLatency = latency;
    
                   if(pBuf->frameWidth > pChInfo->maxWidth)
                      pChInfo->maxWidth = pBuf->frameWidth;
    
                   if(pBuf->frameWidth < pChInfo->minWidth)
                      pChInfo->minWidth = pBuf->frameWidth;
    
                   if(pBuf->frameHeight > pChInfo->maxHeight)
                      pChInfo->maxHeight = pBuf->frameHeight;
    
                   if(pBuf->frameHeight < pChInfo->minHeight)
                      pChInfo->minHeight = pBuf->frameHeight;
                }
    
                /**************************************************
                   RECORDER
                ***************************************************/
                 /* If recording is enabled */
                 if (gRecStrm_ctrl.enableFileWrite == TRUE)
                 {
                    /* Based on the current encoded event,
                       identify which recorder instance it is meant for */
                    if (pBuf->chnId == REC_CHNA)
                    {
                       tmpChId = RECR_A;
                    }
                    else if (pBuf->chnId == REC_CHNB)
                    {
                       tmpChId = RECR_B;
                    }
                    else
                    {
                       tmpChId = -1;
                    }

                    if (tmpChId != -1)
                    {
                       /* If recording for this particular channel is selected,
                          then send data to the recorder module */
                       if ((gRecStrm_ctrl.fileWriteEnable [tmpChId] == TRUE) ||
                           (gRecStrm_ctrl.recordTelemetryOnly == TRUE))
                       {
                          retVal = SendToRecorder (&gRecStrm_ctrl, pBuf);
                          if (retVal != VPU_OK)
                          {
                             /* Ignore the error and move ahead */
                          }
                       }
                    }
                 }
    
                /****************************************
                   STREAMER
                *****************************************/
                /* If streaming is enabled */
    
                if ((gRecStrm_ctrl.enableStreaming == TRUE) || (gRecStrm_ctrl.hdChnStrm == TRUE))
                {
                   /* If streaming for the particular channel is selected,
                      then send data to streamer module */
                   if (gRecStrm_ctrl.streamEnable[pBuf->chnId] == TRUE)
                   {
                      if( (pBuf->chnId == CHANNEL_B) || (pBuf->chnId == CHANNEL_A) ) 
                      {
                         frameStreamCount++;
                      }
                      retVal = SendToStreamer (&gRecStrm_ctrl, pBuf);
                      if (retVal != VPU_OK)
                      {
                         /* Ignore the error and move ahead */
                      }
                   }
                }
                
             }
             /* Release the encoder buffer */
             Venc_releaseBitstreamBuffer(&bitsBuf);
          }
       }
       gRecStrm_ctrl.isRecStrmTdStopDone = TRUE;
    
       return NULL;
    }
    
    /*******************************************************************************/
    /*! \fn     Void InitDefaults ()
     *
     *   \brief Updates the global handle with the necessary parameters for the recorder and
     *       streamer modules. This is done to preserve the previous configuration; whenever a
     *       new configuration arrives, it is compared with the older value, and the
     *       recorder/streamer operation is performed only if the two differ.
     *
     *   \param None
     *
     *   \return Void
     */
    /*******************************************************************************/
    Void InitDefaults ()
    {
       OSA_mutexLock (&gRecStrm_ctrl.recStrmMtxHnd);
       usleep (1000);
    
       gRecStrm_ctrl.deinterlaceFlag = gRecStrm_ctrl.vpuCfgs.trackerCfg.deinterlace; 
       gRecStrm_ctrl.prevTrackerCfg.deinterlace = gRecStrm_ctrl.vpuCfgs.trackerCfg.deinterlace;
    
       memcpy (&(gRecStrm_ctrl.prevRecStrmCfg), &(gRecStrm_ctrl.vpuCfgs.recStrmCfg),          
             sizeof (VPU_RecordStreamConfig));   
    
       if (gRecStrm_ctrl.prevRecStrmCfg.auto_rec == VPU_REC_AUTO_START)
       {
          gRecStrm_ctrl.prevRecCmdCfg.recording = VPU_START_REC; 
       }
       else
       {
          gRecStrm_ctrl.prevRecCmdCfg.recording = VPU_UNKNOWN_START_STOP_REC; 
       }
    
       gRecStrm_ctrl.prevRecCmdCfg.recTelemetry_only = gRecStrm_ctrl.vpuCfgs.recCmdCfg.recTelemetry_only;
    
       usleep (1000);
       memcpy (&(gRecStrm_ctrl.prevSwcCam), &(gRecStrm_ctrl.vpuCfgs.switchCameraCmdCfg), 
             sizeof (VPU_SwitchCamera));   
    
        gRecStrm_ctrl.autoRecFlag = gRecStrm_ctrl.prevRecStrmCfg.auto_rec;
        gRecStrm_ctrl.selectedRec = gRecStrm_ctrl.prevRecStrmCfg.smlt_rec;
    
        gRecStrm_ctrl.autoStrmFlag = (gRecStrm_ctrl.prevRecStrmCfg.auto_stream 
                               == VPU_STRM_AUTO_START)? TRUE: FALSE;
    
       gRecStrm_ctrl.overwriteFlag = (gRecStrm_ctrl.prevRecStrmCfg.vid_ovrflw 
                               == VPU_OVER_WRITE)? TRUE: FALSE;
    
       gRecStrm_ctrl.recordTelemetryOnly = (gRecStrm_ctrl.prevRecCmdCfg.recTelemetry_only 
                               == VPU_REC_TEL_ONLY)? TRUE: FALSE;
    
       gRecStrm_ctrl.activeChn = (gRecStrm_ctrl.prevSwcCam.camera 
                               == VPU_CAMERA_A)? CHANNEL_A: CHANNEL_B;
    
       
       gRecStrm_ctrl.prevStrmCmdCfg.streaming = VPU_UNKNOWN_START_STOP_STREAM;
       gRecStrm_ctrl.prevStrmCmdCfg.hd_streaming = VPU_UNKNOWN_START_STOP_STREAM;
    
       gRecStrm_ctrl.vpuCfgs.strmCmdCfg.streaming = gRecStrm_ctrl.prevStrmCmdCfg.streaming;
       gRecStrm_ctrl.vpuCfgs.strmCmdCfg.hd_streaming = gRecStrm_ctrl.prevStrmCmdCfg.hd_streaming;
    
       gRecStrm_ctrl.vpuCfgs.recCmdCfg.recording = gRecStrm_ctrl.prevRecCmdCfg.recording;
       gRecStrm_ctrl.vpuCfgs.recCmdCfg.recTelemetry_only = gRecStrm_ctrl.prevRecCmdCfg.recTelemetry_only;
     
       usleep (1000);
       OSA_mutexUnlock (&gRecStrm_ctrl.recStrmMtxHnd);
    }
    
    /*******************************************************************************/
    /*! \fn     Int32 Strm_Rec_Create()
     *
     *   \brief Creates recorder(s) and streamer(s) based on the default/latest configuration
     *       saved by the VPU
     *
     *   \param None
     *
     *   \return RET_ERROR, if error. VPU_OK, if successful
     */
    /*******************************************************************************/
    Int32 Strm_Rec_Create()
    {
       VENC_CALLBACK_S callback;
       Int32 status, errorCode, i;
    
       /* Get IP address of eth0 interface, this is needed for creating RTP/RTSP session */
       if (GetIP ("eth0", (char*)gRecStrm_ctrl.serverIpAddr) < 0)
       {
          printf ("Error in getting IP address of Server \n");
          return RET_ERROR;
       }
    
       OSA_mutexLock (&gRecStrm_ctrl.recStrmMtxHnd);
    
    
       /* If Selected channel is B, then start streaming for camera B */
       if (gRecStrm_ctrl.activeChn == CHANNEL_B)
       {
            gRecStrm_ctrl.chnToStrm = STRM_CHNB;
       }
       /* If Selected channel is A, then start streaming for camera A */
        else if (gRecStrm_ctrl.activeChn == CHANNEL_A)
       {
            gRecStrm_ctrl.chnToStrm = STRM_CHNA;
       }
       else
       {
          printf ("Error : Invalid value of Active channel is received \n");
       }
       
       printf ("====> gRecStrm_ctrl.chnToStrm: %d \n", gRecStrm_ctrl.chnToStrm);
    
       /* If Auto recording is configured */  
       if (gRecStrm_ctrl.autoRecFlag == VPU_REC_AUTO_START)
       {
          /* Get the channel for recording based on selected primary channel */
          if (gRecStrm_ctrl.selectedRec == VPU_REC_SELECTED)
          {
             gRecStrm_ctrl.chnToRec = (gRecStrm_ctrl.activeChn == CHANNEL_B)?
                                              REC_CHNB:REC_CHNA;   
             printf ("Starting with 'Selected Automatic Recording'\n"); 
          }
          /* Both channel recording is enabled */
          else 
          {
             gRecStrm_ctrl.chnToRec = BOTH_RECR;
             gRecStrm_ctrl.fileWriteEnable[RECR_A] = TRUE;
             gRecStrm_ctrl.fileWriteEnable[RECR_B] = TRUE;
             printf ("Starting with 'Simultaneously Automatic Recording' \n"); 
          }
          gRecStrm_ctrl.enableFileWrite = TRUE;
       }
       /* Recording should be started based on user start/stop command */
       else 
       {
          gRecStrm_ctrl.enableFileWrite = FALSE; 
          printf ("Starting with 'No Recording' and recording will be started on user command \n"); 
          /* Check and update channels to be recorded based on selected channel or both channel configuration */
          /* Actual recording will be applicable on this configuration, whenever start command is sent by AP */
          if (gRecStrm_ctrl.selectedRec == VPU_REC_SELECTED)
          {
             gRecStrm_ctrl.chnToRec = (gRecStrm_ctrl.activeChn == CHANNEL_B)?REC_CHNB:REC_CHNA;
          }
          else 
          {
             gRecStrm_ctrl.chnToRec = BOTH_RECR;
          }  
       }
    
    
    
       /* If the recorder module is not initialized yet, initialize it now */
       if (gRecStrm_ctrl.recInitDone == FALSE)
       {
          OSA_mutexUnlock (&gRecStrm_ctrl.recStrmMtxHnd);
          /* Start recorder module, which will start MPEG2TS container */
          status = RecInit ();
          OSA_assert(status==OSA_SOK);
    
          OSA_mutexLock (&gRecStrm_ctrl.recStrmMtxHnd);
    
          gRecStrm_ctrl.recInitDone = TRUE;
       }
    
       printf (" ========> gRecStrm_ctrl.chnToRec : %d \n", gRecStrm_ctrl.chnToRec);
    
       /* Create RTP/RTSP session strings for UAV for cameraA and cameraB, UAVH (optional HD streaming) */
    
       // Name of CameraA and CameraB is kept the same so that they can be switched on the same URL
       sprintf ((char*)((gRecStrm_ctrl.sessionName)[UAV_SESSION]), "%s", SESSION_NAME);
    
       /* HD streaming of camera B*/
       sprintf ((char*)((gRecStrm_ctrl.sessionName)[UAV_HD_SESSION]), "%s%c", SESSION_NAME, 'H');
    
       gRecStrm_ctrl.serverPort = DFT_RTSP_SRV_PORT;
       gRecStrm_ctrl.hdChnStrm = FALSE;
       gRecStrm_ctrl.exitRecStrmTd = FALSE;
       gRecStrm_ctrl.isRecStrmTdStopDone = TRUE; 
    
       printf ("Creating server with IP:%s, Port: %d \n", gRecStrm_ctrl.serverIpAddr, gRecStrm_ctrl.serverPort);
    
       /* Call API to start the RTSP Server */
       gRecStrm_ctrl.serverHandle = RTSPServerStart (gRecStrm_ctrl.serverIpAddr, gRecStrm_ctrl.serverPort, 0, &errorCode);
       if (gRecStrm_ctrl.serverHandle == NULL)
       {
          printf ("ERROR: Can not start the RTSP server \n");
          OSA_mutexUnlock (&gRecStrm_ctrl.recStrmMtxHnd);
          return RET_ERROR;
       }
       errorCode = 0;
       printf ("Server Started with handle '%x' \n", (Int32) gRecStrm_ctrl.serverHandle);
    
       /* If auto streaming is started, then enable streaming */
       if (gRecStrm_ctrl.autoStrmFlag == TRUE)
       {
          gRecStrm_ctrl.enableStreaming = TRUE;
       }
       /* don't start streaming, as automatic streaming is not enabled */
       else
       {
          gRecStrm_ctrl.enableStreaming = FALSE;
       }
    
       /* If SD streaming needs to be started, then create RTSP session */
       if (gRecStrm_ctrl.enableStreaming == TRUE)
       {
          i = UAV_SESSION;
          printf ("Creating session with session name : %s \n", ((char*)(gRecStrm_ctrl.sessionName[i])));
    
          /* Call API to create RTSP session in Server */
          gRecStrm_ctrl.sessionHandle[i] = RTSPSessionCreate (gRecStrm_ctrl.serverHandle,
             ((BYTE *)(gRecStrm_ctrl.sessionName[i])), (BYTE *)NULL, (SERVER_CALLBACK_FUNC)NULL,
             0, 0, USER_NAME, PASSWORD, 0, &errorCode);
          if (gRecStrm_ctrl.sessionHandle[i] == NULL)
          {
             printf ("ERROR: Can not create the RTSP session \n");
             OSA_mutexUnlock (&gRecStrm_ctrl.recStrmMtxHnd);
             return RET_ERROR;
          }
          errorCode = 0;
    
          printf ("Session '%s' created with handle '%x'\n", 
             ((char*)(gRecStrm_ctrl.sessionName)[i]), (Int32)gRecStrm_ctrl.sessionHandle[i]);
    
          gRecStrm_ctrl.streamAdded[i] = FALSE;
          gRecStrm_ctrl.strmStatus[i] = STRM_INIT;
          gRecStrm_ctrl.streamEnable[gRecStrm_ctrl.chnToStrm] = TRUE;
       }
    
       callback.newDataAvailableCb = AIS_Main_Demo_CbFxn;
    
       /* Register callback with encoder */
       Venc_registerCallback(&callback,
                   (Ptr)&gRecStrm_ctrl);
    
       status = OSA_semCreate(&gRecStrm_ctrl.recStrmSem, 1, 0);
       OSA_assert(status==OSA_SOK);
    
       /* Create thread for handling encoder events and sending data 
          to recorder and streamer module */
       status = OSA_thrCreate(
                   &gRecStrm_ctrl.recStrmTdHnd,
                   AIS_Main_Demo_Main,
                   OSA_THR_PRI_DEFAULT,
                   0,
                   &gRecStrm_ctrl
                   );
    
       OSA_assert(status==OSA_SOK);
    
       /* Create thread for monitoring storage space availability */
       status = OSA_thrCreate(
                   &gRecStrm_ctrl.monitorThread,
                   AIS_Monitor_Thread,
                   OSA_THR_PRI_DEFAULT,
                   0,
                   &gRecStrm_ctrl
                   );
    
       OSA_assert(status==OSA_SOK);
    
       OSA_mutexUnlock (&gRecStrm_ctrl.recStrmMtxHnd);
    
       return VPU_OK;
    }
    
    /*******************************************************************************/
    /*! \fn     Int32 Strm_Rec_Delete()
     *
     *   \brief Stops and deletes recorder and streamer instances
    
     *   \param None
     *
     *   \return VPU_OK on success
     */
    /*******************************************************************************/
    Int32 Strm_Rec_Delete()
    {
       Int32 i;
    
        gRecStrm_ctrl.exitRecStrmTd = TRUE;
        OSA_semSignal(&gRecStrm_ctrl.recStrmSem);
    
       /* Wait till the AIS_Main_Demo_Main thread is stopped completely */
       while(!gRecStrm_ctrl.isRecStrmTdStopDone)
        {
          printf ("Waiting for AIS_Main_Demo_Main thread to get stopped \n");
            OSA_waitMsecs(10);
        }
    
       /* If the recorder module is initialized, then stop it */
       if (gRecStrm_ctrl.recInitDone == TRUE)
       {
          for (i = 0; i < MAX_REC_CH; i++)
          {
             /* If recorder instances are in running state */
             if (gRecStrm_ctrl.recStatus[i] != REC_STOPPED)
             {
                gRecStrm_ctrl.spsFound[i] = FALSE;
                gRecStrm_ctrl.fileWriteEnable[i] = FALSE;
                RecStop(&gRecStrm_ctrl.ctrlHandle[i]);
             }
          }
    
          /* Uninitialize recorder */
          RecDeInit();
          gRecStrm_ctrl.recInitDone = FALSE;
       }
    
       /* Uninitialize all the variables and put them in default state */
       for (i = 0; i< NO_OF_SESSIONS; i++)
       {
          gRecStrm_ctrl.strmStatus[i] = STRM_STOPPED;
    
          if (gRecStrm_ctrl.sessionHandle[i] != NULL)
          {
             RTSPSessionDestroy(gRecStrm_ctrl.sessionHandle[i]);
             gRecStrm_ctrl.sessionHandle[i] = NULL;
          }
       }
    
       /* Stop RTSP server */
       if (gRecStrm_ctrl.serverHandle != NULL)
       {
          RTSPServerShutdown(gRecStrm_ctrl.serverHandle); 
          gRecStrm_ctrl.serverHandle = NULL;
       }
    
        OSA_thrDelete(&gRecStrm_ctrl.recStrmTdHnd);
    
        OSA_semDelete(&gRecStrm_ctrl.recStrmSem);
    
        return VPU_OK;
    }
    
    /*******************************************************************************/
    /*! \fn     Void AIS_Main_Demo_start
     *
     *   \brief This is the entry function of the AIS demo. It loads the last good/default
     *       configuration from the file system, loads the necessary configuration into the
     *       global instance of the demo, initializes all the modules accordingly and passes
     *       control to other functions to perform VPU operations based on AP commands.
     *
     *   \param None
     *
     *   \return Void
     */
    /*******************************************************************************/
    Void AIS_Main_Demo_start ()
    {
       VSYS_PARAMS_S vsysParams;
       //VDIS_PARAMS_S vdisParams;
       //UInt16  osdFormat[ALG_LINK_OSD_MAX_CH];
       //Int32 i;
    
       if(0 != access(VPU_SD_MOUNT_PT,F_OK|W_OK))
        {
       printf (VPU_SD_MOUNT_PT " Path does not exist \n");
          system ("mkdir -m 0755 "VPU_SD_MOUNT_PT);
        }
       
       if (0 != access (VPU_EXT_MOUNT_PT,F_OK|W_OK))
       {
       printf (VPU_EXT_MOUNT_PT " Path does not exist \n");
          system ("mkdir -m 0755 "VPU_EXT_MOUNT_PT);
       }
    
    
       /* Initialize all the RDK modules */
        Vsys_params_init(&vsysParams);
        //Vdis_params_init(&vdisParams);
    
       /* initialize the parameters needed for our customized usecase */ 
       gDemo_info.maxVdisChannels = 1;
       gDemo_info.VsysNumChs      = 1;
    
       //vdisParams.numChannels = 1; 
    
       ReadYUVInit();
    
       vsysParams.systemUseCase        = VSYS_USECASE_AIS_MAIN_DEMO;
       vsysParams.enableCapture        = FALSE;
       vsysParams.enableNsf            = FALSE;
       vsysParams.enableEncode         = FALSE;
       vsysParams.enableDecode         = FALSE;
       vsysParams.enableNullSrc        = FALSE;
       vsysParams.numDeis              = 0;
       vsysParams.deinterlaceFlag      = 0;
    
       vsysParams.numSwMs              = 0;
       vsysParams.numDisplays          = 1;
       vsysParams.enableSecondaryOut   = FALSE;
       vsysParams.enableMjpegEnc       = FALSE;
       vsysParams.enableSclr           = FALSE;
       vsysParams.enableOsd            = TRUE;
       vsysParams.enableScd            = FALSE;
       
       vsysParams.enableRunBuff        = FALSE;   
       /* TRUE: enable saving 100 frames on the DSP in shared memory
          FALSE: disable saving 100 frames on the DSP in shared memory */
    
    /*   printf ("--------------- CHANNEL DETAILS-------------\n");
       printf ("Disp Channels => %d\n", vdisParams.numChannels);
       printf ("-------------------------------------------\n");    
    */
       Vsys_init(&vsysParams);
       //Vdis_init(&vdisParams);
    
       /* Configure display */
       //Vsys_configureDisplay();
    
       /* Create Link instances and connect component blocks */
       Vsys_create();
    
       /* Register event handler */
       Vsys_registerEventHandler(Demo_eventHandler, NULL);
    
    
       if(vsysParams.enableRunBuff == TRUE )
       {
          if ( ERROR_NONE != Vsys_runbuf_init(&gRunBufInfo) )
          {
             OSA_printf(" %s:%s: ERROR : init running buffer failed\n",__FILE__,__func__);
          }
          else
          {
             OSA_printf("Enabled: Running buffer feature.\n");
          }
       }
       else
       {
          OSA_printf("Not Enabled: Running buffer feature.\n");
       }
    
    
       //UpdateCameraName((U8)CHANNEL_A);
       
        /* Start components in reverse order */
        //Vdis_start();
    
       //AP_VPU_Switch_Display_Channel((Int32)DISP_CHNA);  
    
       /* Reset statistics */
       //AIS_Main_Demo_resetStats ();
    
       signal(SIGALRM, AIS_Monitor_Framerate);
       alarm(10);
    }
    
    /*******************************************************************************/
    /*! \fn     Void AIS_Main_Demo_stop ()
     *
     *   \brief Stops all the modules and uninitializes respective modules and variables
    
     *   \param None
     *
     *   \return Void
     */
    /*******************************************************************************/
    Void AIS_Main_Demo_stop ()
    {
        VSYS_PARAMS_S contextInf;
        Vsys_getContext(&contextInf);
    
       ReadYUVdeInit();
    
       /* Stop RDK modules */
        Vcap_stop();
        Venc_stop();
    
        Vdis_stop();
    
       if (contextInf.enableOsd == TRUE)
       {
          Demo_osdDeinit(); 
       }
    
       /* Free Shared memory, allocated for running buffer*/
       if(contextInf.enableRunBuff == TRUE )
       {
          if( ERROR_NONE != Vsys_runbuf_deInit(&gRunBufInfo))
          {
             OSA_printf(" %s:%s: ERROR : failed to free running buffer memory\n" ,__FILE__,__func__);
          }
       }
    
       
       /* Uninitialize and stop streamer and recoder module */
       Strm_Rec_Delete (); 
       OSA_mutexDelete (&gRecStrm_ctrl.recStrmMtxHnd);
    
        Vsys_delete();
    
        /* De-initialize components */
        Vcap_exit();
        Venc_exit();
        Vsys_exit();
    
       /* Uninitialize RS232 receiver */
        RS232_DeInit();
    }
    
    
    /*******************************************************************************/
    /*! \fn      I32 GetRecordingSinceTime(U32 *recSince)
     *  
     *   \brief   This provides the elapsed time (in seconds) since recording started.
     *
     *   \param   *recSince - Time in seconds since recording started
     *
     *   \return  I32, VPU_OK on success, otherwise an error code is returned (as in vpu_values.h).
     */ 
    /*******************************************************************************/
    I32 GetRecordingSinceTime(U32 *recSince)
    {
       time_t curTime;
       I32 status = VPU_OK;
       
       if(-1 == time(&curTime))
       {
          perror("Failed to get time of day \n");   
          VPU_ERR_BIT_MASK(status, VPU_REC_START_TIME_GET_FAIL);
       }
    
       *recSince = (U32)curTime - (U32)gRecStrm_ctrl.startTime;
    
       return status;
    }
    
    
    /*******************************************************************************/
    /*! \fn      I32 GetRecordingStatus(U8 *recStatus)
     *  
     *   \brief   This provides the current recording status.
     *
     *   \param   *recStatus - Recording status
     *
     *   \return  I32, VPU_OK on success, otherwise an error code is returned (as in vpu_values.h).
     */ 
    /*******************************************************************************/
    I32 GetRecordingStatus(U8 *recStatus)
    {
       I32 status = VPU_OK;
    
       if(gRecStrm_ctrl.recording_fail)
       {  
          *recStatus = VPU_RECORDING_FAIL;
       }
       else
       {  
          if(TRUE == gRecStrm_ctrl.enableFileWrite)
             *recStatus = VPU_RECORDING_ACTV;
          else
             *recStatus = VPU_RECORDING_NOT_ACTV;
       }
    
       return status;
    }
    
    
    /******************************************************************************
    * Copyright (c) 2013 eInfochips - All rights reserved.
    *
    * This software is authored by eInfochips and is eInfochips' intellectual
    * property, including the copyrights in all countries in the world.
    * This software is provided under a license to use only with all other rights,
    * including ownership rights, being retained by eInfochips.
    *
    * This file may not be distributed, copied, or reproduced in any manner,
    * electronic or otherwise, without the written consent of eInfochips.
    *****************************************************************************/
    

    **********************************************************************
    **************Starting AIS VPU demo application ***************
             Version: 4.0, Date: Mar  6 2014, Time: 11:26:35          
    **********************************************************************
    Start ReadYUVInit()
    Start FramesRdSendFxn()
    End ReadYUVInit()
     0: SYSTEM: System Common Init in progress !!!
     0: SYSTEM: IPC init in progress !!!
     20: SYSTEM: CPU [DSP] syslink proc ID is [0] !!!
     20: SYSTEM: CPU [VIDEO-M3] syslink proc ID is [1] !!!
     21: SYSTEM: CPU [VPSS-M3] syslink proc ID is [2] !!!
     21: SYSTEM: CPU [HOST] syslink proc ID is [3] !!!
     21: SYSTEM: Creating MsgQ Heap [IPC_MSGQ_MSG_HEAP_3] ...
     23: SYSTEM: Creating MsgQ [HOST_MSGQ] ...
     25: SYSTEM: Creating MsgQ [HOST_ACK_MSGQ] ...
     27: SYSTEM: Opening MsgQ [DSP_MSGQ] ...
     28: SYSTEM: Opening MsgQ [VIDEO-M3_MSGQ] ...
     28: SYSTEM: Opening MsgQ [VPSS-M3_MSGQ] ...
     29: SYSTEM: Notify register to [DSP] line 0, event 15 ... 
     30: SYSTEM: Notify register to [VIDEO-M3] line 0, event 15 ... 
     30: SYSTEM: Notify register to [VPSS-M3] line 0, event 15 ... 
     31: SYSTEM: IPC init DONE !!!
     32: SYSTEM: Creating ListMP [HOST_IPC_OUT_24] in region 0 ...
     34: SYSTEM: Creating ListMP [HOST_IPC_IN_24] in region 0 ...
     36: SYSTEM: ListElem Shared Addr = 0x414f0d00
     37: SYSTEM: Creating ListMP [HOST_IPC_OUT_25] in region 0 ...
     39: SYSTEM: Creating ListMP [HOST_IPC_IN_25] in region 0 ...
     41: SYSTEM: ListElem Shared Addr = 0x4150e800
     42: SYSTEM: Creating ListMP [HOST_IPC_OUT_19] in region 0 ...
     44: SYSTEM: Creating ListMP [HOST_IPC_IN_19] in region 0 ...
     46: SYSTEM: ListElem Shared Addr = 0x4152c300
     47: SYSTEM: Creating ListMP [HOST_IPC_OUT_20] in region 0 ...
     49: SYSTEM: Creating ListMP [HOST_IPC_IN_20] in region 0 ...
     51: SYSTEM: ListElem Shared Addr = 0x4154fb80
     52: SYSTEM: Creating ListMP [HOST_IPC_OUT_21] in region 0 ...
     54: SYSTEM: Creating ListMP [HOST_IPC_IN_21] in region 0 ...
     56: SYSTEM: ListElem Shared Addr = 0x41573400
     78: SYSTEM: System Common Init Done !!!
     144: MCFW  : CPU Revision [ES2.1] !!! 
     144: MCFW  : Detected [UNKNOWN] Board !!! 
     144: MCFW  : Base Board Revision [REV A] !!! 
    
     [host]  144: IPC_FRAMES_OUT   : Create in progress !!!
    
     [host]  146: IPC_FRAMES_OUT   : Create Done !!!
     [c6xdsp ]  40819: IPC_FRAMES_IN   : Create in progress !!!
     [c6xdsp ]  40820: SYSTEM: Opening ListMP [HOST_IPC_OUT_19] ...
     [c6xdsp ]  40820: SYSTEM: Opening ListMP [HOST_IPC_IN_19] ...
     [c6xdsp ]  40821: SYSTEM: Opening MsgQ [HOST_MSGQ] ...
    
     [host] Not Enabled: Running buffer feature.
     [c6xdsp ] IPC_FRAMES_IN:HEAPID:0       USED:304
     [c6xdsp ]  40823: IPC_FRAMES_IN   : Create Done !!!
     [c6xdsp ] AlgLink_tskMain()
     [c6xdsp ]  40823: ALG : Create in progress !!!
     [c6xdsp ]  40836: ALG : Create Done !!!
    
    

  • You should post the thrStartSem in your application; I don't see it being posted in your app.

    Refer to VdecVdis_ipcFramesStart in /dvr_rdk/demos/mcfw_api_demos/mcfw_demo/demo_vdec_vdis_frames_send.c
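
    As an illustration only, here is a minimal sketch of posting such a start semaphore in this application, assuming gReadYUVConfig carries a semaphore field named thrStartSem that FramesRdSendFxn pends on before it starts queuing YUV frames (the field name and the ReadYUVStart() helper are hypothetical, not part of the RDK):

    /* Hypothetical helper: assumes gReadYUVConfig has a semaphore member
     * named thrStartSem that FramesRdSendFxn waits on before it begins
     * sending frames to the ipcFramesOut link. */
    Void ReadYUVStart()
    {
       /* Unblock FramesRdSendFxn only after Vsys_create() has finished,
        * i.e. once IPC_FRAMES_OUT on the host and IPC_FRAMES_IN on the
        * DSP are created, so no frames are pushed into a missing link. */
       OSA_semSignal(&gReadYUVConfig.thrStartSem);
    }

    ReadYUVStart() would then be called from AIS_Main_Demo_start() after Vsys_create(), mirroring how the VdecVdis demo posts its start semaphore once the whole chain is up.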