
About DM8127 ipncV2.8 oneshot mode M2M


Hi all!

    According to Release_Notes_Version2.8.0.pdf, the ISS driver supports one-shot mode.

How can I enable the camera to run in one-shot mode with memory-to-memory IPIPE + resize?

Can anybody give us some advice?

  • Hi Gomo,

    In the IPNC_RDK, if you enable "Dynamic Range Enhancement", that usecase is an example of the ISP running in one-shot mode.

    In this usecase the RAW capture first goes to DDR, and then a DDR->ISP->DDR pass happens in memory-to-memory mode.
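    For orientation, this is roughly what one such memory-to-memory pass boils down to at the driver level. This is only a minimal sketch: the handle and process-list setup are elided, and the names pDrvObj, fvidHandle and processList follow the IspLink code quoted later in this thread.

    /* One one-shot pass: a RAW frame already in DDR is run through IPIPE + resizer
     * and the YUV result is written back to DDR. Setup of the FVID2 handle and of
     * the input/output frame lists in pDrvObj->processList is elided. */
    Int32 status;

    status = FVID2_processFrames(pDrvObj->fvidHandle, &pDrvObj->processList);
    UTILS_assert(status == FVID2_SOK);
    /* Completion handling (callback / processed-frame retrieval) is elided; the output
     * buffer is then handed to the next link just as in the streaming path. */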

    Regards

    Rajat

  • Thank you very much, Rajat Sagar.

    "Dynamic Range Enhancement" can be enabled at WEB->Settings->Camera->Dynamic Range Enhancement.

    But in the IPNC_RDK, enabling or disabling "Dynamic Range Enhancement" has no effect.

    The implementation code is as follows:
    void stream_feature_setup(int nFeature, void *pParm)
    {
        ......
        case STREAM_FEATURE_DYNRANGE:
        {
            // VIDEO_dynRangePrm((DynRangePrm*)pParm);
            break;
        }
        ......
    }

    The VIDEO_dynRangePrm() function never runs, and VIDEO_dynRangePrm() is not implemented in the encode.h file.

    We do not know how to run one-shot mode with memory-to-memory IPIPE + resize.
    Can you give us more advice? And where is the code for the one-shot memory-to-memory mode?
    Thanks a lot!

  • Hi Gomo,

    I was talking about the "Dynamic Range Enhancement" feature selection on the 'Camera' page of the IPNC ActiveX web application.

    Can you please review the usecase in 'ipnc_mcfw\mcfw\src_linux\mcfw_api\usecases\multich_tristream_fullFeature.c'.

    You need to check the implementation of the gUI_mcfw_config.glbceEnable == TRUE scenario. In this scenario the M2M mode of the ISP is used.
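    To illustrate the shape of that check, here is a minimal sketch of the branching only; it is not the actual usecase code, and the link creation and connection details are elided.

    /* Sketch: how the full-feature usecase branches on the GLBCE/DRE setting. */
    if (gUI_mcfw_config.glbceEnable == TRUE)
    {
        /* One-shot path: capture writes RAW to DDR, then the ISP (IPIPE + resizer)
         * runs in memory-to-memory mode on that buffer before encode/display. */
    }
    else
    {
        /* Regular on-the-fly path: capture -> ISP -> encode/display. */
    }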

    Regards

    Rajat

  • Thank you very much, Rajat Sagar.

    Now, in LOW POWER USECASE mode with output H264(1080p)+D1+MJPEG(1080p) and "Dynamic Range Enhancement" enabled,

    the camera can run in M2M mode, and it runs OK.
    But when the camera outputs H264(5M)+D1+MJPEG(5M) with "Dynamic Range Enhancement" enabled,

    the camera cannot run. The printed errors are as follows:

      [m3video] 261837: ENCODE: Create in progress ... !!!
      [m3video] ENCLINK:INFO: !!!Number of output buffers for ch[2] set to [1]
      [m3video] 262019: ENCODE: Creating CH0 of 2560 x 1920, pitch = (2560, 2560) [PROGRESSIVE] [NON-TILED ], bitrate = 2000 Kbps ...
      [m3video] ENCLINK_H264:HEAPID:0 USED:128584
      [m3video] 262169: ENCODE: Creating CH1 of 720 x 480, pitch = (720, 720) [PROGRESSIVE] [NON-TILED ], bitrate = 2000 Kbps ...
      [m3video]
      [m3video] 262263:ERR::linkID:10000024::channelID:1::errorCode:-4::FileName:links_m3video/iva_enc/encLink_h264.c::linuNum:1299::errorCondition:(iresStatus == IRES_OK)
      [m3video] 262264: Assertion @ Line: 907 in links_m3video/iva_enc/encLink_common.c: retVal == ENC_LINK_S_SUCCESS : failed !!!
      ApproDrvInit: 7
      queue id:196612
      func_get_mem.mem_info.addr = 0x85000000
      func_get_mem.mem_info.size = 0x3b00000
      ApproDrvExit: 7
      Error: WaitStreamReady Fail.

    In LOW POWER USECASE mode with output H264(5M)+D1+MJPEG(5M), is "Dynamic Range Enhancement" not supported?

    Thanks a lot!

  • The M2M mode of the ISP will run fine and has been tested with 10M resolution. But the Dynamic Range Enhancement (GLBCE) function will not work for resolutions above 1080p. So if you want to use only the M2M mode, you can remove the GLBCE link and feature, and then the usecase should work fine.
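    One hypothetical way to structure that in the usecase is sketched below. This is illustrative only: encWidth, encHeight and useGlbce are placeholders, and the actual link-create/connect calls in multich_tristream_fullFeature.c are elided.

    /* Hypothetical restructuring: keep the capture -> DDR -> ISP M2M chain,
     * but only insert GLBCE when the stream is 1080p or smaller. */
    if (gUI_mcfw_config.glbceEnable == TRUE)
    {
        Bool useGlbce = (encWidth <= 1920) && (encHeight <= 1080);

        /* create the capture -> DDR -> ISP (IPIPE + resizer) M2M links as before */

        if (useGlbce)
        {
            /* create and connect the GLBCE link here */
        }
        else
        {
            /* connect the ISP output directly to the encoder chain */
        }
    }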

    Regards

    Rajat

  • Thanks for your reply.

     But the document SALDRE_description.doc (ti_tools/iss_02_80_00_00/packages/ti/psp/iss/alg/glbce/docs) describes it like this:
    SAL-DRE performance
    On DM812x IPNC (200 MHz M3_ISS and 400 MHz DDR), SAL-DRE supports up to 1080p resolution at 30 fps. Higher resolutions are supported, but at a lower frame rate. Also, enabling the ISS spatial or temporal noise filter will further decrease the frame rate.

    This means higher resolutions can be supported. In our application we only use 15 fps. So can you tell me the limitation of the IPNC Dynamic Range Enhancement (GLBCE) function?

  • Gomo,

    The upcoming ver. 3.0 (GA) release will have SALDRE tested for hi-megapixel. Some bugs might have crept into the beta release, since hi-megapixel testing was not done for the beta.

    We have it working currently for higher resolutions.

    Regards,

    Rajat

  • Thanks a lot.

    We got hi-megapixel data with M2M, but with GLBCE enabled the image becomes dark and the fps drops (from 15 to 13).

    We will wait for Ver 3.0 during our program development.

    Can you tell me when it is planned to come out?

  • Current plan is to make the release around 11 June.

    Regards

    Rajat

  • Hi all,
    The camera runs in M2M mode and outputs H264(1080p)+D1+MJPEG(1080p).

    We set the RAW data obtained in the IspLink_drvProcessFrames() function to 0,

    but the MJPEG stream is not what we expected.
    The modified code is as follows:
    Int32 IspLink_drvProcessFrames(IspLink_Obj * pObj)
    {
        ......
      #if 1 // added: fill the input RAW frame with a constant before processing
          int   inWidth  = 0;
          int   inHeight = 0;
          int   size     = 0;
          int   val      = 0;                 /* fill value: 0 */
          char *inAddr   = NULL;

          inWidth  = rszCfg.inWidth;          /* rszCfg.inWidth is 1920  */
          inHeight = rszCfg.inHeight;         /* rszCfg.inHeight is 1080 */
          size     = inWidth * inHeight;
          inAddr   = (char *)pDrvObj->processList.inFrameList[0]->frames[0]->addr[0][0];

          memset(inAddr, val, size);
      #endif // end add

        /* Process the frames */
        status = FVID2_processFrames(pDrvObj->fvidHandle, &pDrvObj->processList);
        UTILS_assert(status == FVID2_SOK);
        ......
    }
    The camera's output MJPEG stream looks as follows:

    Half of the screen is dark. We memset a buffer of size 1920*1080 to 0, so why is not all of the screen dark?

  • What is the functionality you are trying to achieve?

    I suspect that the FILL operation is not working on all buffers. Can you please add details on what needs to be done and what changes you have made?

    Thanks,

    Rajat

  • Thanks for your reply.

    During capture we append frame info at the end of the RAW data; before the RAW-to-YUV conversion we read that frame info back from the RAW data,

    and then copy it to the end of the YUV data.

    So we need to know the size of the RAW data in order to locate the frame info.

    When the camera outputs 1080p, is the RAW data size 1920*1080 or 1920*1080*2?

    And we modified the code, as shown before, to check the RAW data size.

    Thanks a lot

  • For RAW the data size is width*height*2.
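    As a worked example for the frame-info use case above (a sketch only: rawAddr stands for the frame's addr[0][0] pointer, and the frame-info layout is your own convention), this also shows why the earlier memset of only width*height bytes cleared just half of the frame:

    /* 12-bit RAW stored in a 16-bit container: 2 bytes per pixel. */
    UInt32 width        = 1920;
    UInt32 height       = 1080;
    UInt32 rawSizeBytes = width * height * 2;       /* 1920 * 1080 * 2 = 4147200 bytes */

    /* Frame info appended after the RAW payload starts here. */
    char  *frameInfo    = (char *)rawAddr + rawSizeBytes;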

    Regards

    Rajat

  • Thanks for your reply.

    Rajat Sagar said:

    For RAW the data size is width*height*2.

    Regards

    Rajat

    Does that mean the captured RAW data format is 16-bit? Is that correct? Is there another reason you use this format for capture?

    If we want to capture RAW8, is it feasible? Where do we need to modify?

    Thanks a lot!

  • We are capturing 12-bit RAW, but the buffer allocation is done for 16 bits.

    You can do RAW8 capture as well; it is feasible. Please go through the ISIF guide in the ISS TRM to understand the required changes.
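    For reference, here is a sketch of how per-pixel access differs between the two packings. rawAddr and pixIdx are placeholders, and whether the 12 valid bits end up right-justified in each 16-bit word depends on the ISIF shift/pack configuration, so confirm that against the TRM.

    /* 12-bit RAW in a 16-bit container: 2 bytes per pixel, 12 valid bits per word. */
    UInt16 *raw16 = (UInt16 *)rawAddr;
    UInt16  pix12 = raw16[pixIdx] & 0x0FFF;   /* assuming right-justified data */

    /* RAW8: 1 byte per pixel, so the buffer and the line offset are half the size. */
    UInt8  *raw8  = (UInt8 *)rawAddr;
    UInt8   pix8  = raw8[pixIdx];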

    Thanks,

    Regards, 
    Rajat 

  • Thanks for your quick answer.

    I'll try to make the changes.

  • Thanks.

    We changed the ISS register values to capture RAW8. The modified code is as follows:

    1.

    ISP_RETURN IssIsifParamsInitPreview(....)
    {
        ......

    #if 0 // raw16
        default_preview_sdram_op_params.dpcm_enable = ISIF_DPCM_ENCODER_ENABLE_OFF;
        #ifdef SENSOR_12BIT
            default_preview_sdram_op_params.sdram_pack_fmt = ISIF_SDRAM_PACK_12BITS;
        #else
            default_preview_sdram_op_params.sdram_pack_fmt = ISIF_SDRAM_PACK_16BITS;
        #endif

        default_preview_sdram_op_params.memory_addr_offset = sizes->ppln * 2;

        #ifdef USE_PARALLEL_VIDEO_PORT
            default_preview_sdram_op_params.ccd_raw_shift_value = ISIF_CCD_DATA_NO_SHIFT;
        #endif

    #else // raw8
        default_preview_sdram_op_params.dpcm_enable = ISIF_DPCM_ENCODER_ENABLE_OFF;
        default_preview_sdram_op_params.sdram_pack_fmt = ISIF_SDRAM_PACK_8BITS;

        default_preview_sdram_op_params.memory_addr_offset = sizes->ppln;
        default_preview_sdram_op_params.ccd_raw_shift_value = ISIF_CCD_DATA_RIGHT_SHIFT_4;
            /* alternatives: ISIF_CCD_DATA_RIGHT_SHIFT_4, ISIF_CCD_DATA_NO_SHIFT */
    #endif

        ......
    }

    2.

    Set ipipeif_reg->ADOFS = 0x50;

    3.

    Set isif_reg->HSIZE = 0x50; (the arithmetic behind the 0x50 line offset is sketched below)
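    A sketch of the line-offset arithmetic behind those values, assuming (as is common for ISIF/IPIPEIF line-offset fields) that they are programmed in units of 32 bytes; please confirm the exact register semantics in the ISS TRM:

    /* Example: a 2560-pixel-wide frame captured as RAW8 (1 byte per pixel). */
    UInt32 widthPix     = 2560;
    UInt32 bytesPerPix  = 1;                        /* RAW8; a 16-bit container would use 2 */
    UInt32 lineBytes    = widthPix * bytesPerPix;   /* 2560 bytes per line */
    UInt32 lineOffset32 = lineBytes / 32;           /* 2560 / 32 = 80 = 0x50 */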

    But the output stream looks as follows:

    In the IspLink, before the RAW-to-YUV conversion, the RAW buffer data size is width*height, but the buffer data is not right.

    The output stream looks the same as before.

    What else should we modify to capture RAW8?

    Can you give us more advice?

    Thank you very much.