
TDA4VMXEVM: [TIDL] import caffe model to tidl model error

Part Number: TDA4VMXEVM

Dear TI,

I downloaded a public model (AlexNet) from GitHub; as you know, it's a simple and small model, but it fails to import.

There is an unsupported layer (LRN), but there are also some other errors I just don't understand. The log is as below:

Caffe Network File : ../../test/testvecs/models/public/caffe/alexnet/bvlc_alexnet_deploy.prototxt
Caffe Model File : ../../test/testvecs/models/public/caffe/alexnet/bvlc_alexnet_no_group.caffemodel
TIDL Network File : ../../test/testvecs/config/tidl_models/caffe/tidl_net_bvlc_alexnet_no_group.caffemodel.bin
TIDL IO Info File : ../../test/testvecs/config/tidl_models/caffe/tidl_io_bvlc_alexnet_no_group.caffemodel_
Name of the Network : AlexNet
WARNING: detect batch process from input size, overwrite to single frame inference!
ERROR: LRN layer norm2 is not suported now.. By passing
WARNING: !!!!!!! Not supported layer LRN:norm2 is found! Import result is unpredictable!
ERROR: LRN layer norm1 is not suported now.. By passing
WARNING: !!!!!!! Not supported layer LRN:norm1 is found! Import result is unpredictable!
WARNING: Conv Layer conv2's coeff cannot be found(or not match) in coef file, Random coeff will be generated! Only for evaluation usage! Results are all random!
WARNING: Conv Layer conv4's coeff cannot be found(or not match) in coef file, Random coeff will be generated! Only for evaluation usage! Results are all random!
WARNING: Conv Layer conv5's coeff cannot be found(or not match) in coef file, Random coeff will be generated! Only for evaluation usage! Results are all random!
In put of TIDL_InnerProductLayer layer needs to be Faltten. Please add Flatten layer to import this mdoel

I have 2 questions

1. conv2's coefficients can't be found. I added some logging before the error line:

printf("[%s] %#x, bufSize=%d, size=%d\n",
       TIDLPCLayers.name, TIDLPCLayers.weights.ptr,
       TIDLPCLayers.weights.bufSize, dataSize);
if (TIDLPCLayers.weights.ptr == NULL ||
    TIDLPCLayers.weights.bufSize != dataSize)
{
    printf("WARNING: Conv Layer %s's coeff cannot be found(or not match) in coef file, "
           "Random coeff will be generated! "
           "Only for evaluation usage! "
           "Results are all random!\n", TIDLPCLayers.name);

    ...

}

The log after the import looks like this: [conv2] 0xd281010, bufSize=614400, size=307200

So the condition doesn't match. Please tell me where this parameter comes from. Is it from the prototxt? Is there any document I can refer to?

2. As the error log says the input of TIDL_InnerProductLayer needs to be flattened, should I only add a Flatten layer before the InnerProduct layer in the prototxt, and leave the caffemodel unchanged?

Thanks,

Nigel

  • Hi, Nigel,

    The LRN layer is not supported in TIDL, so the AlexNet import is failing.

    Please refer to the link below for 20+ validated public models in TIDL.

    http://software-dl.ti.com/jacinto7/esd/processor-sdk-rtos-jacinto7/latest/exports/docs/tidl_j7_01_01_00_10/ti_dl/docs/user_guide_html/md_tidl_models_info.html

  • Kumar,

    I already know that LRN is an unsupported layer, so I want to add it as a custom layer.

    That is why I asked the 2 questions above.

    Nigel

  • Hi,

    1. Your prototxt does not match your caffemodel, so the import tool cannot find the correct parameters in your caffemodel.

    2. You only need to modify your prototxt: add a Flatten layer before the InnerProduct layer.

    Thanks & Best Regards!

    ZM

  • Hi Ming,

    I have replaced the model with a no-group caffemodel.

    As you suggested, I added a Flatten layer before the InnerProduct layer; the prototxt and import.cfg are attached.

    It seems the .bin is generated successfully, but there are other errors, as below:

    ERROR: LRN layer norm2 is not suported now.. By passing
    WARNING: !!!!!!! Not supported layer LRN:norm2 is found! Import result is unpredictable!
    ERROR: LRN layer norm1 is not suported now.. By passing
    WARNING: !!!!!!! Not supported layer LRN:norm1 is found! Import result is unpredictable!
    Could not find Indata for data ID 2 
    Could not find Indata for data ID 5

    ~~~~~Running TIDL in PC emulation mode to collect Activations range for each layer~~~~~

    Processing config file #0 : /home/work/psdk_rtos_auto_j7_06_01_01_12/tidl_j7_01_00_01_00/ti_dl/utils/tidlModelImport/tempDir/qunat_stats_config.txt 
    ----------------------- TIDL Process with REF_ONLY FLOW------------------------


    OpenCV Error: Assertion failed (0 <= roi.x && 0 <= roi.width && roi.x + roi.width <= m.cols && 0 <= roi.y && 0 <= roi.height && roi.y + roi.height <= m.rows) in Mat, file /home/work/ti_tools/opencv-3.1.0/modules/core/src/matrix.cpp, line 508
    terminate called after throwing an instance of 'cv::Exception'
    what(): /home/work/ti_tools/opencv-3.1.0/modules/core/src/matrix.cpp:508: error: (-215) 0 <= roi.x && 0 <= roi.width && roi.x + roi.width <= m.cols && 0 <= roi.y && 0 <= roi.height && roi.y + roi.height <= m.rows in function Mat

    Aborted (core dumped)

    Is this error caused by LRN not being added as a custom layer? In my opinion, LRN is bypassed, so I think it should run OK.

    Do you have any document that can guide me through these steps? Thanks.

    BR.

  • The TIDL import config is as below; it is modified from mobilenetv2.txt:

    modelType = 0
    numParamBits = 8
    inputNetFile = "../../test/testvecs/models/public/caffe/alexnet/bvlc_alexnet_no_group_deploy.prototxt"
    inputParamsFile = "../../test/testvecs/models/public/caffe/alexnet/bvlc_alexnet_no_group.caffemodel"
    outputNetFile = "../../test/testvecs/config/tidl_models/caffe/alexnet/tidl_net_alexnet.bin"
    outputParamsFile = "../../test/testvecs/config/tidl_models/caffe/alexnet/tidl_io_alexnet_"
    inDataNorm = 1
    inMean = 103.94, 116.78, 123.68
    inScale = 0.017 0.017 0.017
    inDataFormat = 0
    resizeWidth = 224
    resizeHeight = 224
    inNumChannels = 3
    inData = ../../test/testvecs/config/imageNet_sample_val.txt
    postProcType = 1

    The prototxt adds a Flatten layer and is modified as below:

    layer {
      name: "conf_flat"
      type: "Flatten"
      bottom: "pool5"
      top: "conf_flat"
      flatten_param {
        axis: 1
      }
    }

    layer {
      name: "fc6"
      type: "InnerProduct"
      bottom: "conf_flat"
      top: "fc6"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      inner_product_param {
        num_output: 4096
      }
    }

  • As discussed over email, can we close this issue?

    You should remove all unsupported layers before import, or add the unsupported layers as custom layers in the import tool before import.