AM5728: Caffe-Jacinto JDetNet model trained on a custom dataset runs slower than expected.

Part Number: AM5728

Tool/software:

Hi,

I trained a custom JDetNet model using the training scripts provided in caffe-jacinto on a subset of the FLIR-ADAS dataset, but it runs slower than expected.

Specifically, I am comparing two model imports:

The first is the pretrained model provided under caffe-jacinto-models/trained/object_detection/voc0712/JDetNet/ssd512x512_ds_PSP_dsFac_32_fc_0_hdDS8_1_kerMbox_3_1stHdSameOpCh_1/sparse/voc0712_ssdJacintoNetV2_iter_104000.caffemodel

The second is the model I trained on my custom dataset for 94000 iterations.

The first model takes about 1-3 seconds to execute per frame, but my model takes 5 seconds per frame. This did not change at all over the 94000 sparse iterations: even when I use a model snapshot taken before the sparse training, it still executes in a similar time, so I suspect I must be doing something wrong during import.
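
For context, the per-frame numbers above are simple wall-clock measurements taken around the inference call in my test application, roughly like the sketch below. Note that run_one_frame() is only a self-contained stand-in here (it just sleeps), not the actual TIDL call:

#include <chrono>
#include <cstdio>
#include <thread>

// Stand-in for running one 512x512 frame through the imported network;
// it only sleeps so this sketch compiles and runs on its own.
static void run_one_frame() {
    std::this_thread::sleep_for(std::chrono::milliseconds(100));
}

int main() {
    for (int i = 0; i < 5; ++i) {
        auto t0 = std::chrono::steady_clock::now();
        run_one_frame();
        auto t1 = std::chrono::steady_clock::now();
        double sec = std::chrono::duration<double>(t1 - t0).count();
        std::printf("frame %d: %.2f s\n", i, sec);
    }
    return 0;
}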

I am providing the tidl_model_import.out outputs for both models and my config.txt below.
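
As a reading aid for the two import logs below, the SSD head widths follow directly from the prior-box count and the class count; taking the ctx_output1 lines as an example:

conf channels = priors per location x number of classes (incl. background)
     12 = 4 x 3     (ctx_output1/relu_mbox_conf in one of the logs)
     84 = 4 x 21    (ctx_output1/relu_mbox_conf in the other, i.e. the 20 VOC classes plus background)
loc channels  = priors per location x 4
     16 = 4 x 4     (ctx_output1/relu_mbox_loc in both)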

=============================== TIDL import - parsing ===============================

Caffe Network File : deploy.prototxt  
Caffe Model File   : weights.caffemodel  
TIDL Network File  : ./out/tidl_net_jdetNet_ssd_512x512.bin  
TIDL Model File    : ./out/tidl_param_jdetNet_ssd_512x512.bin  
Name of the Network : ssdJacintoNetV2_deploy 
Num Inputs :               1 

Error in DetectionOutput layer: could not find parameters for detection_out!
 Num of Layer Detected :  57 
  0, TIDL_DataLayer                , data                                      0,  -1 ,  1 ,   x ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  0 ,       0 ,       0 ,       0 ,       0 ,       1 ,       3 ,     512 ,     512 ,         0 ,
  1, TIDL_BatchNormLayer           , data/bias                                 1,   1 ,  1 ,   0 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  1 ,       1 ,       3 ,     512 ,     512 ,       1 ,       3 ,     512 ,     512 ,    786432 ,
  2, TIDL_ConvolutionLayer         , conv1a                                    1,   1 ,  1 ,   1 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  2 ,       1 ,       3 ,     512 ,     512 ,       1 ,      32 ,     256 ,     256 , 157286400 ,
  3, TIDL_ConvolutionLayer         , conv1b                                    1,   1 ,  1 ,   2 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  3 ,       1 ,      32 ,     256 ,     256 ,       1 ,      32 ,     128 ,     128 , 150994944 ,
  4, TIDL_ConvolutionLayer         , res2a_branch2a                            1,   1 ,  1 ,   3 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  4 ,       1 ,      32 ,     128 ,     128 ,       1 ,      64 ,     128 ,     128 , 301989888 ,
  5, TIDL_ConvolutionLayer         , res2a_branch2b                            1,   1 ,  1 ,   4 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  5 ,       1 ,      64 ,     128 ,     128 ,       1 ,      64 ,      64 ,      64 , 150994944 ,
  6, TIDL_ConvolutionLayer         , res3a_branch2a                            1,   1 ,  1 ,   5 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  6 ,       1 ,      64 ,      64 ,      64 ,       1 ,     128 ,      64 ,      64 , 301989888 ,
  7, TIDL_ConvolutionLayer         , res3a_branch2b                            1,   1 ,  1 ,   6 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  7 ,       1 ,     128 ,      64 ,      64 ,       1 ,     128 ,      64 ,      64 , 150994944 ,
  8, TIDL_PoolingLayer             , pool3                                     1,   1 ,  1 ,   7 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  8 ,       1 ,     128 ,      64 ,      64 ,       1 ,     128 ,      32 ,      32 ,    524288 ,
  9, TIDL_ConvolutionLayer         , res4a_branch2a                            1,   1 ,  1 ,   8 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  9 ,       1 ,     128 ,      32 ,      32 ,       1 ,     256 ,      32 ,      32 , 301989888 ,
 10, TIDL_ConvolutionLayer         , res4a_branch2b                            1,   1 ,  1 ,   9 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 10 ,       1 ,     256 ,      32 ,      32 ,       1 ,     256 ,      16 ,      16 , 150994944 ,
 11, TIDL_ConvolutionLayer         , res5a_branch2a                            1,   1 ,  1 ,  10 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 11 ,       1 ,     256 ,      16 ,      16 ,       1 ,     512 ,      16 ,      16 , 301989888 ,
 12, TIDL_ConvolutionLayer         , res5a_branch2b                            1,   1 ,  1 ,  11 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 12 ,       1 ,     512 ,      16 ,      16 ,       1 ,     512 ,      16 ,      16 , 150994944 ,
 13, TIDL_PoolingLayer             , pool6                                     1,   1 ,  1 ,  12 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 13 ,       1 ,     512 ,      16 ,      16 ,       1 ,     512 ,       8 ,       8 ,    131072 ,
 14, TIDL_PoolingLayer             , pool7                                     1,   1 ,  1 ,  13 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 14 ,       1 ,     512 ,       8 ,       8 ,       1 ,     512 ,       4 ,       4 ,     32768 ,
 15, TIDL_PoolingLayer             , pool8                                     1,   1 ,  1 ,  14 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 15 ,       1 ,     512 ,       4 ,       4 ,       1 ,     512 ,       2 ,       2 ,      8192 ,
 16, TIDL_PoolingLayer             , pool9                                     1,   1 ,  1 ,  15 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 16 ,       1 ,     512 ,       2 ,       2 ,       1 ,     512 ,       1 ,       1 ,      2048 ,
 17, TIDL_ConvolutionLayer         , ctx_output1                               1,   1 ,  1 ,   7 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 17 ,       1 ,     128 ,      64 ,      64 ,       1 ,     256 ,      64 ,      64 , 134217728 ,
 18, TIDL_ConvolutionLayer         , ctx_output2                               1,   1 ,  1 ,  12 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 18 ,       1 ,     512 ,      16 ,      16 ,       1 ,     256 ,      16 ,      16 ,  33554432 ,
 19, TIDL_ConvolutionLayer         , ctx_output3                               1,   1 ,  1 ,  13 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 19 ,       1 ,     512 ,       8 ,       8 ,       1 ,     256 ,       8 ,       8 ,   8388608 ,
 20, TIDL_ConvolutionLayer         , ctx_output4                               1,   1 ,  1 ,  14 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 20 ,       1 ,     512 ,       4 ,       4 ,       1 ,     256 ,       4 ,       4 ,   2097152 ,
 21, TIDL_ConvolutionLayer         , ctx_output5                               1,   1 ,  1 ,  15 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 21 ,       1 ,     512 ,       2 ,       2 ,       1 ,     256 ,       2 ,       2 ,    524288 ,
 22, TIDL_ConvolutionLayer         , ctx_output6                               1,   1 ,  1 ,  16 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 22 ,       1 ,     512 ,       1 ,       1 ,       1 ,     256 ,       1 ,       1 ,    131072 ,
 23, TIDL_ConvolutionLayer         , ctx_output1/relu_mbox_loc                 1,   1 ,  1 ,  17 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 23 ,       1 ,     256 ,      64 ,      64 ,       1 ,      16 ,      64 ,      64 , 150994944 ,
 24, TIDL_FlattenLayer             , ctx_output1/relu_mbox_loc_perm            1,   1 ,  1 ,  23 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 24 ,       1 ,      16 ,      64 ,      64 ,       1 ,       1 ,       1 ,   65536 ,         1 ,
 25, TIDL_ConvolutionLayer         , ctx_output1/relu_mbox_conf                1,   1 ,  1 ,  17 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 25 ,       1 ,     256 ,      64 ,      64 ,       1 ,      12 ,      64 ,      64 , 113246208 ,
 26, TIDL_FlattenLayer             , ctx_output1/relu_mbox_conf_perm           1,   1 ,  1 ,  25 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 26 ,       1 ,      12 ,      64 ,      64 ,       1 ,       1 ,       1 ,   49152 ,         1 ,
 28, TIDL_ConvolutionLayer         , ctx_output2/relu_mbox_loc                 1,   1 ,  1 ,  18 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 28 ,       1 ,     256 ,      16 ,      16 ,       1 ,      24 ,      16 ,      16 ,  14155776 ,
 29, TIDL_FlattenLayer             , ctx_output2/relu_mbox_loc_perm            1,   1 ,  1 ,  28 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 29 ,       1 ,      24 ,      16 ,      16 ,       1 ,       1 ,       1 ,    6144 ,         1 ,
 30, TIDL_ConvolutionLayer         , ctx_output2/relu_mbox_conf                1,   1 ,  1 ,  18 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 30 ,       1 ,     256 ,      16 ,      16 ,       1 ,      18 ,      16 ,      16 ,  10616832 ,
 31, TIDL_FlattenLayer             , ctx_output2/relu_mbox_conf_perm           1,   1 ,  1 ,  30 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 31 ,       1 ,      18 ,      16 ,      16 ,       1 ,       1 ,       1 ,    4608 ,         1 ,
 33, TIDL_ConvolutionLayer         , ctx_output3/relu_mbox_loc                 1,   1 ,  1 ,  19 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 33 ,       1 ,     256 ,       8 ,       8 ,       1 ,      24 ,       8 ,       8 ,   3538944 ,
 34, TIDL_FlattenLayer             , ctx_output3/relu_mbox_loc_perm            1,   1 ,  1 ,  33 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 34 ,       1 ,      24 ,       8 ,       8 ,       1 ,       1 ,       1 ,    1536 ,         1 ,
 35, TIDL_ConvolutionLayer         , ctx_output3/relu_mbox_conf                1,   1 ,  1
Processing config file ./tempDir/qunat_stats_config.txt !

Running TIDL simulation for calibration. 

  0, TIDL_DataLayer                ,  0,  -1 ,  1 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  0 ,    0 ,    0 ,    0 ,    0 ,    1 ,    3 ,  512 ,  512 ,
  1, TIDL_BatchNormLayer           ,  1,   1 ,  1 ,  0 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  1 ,    1 ,    3 ,  512 ,  512 ,    1 ,    3 ,  512 ,  512 ,
  2, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  1 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  2 ,    1 ,    3 ,  512 ,  512 ,    1 ,   32 ,  256 ,  256 ,
  3, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  2 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  3 ,    1 ,   32 ,  256 ,  256 ,    1 ,   32 ,  128 ,  128 ,
  4, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  3 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  4 ,    1 ,   32 ,  128 ,  128 ,    1 ,   64 ,  128 ,  128 ,
  5, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  4 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  5 ,    1 ,   64 ,  128 ,  128 ,    1 ,   64 ,   64 ,   64 ,
  6, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  5 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  6 ,    1 ,   64 ,   64 ,   64 ,    1 ,  128 ,   64 ,   64 ,
  7, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  6 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  7 ,    1 ,  128 ,   64 ,   64 ,    1 ,  128 ,   64 ,   64 ,
  8, TIDL_PoolingLayer             ,  1,   1 ,  1 ,  7 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  8 ,    1 ,  128 ,   64 ,   64 ,    1 ,  128 ,   32 ,   32 ,
  9, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  8 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  9 ,    1 ,  128 ,   32 ,   32 ,    1 ,  256 ,   32 ,   32 ,
 10, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  9 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 10 ,    1 ,  256 ,   32 ,   32 ,    1 ,  256 ,   16 ,   16 ,
 11, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 10 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 11 ,    1 ,  256 ,   16 ,   16 ,    1 ,  512 ,   16 ,   16 ,
 12, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 11 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 12 ,    1 ,  512 ,   16 ,   16 ,    1 ,  512 ,   16 ,   16 ,
 13, TIDL_PoolingLayer             ,  1,   1 ,  1 , 12 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 13 ,    1 ,  512 ,   16 ,   16 ,    1 ,  512 ,    8 ,    8 ,
 14, TIDL_PoolingLayer             ,  1,   1 ,  1 , 13 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 14 ,    1 ,  512 ,    8 ,    8 ,    1 ,  512 ,    4 ,    4 ,
 15, TIDL_PoolingLayer             ,  1,   1 ,  1 , 14 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 15 ,    1 ,  512 ,    4 ,    4 ,    1 ,  512 ,    2 ,    2 ,
 16, TIDL_PoolingLayer             ,  1,   1 ,  1 , 15 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 16 ,    1 ,  512 ,    2 ,    2 ,    1 ,  512 ,    1 ,    1 ,
 17, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  7 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 17 ,    1 ,  128 ,   64 ,   64 ,    1 ,  256 ,   64 ,   64 ,
 18, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 12 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 18 ,    1 ,  512 ,   16 ,   16 ,    1 ,  256 ,   16 ,   16 ,
 19, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 13 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 19 ,    1 ,  512 ,    8 ,    8 ,    1 ,  256 ,    8 ,    8 ,
 20, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 14 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 20 ,    1 ,  512 ,    4 ,    4 ,    1 ,  256 ,    4 ,    4 ,
 21, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 15 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 21 ,    1 ,  512 ,    2 ,    2 ,    1 ,  256 ,    2 ,    2 ,
 22, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 16 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 22 ,    1 ,  512 ,    1 ,    1 ,    1 ,  256 ,    1 ,    1 ,
 23, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 17 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 23 ,    1 ,  256 ,   64 ,   64 ,    1 ,   16 ,   64 ,   64 ,
 24, TIDL_FlattenLayer             ,  1,   1 ,  1 , 23 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 24 ,    1 ,   16 ,   64 ,   64 ,    1 ,    1 ,    1 ,65536 ,
 25, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 17 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 25 ,    1 ,  256 ,   64 ,   64 ,    1 ,   12 ,   64 ,   64 ,
 26, TIDL_FlattenLayer             ,  1,   1 ,  1 , 25 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 26 ,    1 ,   12 ,   64 ,   64 ,    1 ,    1 ,    1 ,49152 ,
 27, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 18 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 28 ,    1 ,  256 ,   16 ,   16 ,    1 ,   24 ,   16 ,   16 ,
 28, TIDL_FlattenLayer             ,  1,   1 ,  1 , 28 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 29 ,    1 ,   24 ,   16 ,   16 ,    1 ,    1 ,    1 , 6144 ,
 29, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 18 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 30 ,    1 ,  256 ,   16 ,   16 ,    1 ,   18 ,   16 ,   16 ,
 30, TIDL_FlattenLayer             ,  1,   1 ,  1 , 30 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 31 ,    1 ,   18 ,   16 ,   16 ,    1 ,    1 ,    1 , 4608 ,
 31, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 19 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 33 ,    1 ,  256 ,    8 ,    8 ,    1 ,   24 ,    8 ,    8 ,
 32, TIDL_FlattenLayer             ,  1,   1 ,  1 , 33 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 34 ,    1 ,   24 ,    8 ,    8 ,    1 ,    1 ,    1 , 1536 ,
 33, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 19 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 35 ,    1 ,  256 ,    8 ,    8 ,    1 ,   18 ,    8 ,    8 ,
 34, TIDL_FlattenLayer             ,  1,   1 ,  1 , 35 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 36 ,    1 ,   18 ,    8 ,    8 ,    1 ,    1 ,    1 , 1152 ,
 35, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 20 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 38 ,    1 ,  256 ,    4 ,    4 ,    1 ,   24 ,    4 ,    4 ,
 36, TIDL_FlattenLayer             ,  1,   1 ,  1 , 38 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 39 ,    1 ,   24 ,    4 ,    4 ,    1 ,    1 ,    1 ,  384 ,
 37, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 20 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 40 ,    1 ,  256 ,    4 ,    4 ,    1 ,   18 ,    4 ,    4 ,
 38, TIDL_FlattenLayer             ,  1,   1 ,  1 , 40 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 41 ,    1 ,   18 ,    4 ,    4 ,    1 ,    1 ,    1 ,  288 ,
 39, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 21 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 43 ,    1 ,  256 ,    2 ,    2 ,    1 ,   16 ,    2 ,    2 ,
 40, TIDL_FlattenLayer             ,  1,   1 ,  1 , 43 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 44 ,    1 ,   16 ,    2 ,    2 ,    1 ,    1 ,    1 ,   64 ,
 41, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 21 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 45 ,    1 ,  256 ,    2 ,    2 ,    1 ,   12 ,    2 ,    2 ,
 42, TIDL_FlattenLayer             ,  1,   1 ,  1 , 45 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 46 ,    1 ,   12 ,    2 ,    2 ,    1 ,    1 ,    1 ,   48 ,
 43, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 22 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 48 ,    1 ,  256 ,    1 ,    1 ,    1 ,   16 ,    1 ,    1 ,
 44, TIDL_FlattenLayer             ,  1,   1 ,  1 , 48 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 49 ,    1 ,   16 ,    1 ,    1 ,    1 ,    1 ,    1 ,   16 ,
 45, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 22 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 50 ,    1 ,  256 ,    1 ,    1 ,    1 ,   12 ,    1 ,    1 ,
 46, TIDL_FlattenLayer             ,  1,   1 ,  1 , 50 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 51 ,    1 ,   12 ,    1 ,    1 ,    1 ,    1 ,    1 ,   12 ,
 47, TIDL_ConcatLayer              ,  1,   6 ,  1 , 24 , 29 , 34 , 39 , 44 , 49 ,  x ,  x , 53 ,    1 ,    1 ,    1 ,65536 ,    1 ,    1 ,    1 ,73680 ,
 48, TIDL_ConcatLayer              ,  1,   6 ,  1 , 26 , 31 , 36 , 41 , 46 , 51 ,  x ,  x , 54 ,    1 ,    1 ,    1 ,49152 ,    1 ,    1 ,    1 ,55260 ,
 49, TIDL_DetectionOutputLayer     ,  1,   2 ,  1 , 53 , 54 ,  x ,  x ,  x ,  x ,  x ,  x , 56 ,    1 ,    1 ,    1 ,73680 ,    1 ,    1 ,    1 , 5600 ,
 50, TIDL_DataLayer                ,  0,   1 , -1 , 56 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  0 ,    1 ,    1 ,    1 , 5600 ,    0 ,    0 ,    0 ,    0 ,
Layer ID    ,inBlkWidth  ,inBlkHeight ,inBlkPitch  ,outBlkWidth ,outBlkHeight,outBlkPitch ,numInChs    ,numOutChs   ,numProcInChs,numLclInChs ,numLclOutChs,numProcItrs ,numAccItrs  ,numHorBlock ,numVerBlock ,inBlkChPitch,outBlkChPitc,alignOrNot 
      2           72           72           72           32           32           32            3           32            3            1            8            1            3            8            8         5184         1024            1    
      3           40           34           40           32           32           32            8            8            8            4            8            1            2            8            8         1360         1024            1    
      4           40           34           40           32           32           32           32           64           32            6            8            1            6            4            4         1360         1024            1    
      5           40           34           40           32           32           32           16           16           16            6            8            1            3            4            4         1360         1024            1    
      6           40           34           40           32           32           32           64          128           64            6            8            1           11            2            2         1360         1024            1    
      7           40           34           40           32           32           32           32           32           32            6            8            1            6            2            2         1360         1024            1    
      9           34           10           34           32            8           32          128          256          128           32            8            1            4            1            4          340          256            1    
     10           34           10           34           32            8           32           64           64           64           32            8            1            2            1            4          340          256            1    
     11           18           10           18           16            8           16          256          512          256           16           32            1           16            1            2          180          128            1    
     12           18           10           18           16            8           16          128          128          128           16           32            1            8            1            2          180          128            1    
     17           32           32           32           32           32           32          128          256          128            7            8            1           19            2            2         1024         1024            1    
     18           16            8           16           16            8           16          512          256          512           32           32            1           16            1            2          128          128            1    
     19            8            8            8            8            8            8          512          256          512           32           32            1           16            1            1           64           64            1    
     20            4            4            4            4            4            4          512          256          512           32           32            1           16            1            1           16           16            1    
     21            2            2            2            2            2            2          512          256          512           32           32            1           16            1            1            4            4            1    
     22            1            1            1            1            1            1          512          256          512           32           32            1           16            1            1            1            1            1    
     23           40           18           40           32           16           32          256           16          256            8            8            1           32            2            4          720          512            1    
     25           40           18           40           32           16           32          256           16          256            8            8            1           32            2            4          720          512            1    
     27           18           10           18           16            8           16          256           24          256           16           24            1           16            1            2          180          128            1    
     29           18           10           18           16            8           16          256           18          256           16           18            1           16            1            2          180          128            1    
     31           10           10           10            8            8            8          256           24          256           16           24            1           16            1            1          100           64            1    
     33           10           10           10            8            8            8          256           18          256           16           18            1           16            1            1          100           64            1    
     35            6            6            6            4            4            4          256           24          256           16           24            1           16            1            1           36           16            1    
     37            6            6            6            4            4            4          256           18          256           16           18            1           16            1            1           36           16            1    
     39            4            4            4            2            2            2          256           16          256           16           16            1           16            1            1           16            4            1    
     41            4            4            4            2            2            2          256           12          256           16           12            1           16            1            1           16            4            1    
     43            3            3            3            1            1            1          256           16          256           16           16            1           16            1            1            9            1            1    
     45            3            3            3            1            1            1          256           12          256           16           12            1           16            1            1            9            1            1    

Processing Frame Number : 0 

 Layer    1 : Out Q :      254 , TIDL_BatchNormLayer  , PASSED  #MMACs =     0.79,     0.79, Sparsity :   0.00
 Layer    2 : Out Q :     7443 , TIDL_ConvolutionLayer, PASSED  #MMACs =   157.29,   110.10, Sparsity :  30.00
 Layer    3 : Out Q :     4377 , TIDL_ConvolutionLayer, PASSED  #MMACs =   150.99,    69.21, Sparsity :  54.17
 Layer    4 : Out Q :     7337 , TIDL_ConvolutionLayer, PASSED  #MMACs =   301.99,   131.07, Sparsity :  56.60
 Layer    5 : Out Q :     6603 , TIDL_ConvolutionLayer, PASSED  #MMACs =   150.99,    76.15, Sparsity :  49.57
 Layer    6 : Out Q :    10119 , TIDL_ConvolutionLayer, PASSED  #MMACs =   301.99,   127.98, Sparsity :  57.62
 Layer    7 : Out Q :    10798 , TIDL_ConvolutionLayer, PASSED  #MMACs =   150.99,    67.70, Sparsity :  55.16
 Layer    8 :TIDL_PoolingLayer,     PASSED  #MMACs =     0.13,     0.13, Sparsity :   0.00
 Layer    9 : Out Q :    13875 , TIDL_ConvolutionLayer, PASSED  #MMACs =   301.99,   301.99, Sparsity :   0.00
 Layer   10 : Out Q :    15857 , TIDL_ConvolutionLayer, PASSED  #MMACs =   150.99,   150.99, Sparsity :   0.00
 Layer   11 : Out Q :    16438 , TIDL_ConvolutionLayer, PASSED  #MMACs =   301.99,   301.99, Sparsity :   0.00
 Layer   12 : Out Q :     8158 , TIDL_ConvolutionLayer, PASSED  #MMACs =   150.99,   150.99, Sparsity :   0.00
 Layer   13 :TIDL_PoolingLayer,     PASSED  #MMACs =     0.03,     0.03, Sparsity :   0.00
 Layer   14 :TIDL_PoolingLayer,     PASSED  #MMACs =     0.01,     0.01, Sparsity :   0.00
 Layer   15 :TIDL_PoolingLayer,     PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   16 :TIDL_PoolingLayer,     PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   17 : Out Q :    21009 , TIDL_ConvolutionLayer, PASSED  #MMACs =   134.22,   154.21, Sparsity : -14.89
 Layer   18 : Out Q :    20622 , TIDL_ConvolutionLayer, PASSED  #MMACs =    33.55,    33.55, Sparsity :   0.00
 Layer   19 : Out Q :    11750 , TIDL_ConvolutionLayer, PASSED  #MMACs =     8.39,     8.39, Sparsity :   0.00
 Layer   20 : Out Q :    14449 , TIDL_ConvolutionLayer, PASSED  #MMACs =     2.10,     2.10, Sparsity :   0.00
 Layer   21 : Out Q :    13614 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.52,     0.52, Sparsity :   0.00
 Layer   22 : Out Q :    19473 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.13,     0.13, Sparsity :   0.00
 Layer   23 : Out Q :     2110 , TIDL_ConvolutionLayer, PASSED  #MMACs =   150.99,    92.73, Sparsity :  38.59
 Layer   24 :TIDL_FlattenLayer, PASSED  #MMACs =     0.07,     0.07, Sparsity :   0.00
 Layer   25 : Out Q :     4680 , TIDL_ConvolutionLayer, PASSED  #MMACs =   150.99,   108.30, Sparsity :  28.28
 Layer   26 :TIDL_FlattenLayer, PASSED  #MMACs =     0.05,     0.05, Sparsity :   0.00
 Layer   27 : Out Q :     9099 , TIDL_ConvolutionLayer, PASSED  #MMACs =    14.16,    14.16, Sparsity :   0.00
 Layer   28 :TIDL_FlattenLayer, PASSED  #MMACs =     0.01,     0.01, Sparsity :   0.00
 Layer   29 : Out Q :     6283 , TIDL_ConvolutionLayer, PASSED  #MMACs =    10.62,    10.62, Sparsity :   0.00
 Layer   30 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   31 : Out Q :     8664 , TIDL_ConvolutionLayer, PASSED  #MMACs =     3.54,     3.54, Sparsity :   0.00
 Layer   32 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   33 : Out Q :     4219 , TIDL_ConvolutionLayer, PASSED  #MMACs =     2.65,     2.65, Sparsity :   0.00
 Layer   34 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   35 : Out Q :     7931 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.88,     0.88, Sparsity :   0.00
 Layer   36 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   37 : Out Q :     5814 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.66,     0.66, Sparsity :   0.00
 Layer   38 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   39 : Out Q :     9744 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.15,     0.15, Sparsity :   0.00
 Layer   40 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   41 : Out Q :     7674 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.11,     0.11, Sparsity :   0.00
 Layer   42 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   43 : Out Q :    13062 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.04,     0.04, Sparsity :   0.00
 Layer   44 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   45 : Out Q :     8806 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.03,     0.03, Sparsity :   0.00
 Layer   46 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   47 : Out Q :     2118 , TIDL_ConcatLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :    nan
 Layer   48 : Out Q :     4236 , TIDL_ConcatLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :    nan
 Layer   49 :
Target: number label value xmin  ymin  xmax  ymax
Target:  0.00  2.00  1.00  0.71  0.36  1.01  0.75
Target:  1.00  2.00  0.51  0.39  0.41  0.44  0.46
Target:  2.00  1.00  0.48  0.69  0.28  0.73  0.38
Target:  3.00  1.00  0.43  0.10  0.42  0.12  0.49
Target:  4.00  2.00  0.41  0.36  0.41  0.38  0.44
Target:  5.00  1.00  0.28  0.72  0.40  0.74  0.47
Target:  6.00  2.00  0.27  0.42  0.40  0.44  0.42
Target:  7.00  1.00  0.24  0.74  0.40  0.76  0.45
Target:  8.00  1.00  0.22  0.45  0.38  0.48  0.45
Target:  9.00  2.00  0.22  0.42  0.40  0.43  0.42
Target: 10.00  2.00  0.20  0.37  0.41  0.39  0.44
Target: 11.00  1.00  0.20  0.46  0.38  0.49  0.45
Target: 12.00  2.00  0.20  0.33  0.40  0.35  0.43
Target: 13.00  2.00  0.19  0.16  0.00  0.28  0.10
Target: 14.00  2.00  0.19  0.38  0.66  0.48  0.76
Target: 15.00  2.00  0.19  0.37  0.41  0.39  0.42
Target: 16.00  2.00  0.18  0.34  0.41  0.37  0.44
Target: 17.00  1.00  0.18  0.58  0.41  0.59  0.45
Target: 18.00  2.00  0.17  0.38  0.66  0.45  0.72
Target: 19.00  2.00  0.17  0.38  0.41  0.39  0.42
Target: 20.00  1.00  0.16  0.80  0.52  0.86  0.67
Target: 21.00  2.00  0.16  0.27  0.42  0.34  0.47
Target: 22.00  2.00  0.16  0.31  0.42  0.34  0.45
Target: 23.00  1.00  0.16  0.50  0.39  0.52  0.44
Target: 24.00  1.00  0.16  0.49  0.38  0.51  0.44
Target: 25.00  1.00  0.16  0.58  0.42  0.59  0.45
Target: 26.00  2.00  0.15  0.40  0.40  0.41  0.42
Target: 27.00  2.00  0.15  0.92  0.40  1.01  0.52
Target: 28.00  2.00  0.15  0.66  0.47  0.77  0.57
Target: 29.00  2.00  0.15  0.40  0.66  0.49  0.71
Target: 30.00  2.00  0.15  0.54  0.45  0.62  0.49
Target: 31.00  1.00  0.15  0.54  0.42  0.55  0.45
Target: 32.00  1.00  0.15  0.72  0.28  0.74  0.35
Target: 33.00  1.00  0.15  0.56  0.42  0.57  0.45
Target: 34.00  1.00  0.15  0.81  0.60  0.85  0.72
Target: 35.00  2.00  0.15  0.40  0.41  0.42  0.43
Target: 36.00  2.00  0.15  0.42  0.40  0.43  0.41
Target: 37.00  2.00  0.15  0.40  0.40  0.41  0.41
Target: 38.00  2.00  0.14  0.92  0.62  0.99  0.67
Target: 39.00  2.00  0.14  0.01  0.32  0.07  0.38
Target: 40.00  2.00  0.14  0.57  0.23  0.59  0.24
Target: 41.00  2.00  0.14  0.81  0.37  0.92  0.42
Target: 42.00  1.00  0.14  0.75  0.23  0.80  0.35
Target: 43.00  1.00  0.14  0.52  0.41  0.54  0.45
Target: 44.00  2.00  0.14  0.49  0.43  0.54  0.47
Target: 45.00  2.00  0.14  0.42  0.40  0.43  0.42
Target: 46.00  2.00  0.14  0.05  0.33  0.10  0.38
Target: 47.00  2.00  0.14  0.43  0.40  0.43  0.41
Target: 48.00  2.00  0.14  0.39  0.41  0.41  0.44
Target: 49.00  2.00  0.13  0.79  0.36  1.00  0.45
Target: 50.00  1.00  0.13  0.73  0.40  0.76  0.48
Target: 51.00  2.00  0.13  0.89  0.53  1.00  0.65
Target: 52.00  1.00  0.13  0.72  0.41  0.74  0.44
Target: 53.00  2.00  0.13  0.15  0.00  0.25  0.05
Target: 54.00  2.00  0.13  0.92  0.43  1.00  0.49
Target: 55.00  2.00  0.13 -0.00  0.36  0.10  0.48
Target: 56.00  2.00  0.13  0.73  0.60  0.85  0.71
Target: 57.00  2.00  0.13  0.36  0.41  0.38  0.43
Target: 58.00  1.00  0.13  0.52  0.41  0.53  0.44
Target: 59.00  2.00  0.13  0.01  0.47  0.09  0.51
Target: 60.00  2.00  0.13  0.50  0.47  0.59  0.51
Target: 61.00  2.00  0.13  0.74  0.23  0.82  0.29
Target: 62.00  2.00  0.13  0.63  0.51  0.70  0.56
Target: 63.00  2.00  0.13  0.92  0.39  1.00  0.44
Target: 64.00  2.00  0.13  0.10  0.33  0.17  0.38
Target: 65.00  2.00  0.13  0.84  0.36  0.96  0.41
Target: 66.00  2.00  0.13  0.83  0.37  0.90  0.40
Target: 67.00  2.00  0.13  0.22  0.34  0.29  0.38
Target: 68.00  1.00  0.13  0.58  0.39  0.60  0.44
Target: 69.00  2.00  0.13  0.68  0.45  0.72  0.48
Target: 70.00  1.00  0.13  0.44  0.40  0.46  0.45
Target: 71.00  2.00  0.13  0.39  0.40  0.40  0.42
Target: 72.00  1.00  0.13  0.53  0.42  0.55  0.45
Target: 73.00  2.00  0.13  0.84  0.17  0.93  0.33
Target: 74.00  2.00  0.13  0.89  0.43  1.00  0.57
Target: 75.00  2.00  0.13  0.38  0.66  0.43  0.69
Target: 76.00  2.00  0.13  0.80  0.93  0.76  0.83
Target: 77.00  2.00  0.13  0.40  0.50  0.47  0.54
Target: 78.00  2.00  0.13  0.43  0.67  0.52  0.78
Target: 79.00  2.00  0.13  0.66  0.71  0.78  0.82
Target: 80.00  2.00  0.13  0.53  0.67  0.63  0.81
Target: 81.00  2.00  0.13  0.94  0.09  1.00  0.18
Target: 82.00  1.00  0.13  0.75  0.41  0.76  0.44
Target: 83.00  2.00  0.13  0.45  0.66  0.51  0.74
Target: 84.00  2.00  0.13  0.90  0.60  1.00  0.70
Target: 85.00  2.00  0.13  0.11 -0.00  0.22  0.10
Target: 86.00  2.00  0.13  0.43 -0.00  0.51  0.03
Target: 87.00  2.00  0.13  0.82  0.17  0.90  0.30
Target: 88.00  2.00  0.13  0.37  0.50  0.44  0.53
Target: 89.00  2.00  0.13  0.41  0.48  0.49  0.52
Target: 90.00  2.00  0.13  0.41  0.50  0.48  0.52
Target: 91.00  2.00  0.13  0.93  0.78  0.83  0.83
Target: 92.00  2.00  0.13  0.41  0.65  0.50  0.69
Target: 93.00  2.00  0.13  0.16  0.47  0.24  0.51
Target: 94.00  2.00  0.13  0.20  0.46  0.26  0.49
Target: 95.00  2.00  0.13  0.25  0.43  0.33  0.48
Target: 96.00  2.00  0.13  0.77  0.27  0.86  0.39
Target: 97.00  2.00  0.13  0.85  0.22  1.00  0.34
Target: 98.00  2.00  0.13  0.61  0.66  0.71  0.78
Target: 99.00  2.00  0.13  0.55  0.46  0.68  0.54
Target: 100.00  2.00  0.13  0.62  0.51  0.73  0.62
Target: 101.00  2.00  0.13  0.41  0.50  0.43  0.51
Target: 102.00  2.00  0.13  0.33  0.41  0.35  0.44
Target: 103.00  2.00  0.13  0.92  0.00  1.00  0.12
Target: 104.00  1.00  0.13  0.54  0.41  0.55  0.44
Target: 105.00  2.00  0.13  0.58  0.22  0.60  0.23
Target: 106.00  2.00  0.13  0.19  0.46  0.23  0.49
Target: 107.00  2.00  0.13  0.35  0.41  0.37  0.44
Target: 108.00  2.00  0.13  0.86 -0.00  0.96  0.05
Target: 109.00  2.00  0.13  0.82  0.00  0.89  0.04
Target: 110.00  2.00  0.13  0.87 -0.00  1.00  0.08
Target: 111.00  2.00  0.13  0.72  0.60  0.80  0.66
Target: 112.00  2.00  0.13  0.23  0.64  0.33  0.77
Target: 113.00  2.00  0.13  0.12  0.47  0.21  0.51
Target: 114.00  2.00  0.13  0.89  0.37  1.00  0.42
Target: 115.00  2.00  0.13  0.75  0.21  0.85  0.33
Target: 116.00  2.00  0.13  0.81  0.55  0.91  0.62
Target: 117.00  2.00  0.13  0.63  0.48  0.73  0.54
Target: 118.00  2.00  0.13  0.47  0.48  0.56  0.52
Target: 119.00  2.00  0.13  0.36  0.48  0.43  0.50
Target: 120.00  2.00  0.13  0.16  0.46  0.22  0.49
Target: 121.00  2.00  0.13  0.62  0.31  0.69  0.36
Target: 122.00  1.00  0.13  0.63  0.41  0.64  0.45
Target: 123.00  2.00  0.13  0.80  0.38  0.87  0.42
Target: 124.00  2.00  0.13  0.18  0.10  0.26  0.14
Target: 125.00  1.00  0.13  0.63  0.42  0.65  0.46
Target: 126.00  1.00  0.13  0.57  0.39  0.59  0.44
Target: 127.00  2.00  0.13  0.37  0.69  0.45  0.75
Target: 128.00  2.00  0.13  0.55  0.44  0.62  0.48
Target: 129.00  2.00  0.13  0.79  0.38  0.89  0.44
Target: 130.00  2.00  0.13  0.24  0.42  0.36  0.51
Target: 131.00  1.00  0.13  0.53  0.42  0.54  0.44
Target: 132.00  2.00  0.13 -0.01  0.41  0.04  0.46
Target: 133.00  2.00  0.13  0.61  0.49  0.69  0.54
Target: 134.00  2.00  0.13  0.91  0.56  0.99  0.61
Target: 135.00  2.00  0.13  0.19  0.34  0.30  0.40
Target: 136.00  2.00  0.13  0.00  0.33  0.05  0.37
Target: 137.00  2.00  0.13  0.69  0.53  0.78  0.60
Target: 138.00  2.00  0.13  0.93  0.40  0.99  0.45
Target: 139.00  2.00  0.13  0.72  0.56  0.82  0.67
Target: 140.00  2.00  0.13  0.13  0.33  0.18  0.37
Target: 141.00  2.00  0.13  0.42  0.40  0.43  0.41
Target: 142.00  2.00  0.13  0.95  0.45  1.01  0.50
Target: 143.00  2.00  0.13  0.50  0.44  0.56  0.47
Target: 144.00  2.00  0.13  0.33  0.42  0.36  0.45
Target: 145.00  2.00  0.13  0.72  0.31  0.80  0.36
Target: 146.00  2.00  0.13  0.81  0.64  0.89  0.69
Target: 147.00  2.00  0.13  0.63  0.54  0.72  0.60
Target: 148.00  1.00  0.13  0.55  0.42  0.56  0.45
Target: 149.00  2.00  0.13  0.16  0.35  0.22  0.39
Target: 150.00  2.00  0.13  0.21  0.45  0.25  0.47
Target: 151.00  2.00  0.13  0.26  0.43  0.30  0.46
Target: 152.00  2.00  0.13 -0.01  0.30  0.04  0.34
Target: 153.00  2.00  0.13  0.06  0.32  0.13  0.38
Target: 154.00  2.00  0.12  0.27  0.43  0.31  0.46
Target: 155.00  2.00  0.12  0.41  0.41  0.42  0.42
Target: 156.00  2.00  0.12  0.38  0.40  0.39  0.42
Target: 157.00  2.00  0.12  0.18  0.05  0.27  0.16
Target: 158.00  2.00  0.12  0.38  0.41  0.39  0.43
Target: 159.00  2.00  0.12  0.25  0.42  0.30  0.46
Target: 160.00  2.00  0.12  0.89  0.67  1.00  0.79
Target: 161.00  2.00  0.12  0.00  0.52  0.06  0.60
Target: 162.00  2.00  0.12  0.05  0.96  0.01  0.87
Target: 163.00  2.00  0.12  0.54 -0.00  0.63  0.04
Target: 164.00  2.00  0.12  0.00  0.66  0.12  0.76
Target: 165.00  2.00  0.12 -0.01  0.57  0.06  0.65
Target: 166.00  2.00  0.12  0.83  0.22  0.88  0.30
Target: 167.00  2.00  0.12  0.18  0.37  0.37  0.45
Target: 168.00  2.00  0.12  0.79  0.46  0.90  0.52
Target: 169.00  2.00  0.12  0.78  0.44  0.91  0.54
Target: 170.00  2.00  0.12  0.44  0.50  0.46  0.52
Target: 171.00  2.00  0.12  0.15  0.02  0.23  0.08
Target: 172.00  1.00  0.12  0.75  0.40  0.76  0.42
Target: 173.00  1.00  0.12  0.61  0.41  0.63  0.45
Target: 174.00  1.00  0.12  0.80  0.29  0.83  0.40
Target: 175.00  1.00  0.12  0.59  0.42  0.60  0.45
Target: 176.00  1.00  0.12  0.62  0.42  0.63  0.46
Target: 177.00  1.00  0.11  0.38  0.41  0.39  0.45
Target: 178.00  1.00  0.11  0.74  0.41  0.75  0.43
Target: 179.00  1.00  0.11  0.56  0.41  0.57  0.44
Target: 180.00  1.00  0.11  0.58  0.37  0.60  0.43
Target: 181.00  1.00  0.11  0.73  0.41  0.78  0.50
Target: 182.00  1.00  0.11  0.10  0.42  0.12  0.46
Target: 183.00  1.00  0.11  0.57  0.42  0.58  0.45
Target: 184.00  1.00  0.11  0.68  0.45  0.70  0.48
Target: 185.00  1.00  0.11  0.78  0.06  0.83  0.22
Target: 186.00  1.00  0.11  0.74  0.40  0.76  0.43
Target: 187.00  1.00  0.11  0.73  0.40  0.74  0.42
Target: 188.00  1.00  0.11  0.09  0.43  0.11  0.47
Target: 189.00  1.00  0.11  0.70  0.49  0.75  0.66
Target: 190.00  1.00  0.11  0.80  0.03  0.84  0.19
Target: 191.00  1.00  0.11  0.72  0.43  0.74  0.47
Target: 192.00  1.00  0.11  0.75  0.40  0.76  0.43
Target: 193.00  1.00  0.11  0.81  0.30  0.83  0.40
Target: 194.00  1.00  0.11  0.75  0.40  0.79  0.47
Target: 195.00  1.00  0.11  0.50  0.37  0.52  0.41
Target: 196.00  1.00  0.11  0.65  0.42  0.66  0.46
Target: 197.00  1.00  0.11  0.73  0.26  0.76  0.35
Target: 198.00  1.00  0.11  0.74  0.24  0.78  0.35
Target: 199.00  1.00  0.10  0.72  0.41  0.73  0.44
 #MMACs =     0.01,     0.01, Sparsity :   0.00
End of config list found !
# Default - 0
randParams         = 0 

# 0: Caffe, 1: TensorFlow, Default - 0
modelType          = 0 

# 0: Fixed quantization by training framework, 1: Dynamic quantization by TIDL, Default - 1
quantizationStyle  = 1 

# quantRoundAdd/100 will be added while rounding to integer, Default - 50
quantRoundAdd      = 25

numParamBits       = 8
# 0 : 8bit Unsigned, 1 : 8bit Signed Default - 1
inElementType      = 0 

inputNetFile      = "deploy.prototxt"

inputParamsFile    = "weights.caffemodel"

outputNetFile      = "./out/tidl_net_jdetNet_ssd_512x512.bin"
outputParamsFile   = "./out/tidl_param_jdetNet_ssd_512x512.bin"

rawSampleInData = 0
preProcType   = 4
sampleInData = "./frame.png"
tidlStatsTool = "eve_test_dl_algo_ref.out"
layersGroupId = 0	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	2	0
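
For completeness, the import itself was run by passing this config file to the import tool, i.e. something like ./tidl_model_import.out ./config.txt (the exact paths differ on my setup). My understanding of the layersGroupId line, going by the layer list the tool prints, is that it assigns each layer to a layer group: 0 for the input/output data layers, 1 for the main network, and 2 for the final DetectionOutput layer so that it runs as its own group.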


=============================== TIDL import - parsing ===============================

Caffe Network File : deploy.prototxt  
Caffe Model File   : weights.caffemodel  
TIDL Network File  : ./out/tidl_net_jdetNet_ssd_512x512.bin  
TIDL Model File    : ./out/tidl_param_jdetNet_ssd_512x512.bin  
Name of the Network : ssdJacintoNetV2_deploy 
Num Inputs :               1 

Error in DetectionOutput layer: could not find parameters for detection_out!
 Num of Layer Detected :  57 
  0, TIDL_DataLayer                , data                                      0,  -1 ,  1 ,   x ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  0 ,       0 ,       0 ,       0 ,       0 ,       1 ,       3 ,     512 ,     512 ,         0 ,
  1, TIDL_BatchNormLayer           , data/bias                                 1,   1 ,  1 ,   0 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  1 ,       1 ,       3 ,     512 ,     512 ,       1 ,       3 ,     512 ,     512 ,    786432 ,
  2, TIDL_ConvolutionLayer         , conv1a                                    1,   1 ,  1 ,   1 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  2 ,       1 ,       3 ,     512 ,     512 ,       1 ,      32 ,     256 ,     256 , 157286400 ,
  3, TIDL_ConvolutionLayer         , conv1b                                    1,   1 ,  1 ,   2 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  3 ,       1 ,      32 ,     256 ,     256 ,       1 ,      32 ,     128 ,     128 , 150994944 ,
  4, TIDL_ConvolutionLayer         , res2a_branch2a                            1,   1 ,  1 ,   3 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  4 ,       1 ,      32 ,     128 ,     128 ,       1 ,      64 ,     128 ,     128 , 301989888 ,
  5, TIDL_ConvolutionLayer         , res2a_branch2b                            1,   1 ,  1 ,   4 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  5 ,       1 ,      64 ,     128 ,     128 ,       1 ,      64 ,      64 ,      64 , 150994944 ,
  6, TIDL_ConvolutionLayer         , res3a_branch2a                            1,   1 ,  1 ,   5 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  6 ,       1 ,      64 ,      64 ,      64 ,       1 ,     128 ,      64 ,      64 , 301989888 ,
  7, TIDL_ConvolutionLayer         , res3a_branch2b                            1,   1 ,  1 ,   6 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  7 ,       1 ,     128 ,      64 ,      64 ,       1 ,     128 ,      64 ,      64 , 150994944 ,
  8, TIDL_PoolingLayer             , pool3                                     1,   1 ,  1 ,   7 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  8 ,       1 ,     128 ,      64 ,      64 ,       1 ,     128 ,      32 ,      32 ,    524288 ,
  9, TIDL_ConvolutionLayer         , res4a_branch2a                            1,   1 ,  1 ,   8 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  9 ,       1 ,     128 ,      32 ,      32 ,       1 ,     256 ,      32 ,      32 , 301989888 ,
 10, TIDL_ConvolutionLayer         , res4a_branch2b                            1,   1 ,  1 ,   9 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 10 ,       1 ,     256 ,      32 ,      32 ,       1 ,     256 ,      16 ,      16 , 150994944 ,
 11, TIDL_ConvolutionLayer         , res5a_branch2a                            1,   1 ,  1 ,  10 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 11 ,       1 ,     256 ,      16 ,      16 ,       1 ,     512 ,      16 ,      16 , 301989888 ,
 12, TIDL_ConvolutionLayer         , res5a_branch2b                            1,   1 ,  1 ,  11 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 12 ,       1 ,     512 ,      16 ,      16 ,       1 ,     512 ,      16 ,      16 , 150994944 ,
 13, TIDL_PoolingLayer             , pool6                                     1,   1 ,  1 ,  12 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 13 ,       1 ,     512 ,      16 ,      16 ,       1 ,     512 ,       8 ,       8 ,    131072 ,
 14, TIDL_PoolingLayer             , pool7                                     1,   1 ,  1 ,  13 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 14 ,       1 ,     512 ,       8 ,       8 ,       1 ,     512 ,       4 ,       4 ,     32768 ,
 15, TIDL_PoolingLayer             , pool8                                     1,   1 ,  1 ,  14 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 15 ,       1 ,     512 ,       4 ,       4 ,       1 ,     512 ,       2 ,       2 ,      8192 ,
 16, TIDL_PoolingLayer             , pool9                                     1,   1 ,  1 ,  15 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 16 ,       1 ,     512 ,       2 ,       2 ,       1 ,     512 ,       1 ,       1 ,      2048 ,
 17, TIDL_ConvolutionLayer         , ctx_output1                               1,   1 ,  1 ,   7 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 17 ,       1 ,     128 ,      64 ,      64 ,       1 ,     256 ,      64 ,      64 , 134217728 ,
 18, TIDL_ConvolutionLayer         , ctx_output2                               1,   1 ,  1 ,  12 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 18 ,       1 ,     512 ,      16 ,      16 ,       1 ,     256 ,      16 ,      16 ,  33554432 ,
 19, TIDL_ConvolutionLayer         , ctx_output3                               1,   1 ,  1 ,  13 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 19 ,       1 ,     512 ,       8 ,       8 ,       1 ,     256 ,       8 ,       8 ,   8388608 ,
 20, TIDL_ConvolutionLayer         , ctx_output4                               1,   1 ,  1 ,  14 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 20 ,       1 ,     512 ,       4 ,       4 ,       1 ,     256 ,       4 ,       4 ,   2097152 ,
 21, TIDL_ConvolutionLayer         , ctx_output5                               1,   1 ,  1 ,  15 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 21 ,       1 ,     512 ,       2 ,       2 ,       1 ,     256 ,       2 ,       2 ,    524288 ,
 22, TIDL_ConvolutionLayer         , ctx_output6                               1,   1 ,  1 ,  16 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 22 ,       1 ,     512 ,       1 ,       1 ,       1 ,     256 ,       1 ,       1 ,    131072 ,
 23, TIDL_ConvolutionLayer         , ctx_output1/relu_mbox_loc                 1,   1 ,  1 ,  17 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 23 ,       1 ,     256 ,      64 ,      64 ,       1 ,      16 ,      64 ,      64 , 150994944 ,
 24, TIDL_FlattenLayer             , ctx_output1/relu_mbox_loc_perm            1,   1 ,  1 ,  23 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 24 ,       1 ,      16 ,      64 ,      64 ,       1 ,       1 ,       1 ,   65536 ,         1 ,
 25, TIDL_ConvolutionLayer         , ctx_output1/relu_mbox_conf                1,   1 ,  1 ,  17 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 25 ,       1 ,     256 ,      64 ,      64 ,       1 ,      84 ,      64 ,      64 , 792723456 ,
 26, TIDL_FlattenLayer             , ctx_output1/relu_mbox_conf_perm           1,   1 ,  1 ,  25 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 26 ,       1 ,      84 ,      64 ,      64 ,       1 ,       1 ,       1 ,  344064 ,         1 ,
 28, TIDL_ConvolutionLayer         , ctx_output2/relu_mbox_loc                 1,   1 ,  1 ,  18 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 28 ,       1 ,     256 ,      16 ,      16 ,       1 ,      24 ,      16 ,      16 ,  14155776 ,
 29, TIDL_FlattenLayer             , ctx_output2/relu_mbox_loc_perm            1,   1 ,  1 ,  28 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 29 ,       1 ,      24 ,      16 ,      16 ,       1 ,       1 ,       1 ,    6144 ,         1 ,
 30, TIDL_ConvolutionLayer         , ctx_output2/relu_mbox_conf                1,   1 ,  1 ,  18 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 30 ,       1 ,     256 ,      16 ,      16 ,       1 ,     126 ,      16 ,      16 ,  74317824 ,
 31, TIDL_FlattenLayer             , ctx_output2/relu_mbox_conf_perm           1,   1 ,  1 ,  30 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 31 ,       1 ,     126 ,      16 ,      16 ,       1 ,       1 ,       1 ,   32256 ,         1 ,
 33, TIDL_ConvolutionLayer         , ctx_output3/relu_mbox_loc                 1,   1 ,  1 ,  19 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 33 ,       1 ,     256 ,       8 ,       8 ,       1 ,      24 ,       8 ,       8 ,   3538944 ,
 34, TIDL_FlattenLayer             , ctx_output3/relu_mbox_loc_perm            1,   1 ,  1 ,  33 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 34 ,       1 ,      24 ,       8 ,       8 ,       1 ,       1 ,       1 ,    1536 ,         1 ,
 35, TIDL_ConvolutionLayer         , ctx_output3/relu_mbox_conf                1,   1 ,  1
Processing config file ./tempDir/qunat_stats_config.txt !

Running TIDL simulation for calibration. 

  0, TIDL_DataLayer                ,  0,  -1 ,  1 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  0 ,    0 ,    0 ,    0 ,    0 ,    1 ,    3 ,  512 ,  512 ,
  1, TIDL_BatchNormLayer           ,  1,   1 ,  1 ,  0 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  1 ,    1 ,    3 ,  512 ,  512 ,    1 ,    3 ,  512 ,  512 ,
  2, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  1 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  2 ,    1 ,    3 ,  512 ,  512 ,    1 ,   32 ,  256 ,  256 ,
  3, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  2 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  3 ,    1 ,   32 ,  256 ,  256 ,    1 ,   32 ,  128 ,  128 ,
  4, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  3 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  4 ,    1 ,   32 ,  128 ,  128 ,    1 ,   64 ,  128 ,  128 ,
  5, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  4 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  5 ,    1 ,   64 ,  128 ,  128 ,    1 ,   64 ,   64 ,   64 ,
  6, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  5 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  6 ,    1 ,   64 ,   64 ,   64 ,    1 ,  128 ,   64 ,   64 ,
  7, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  6 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  7 ,    1 ,  128 ,   64 ,   64 ,    1 ,  128 ,   64 ,   64 ,
  8, TIDL_PoolingLayer             ,  1,   1 ,  1 ,  7 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  8 ,    1 ,  128 ,   64 ,   64 ,    1 ,  128 ,   32 ,   32 ,
  9, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  8 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  9 ,    1 ,  128 ,   32 ,   32 ,    1 ,  256 ,   32 ,   32 ,
 10, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  9 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 10 ,    1 ,  256 ,   32 ,   32 ,    1 ,  256 ,   16 ,   16 ,
 11, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 10 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 11 ,    1 ,  256 ,   16 ,   16 ,    1 ,  512 ,   16 ,   16 ,
 12, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 11 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 12 ,    1 ,  512 ,   16 ,   16 ,    1 ,  512 ,   16 ,   16 ,
 13, TIDL_PoolingLayer             ,  1,   1 ,  1 , 12 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 13 ,    1 ,  512 ,   16 ,   16 ,    1 ,  512 ,    8 ,    8 ,
 14, TIDL_PoolingLayer             ,  1,   1 ,  1 , 13 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 14 ,    1 ,  512 ,    8 ,    8 ,    1 ,  512 ,    4 ,    4 ,
 15, TIDL_PoolingLayer             ,  1,   1 ,  1 , 14 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 15 ,    1 ,  512 ,    4 ,    4 ,    1 ,  512 ,    2 ,    2 ,
 16, TIDL_PoolingLayer             ,  1,   1 ,  1 , 15 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 16 ,    1 ,  512 ,    2 ,    2 ,    1 ,  512 ,    1 ,    1 ,
 17, TIDL_ConvolutionLayer         ,  1,   1 ,  1 ,  7 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 17 ,    1 ,  128 ,   64 ,   64 ,    1 ,  256 ,   64 ,   64 ,
 18, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 12 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 18 ,    1 ,  512 ,   16 ,   16 ,    1 ,  256 ,   16 ,   16 ,
 19, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 13 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 19 ,    1 ,  512 ,    8 ,    8 ,    1 ,  256 ,    8 ,    8 ,
 20, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 14 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 20 ,    1 ,  512 ,    4 ,    4 ,    1 ,  256 ,    4 ,    4 ,
 21, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 15 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 21 ,    1 ,  512 ,    2 ,    2 ,    1 ,  256 ,    2 ,    2 ,
 22, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 16 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 22 ,    1 ,  512 ,    1 ,    1 ,    1 ,  256 ,    1 ,    1 ,
 23, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 17 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 23 ,    1 ,  256 ,   64 ,   64 ,    1 ,   16 ,   64 ,   64 ,
 24, TIDL_FlattenLayer             ,  1,   1 ,  1 , 23 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 24 ,    1 ,   16 ,   64 ,   64 ,    1 ,    1 ,    1 ,65536 ,
 25, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 17 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 25 ,    1 ,  256 ,   64 ,   64 ,    1 ,   84 ,   64 ,   64 ,
 26, TIDL_FlattenLayer             ,  1,   1 ,  1 , 25 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 26 ,    1 ,   84 ,   64 ,   64 ,    1 ,    1 ,    1 ,344064 ,
 27, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 18 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 28 ,    1 ,  256 ,   16 ,   16 ,    1 ,   24 ,   16 ,   16 ,
 28, TIDL_FlattenLayer             ,  1,   1 ,  1 , 28 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 29 ,    1 ,   24 ,   16 ,   16 ,    1 ,    1 ,    1 , 6144 ,
 29, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 18 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 30 ,    1 ,  256 ,   16 ,   16 ,    1 ,  126 ,   16 ,   16 ,
 30, TIDL_FlattenLayer             ,  1,   1 ,  1 , 30 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 31 ,    1 ,  126 ,   16 ,   16 ,    1 ,    1 ,    1 ,32256 ,
 31, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 19 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 33 ,    1 ,  256 ,    8 ,    8 ,    1 ,   24 ,    8 ,    8 ,
 32, TIDL_FlattenLayer             ,  1,   1 ,  1 , 33 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 34 ,    1 ,   24 ,    8 ,    8 ,    1 ,    1 ,    1 , 1536 ,
 33, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 19 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 35 ,    1 ,  256 ,    8 ,    8 ,    1 ,  126 ,    8 ,    8 ,
 34, TIDL_FlattenLayer             ,  1,   1 ,  1 , 35 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 36 ,    1 ,  126 ,    8 ,    8 ,    1 ,    1 ,    1 , 8064 ,
 35, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 20 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 38 ,    1 ,  256 ,    4 ,    4 ,    1 ,   24 ,    4 ,    4 ,
 36, TIDL_FlattenLayer             ,  1,   1 ,  1 , 38 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 39 ,    1 ,   24 ,    4 ,    4 ,    1 ,    1 ,    1 ,  384 ,
 37, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 20 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 40 ,    1 ,  256 ,    4 ,    4 ,    1 ,  126 ,    4 ,    4 ,
 38, TIDL_FlattenLayer             ,  1,   1 ,  1 , 40 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 41 ,    1 ,  126 ,    4 ,    4 ,    1 ,    1 ,    1 , 2016 ,
 39, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 21 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 43 ,    1 ,  256 ,    2 ,    2 ,    1 ,   16 ,    2 ,    2 ,
 40, TIDL_FlattenLayer             ,  1,   1 ,  1 , 43 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 44 ,    1 ,   16 ,    2 ,    2 ,    1 ,    1 ,    1 ,   64 ,
 41, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 21 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 45 ,    1 ,  256 ,    2 ,    2 ,    1 ,   84 ,    2 ,    2 ,
 42, TIDL_FlattenLayer             ,  1,   1 ,  1 , 45 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 46 ,    1 ,   84 ,    2 ,    2 ,    1 ,    1 ,    1 ,  336 ,
 43, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 22 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 48 ,    1 ,  256 ,    1 ,    1 ,    1 ,   16 ,    1 ,    1 ,
 44, TIDL_FlattenLayer             ,  1,   1 ,  1 , 48 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 49 ,    1 ,   16 ,    1 ,    1 ,    1 ,    1 ,    1 ,   16 ,
 45, TIDL_ConvolutionLayer         ,  1,   1 ,  1 , 22 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 50 ,    1 ,  256 ,    1 ,    1 ,    1 ,   84 ,    1 ,    1 ,
 46, TIDL_FlattenLayer             ,  1,   1 ,  1 , 50 ,  x ,  x ,  x ,  x ,  x ,  x ,  x , 51 ,    1 ,   84 ,    1 ,    1 ,    1 ,    1 ,    1 ,   84 ,
 47, TIDL_ConcatLayer              ,  1,   6 ,  1 , 24 , 29 , 34 , 39 , 44 , 49 ,  x ,  x , 53 ,    1 ,    1 ,    1 ,65536 ,    1 ,    1 ,    1 ,73680 ,
 48, TIDL_ConcatLayer              ,  1,   6 ,  1 , 26 , 31 , 36 , 41 , 46 , 51 ,  x ,  x , 54 ,    1 ,    1 ,    1 ,344064 ,    1 ,    1 ,    1 ,386820 ,
 49, TIDL_DetectionOutputLayer     ,  1,   2 ,  1 , 53 , 54 ,  x ,  x ,  x ,  x ,  x ,  x , 56 ,    1 ,    1 ,    1 ,73680 ,    1 ,    1 ,    1 , 5600 ,
 50, TIDL_DataLayer                ,  0,   1 , -1 , 56 ,  x ,  x ,  x ,  x ,  x ,  x ,  x ,  0 ,    1 ,    1 ,    1 , 5600 ,    0 ,    0 ,    0 ,    0 ,
Layer ID    ,inBlkWidth  ,inBlkHeight ,inBlkPitch  ,outBlkWidth ,outBlkHeight,outBlkPitch ,numInChs    ,numOutChs   ,numProcInChs,numLclInChs ,numLclOutChs,numProcItrs ,numAccItrs  ,numHorBlock ,numVerBlock ,inBlkChPitch,outBlkChPitc,alignOrNot 
      2           72           72           72           32           32           32            3           32            3            1            8            1            3            8            8         5184         1024            1    
      3           40           34           40           32           32           32            8            8            8            4            8            1            2            8            8         1360         1024            1    
      4           40           34           40           32           32           32           32           64           32            6            8            1            6            4            4         1360         1024            1    
      5           40           34           40           32           32           32           16           16           16            6            8            1            3            4            4         1360         1024            1    
      6           40           34           40           32           32           32           64          128           64            6            8            1           11            2            2         1360         1024            1    
      7           40           34           40           32           32           32           32           32           32            6            8            1            6            2            2         1360         1024            1    
      9           34           10           34           32            8           32          128          256          128           32            8            1            4            1            4          340          256            1    
     10           34           10           34           32            8           32           64           64           64           32            8            1            2            1            4          340          256            1    
     11           18           10           18           16            8           16          256          512          256           16           32            1           16            1            2          180          128            1    
     12           18           10           18           16            8           16          128          128          128           16           32            1            8            1            2          180          128            1    
     17           32           32           32           32           32           32          128          256          128            7            8            1           19            2            2         1024         1024            1    
     18           16            8           16           16            8           16          512          256          512           32           32            1           16            1            2          128          128            1    
     19            8            8            8            8            8            8          512          256          512           32           32            1           16            1            1           64           64            1    
     20            4            4            4            4            4            4          512          256          512           32           32            1           16            1            1           16           16            1    
     21            2            2            2            2            2            2          512          256          512           32           32            1           16            1            1            4            4            1    
     22            1            1            1            1            1            1          512          256          512           32           32            1           16            1            1            1            1            1    
     23           40           18           40           32           16           32          256           16          256            8            8            1           32            2            4          720          512            1    
     25           40           18           40           32           16           32          256           88          256            8            8            1           32            2            4          720          512            1    
     27           18           10           18           16            8           16          256           24          256           16           24            1           16            1            2          180          128            1    
     29           18           10           18           16            8           16          256          128          256           16           32            1           16            1            2          180          128            1    
     31           10           10           10            8            8            8          256           24          256           16           24            1           16            1            1          100           64            1    
     33           10           10           10            8            8            8          256          128          256           16           32            1           16            1            1          100           64            1    
     35            6            6            6            4            4            4          256           24          256           16           24            1           16            1            1           36           16            1    
     37            6            6            6            4            4            4          256          128          256           16           32            1           16            1            1           36           16            1    
     39            4            4            4            2            2            2          256           16          256           16           16            1           16            1            1           16            4            1    
     41            4            4            4            2            2            2          256           96          256           16           32            1           16            1            1           16            4            1    
     43            3            3            3            1            1            1          256           16          256           16           16            1           16            1            1            9            1            1    
     45            3            3            3            1            1            1          256           96          256           16           32            1           16            1            1            9            1            1    

Processing Frame Number : 0 

 Layer    1 : Out Q :      254 , TIDL_BatchNormLayer  , PASSED  #MMACs =     0.79,     0.79, Sparsity :   0.00
 Layer    2 : Out Q :     4549 , TIDL_ConvolutionLayer, PASSED  #MMACs =   157.29,    98.83, Sparsity :  37.17
 Layer    3 : Out Q :     4847 , TIDL_ConvolutionLayer, PASSED  #MMACs =   150.99,    62.39, Sparsity :  58.68
 Layer    4 : Out Q :     9980 , TIDL_ConvolutionLayer, PASSED  #MMACs =   301.99,    92.47, Sparsity :  69.38
 Layer    5 : Out Q :    10715 , TIDL_ConvolutionLayer, PASSED  #MMACs =   150.99,    68.88, Sparsity :  54.38
 Layer    6 : Out Q :    13324 , TIDL_ConvolutionLayer, PASSED  #MMACs =   301.99,    95.81, Sparsity :  68.27
 Layer    7 : Out Q :    13881 , TIDL_ConvolutionLayer, PASSED  #MMACs =   150.99,    55.38, Sparsity :  63.32
 Layer    8 :TIDL_PoolingLayer,     PASSED  #MMACs =     0.13,     0.13, Sparsity :   0.00
 Layer    9 : Out Q :    18874 , TIDL_ConvolutionLayer, PASSED  #MMACs =   301.99,   301.99, Sparsity :   0.00
 Layer   10 : Out Q :    17236 , TIDL_ConvolutionLayer, PASSED  #MMACs =   150.99,   150.99, Sparsity :   0.00
 Layer   11 : Out Q :    22001 , TIDL_ConvolutionLayer, PASSED  #MMACs =   301.99,   301.99, Sparsity :   0.00
 Layer   12 : Out Q :    40120 , TIDL_ConvolutionLayer, PASSED  #MMACs =   150.99,   150.99, Sparsity :   0.00
 Layer   13 :TIDL_PoolingLayer,     PASSED  #MMACs =     0.03,     0.03, Sparsity :   0.00
 Layer   14 :TIDL_PoolingLayer,     PASSED  #MMACs =     0.01,     0.01, Sparsity :   0.00
 Layer   15 :TIDL_PoolingLayer,     PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   16 :TIDL_PoolingLayer,     PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   17 : Out Q :    24360 , TIDL_ConvolutionLayer, PASSED  #MMACs =   134.22,   151.63, Sparsity : -12.98
 Layer   18 : Out Q :    28099 , TIDL_ConvolutionLayer, PASSED  #MMACs =    33.55,    33.55, Sparsity :   0.00
 Layer   19 : Out Q :    15785 , TIDL_ConvolutionLayer, PASSED  #MMACs =     8.39,     8.39, Sparsity :   0.00
 Layer   20 : Out Q :    24262 , TIDL_ConvolutionLayer, PASSED  #MMACs =     2.10,     2.10, Sparsity :   0.00
 Layer   21 : Out Q :    25753 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.52,     0.52, Sparsity :   0.00
 Layer   22 : Out Q :    36535 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.13,     0.13, Sparsity :   0.00
 Layer   23 : Out Q :     3367 , TIDL_ConvolutionLayer, PASSED  #MMACs =   150.99,    91.31, Sparsity :  39.53
 Layer   24 :TIDL_FlattenLayer, PASSED  #MMACs =     0.07,     0.07, Sparsity :   0.00
 Layer   25 : Out Q :     4179 , TIDL_ConvolutionLayer, PASSED  #MMACs =   830.47,   213.45, Sparsity :  74.30
 Layer   26 :TIDL_FlattenLayer, PASSED  #MMACs =     0.34,     0.34, Sparsity :   0.00
 Layer   27 : Out Q :    10075 , TIDL_ConvolutionLayer, PASSED  #MMACs =    14.16,    14.16, Sparsity :   0.00
 Layer   28 :TIDL_FlattenLayer, PASSED  #MMACs =     0.01,     0.01, Sparsity :   0.00
 Layer   29 : Out Q :     3156 , TIDL_ConvolutionLayer, PASSED  #MMACs =    75.50,    75.50, Sparsity :   0.00
 Layer   30 :TIDL_FlattenLayer, PASSED  #MMACs =     0.03,     0.03, Sparsity :   0.00
 Layer   31 : Out Q :    11307 , TIDL_ConvolutionLayer, PASSED  #MMACs =     3.54,     3.54, Sparsity :   0.00
 Layer   32 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   33 : Out Q :     3079 , TIDL_ConvolutionLayer, PASSED  #MMACs =    18.87,    18.87, Sparsity :   0.00
 Layer   34 :TIDL_FlattenLayer, PASSED  #MMACs =     0.01,     0.01, Sparsity :   0.00
 Layer   35 : Out Q :    10912 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.88,     0.88, Sparsity :   0.00
 Layer   36 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   37 : Out Q :     3436 , TIDL_ConvolutionLayer, PASSED  #MMACs =     4.72,     4.72, Sparsity :   0.00
 Layer   38 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   39 : Out Q :    18888 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.15,     0.15, Sparsity :   0.00
 Layer   40 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   41 : Out Q :     4465 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.88,     0.88, Sparsity :   0.00
 Layer   42 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   43 : Out Q :    19092 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.04,     0.04, Sparsity :   0.00
 Layer   44 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   45 : Out Q :     4767 , TIDL_ConvolutionLayer, PASSED  #MMACs =     0.22,     0.22, Sparsity :   0.00
 Layer   46 :TIDL_FlattenLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :   0.00
 Layer   47 : Out Q :     3380 , TIDL_ConcatLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :    nan
 Layer   48 : Out Q :     3091 , TIDL_ConcatLayer, PASSED  #MMACs =     0.00,     0.00, Sparsity :    nan
 Layer   49 :
Target: number label value xmin  ymin  xmax  ymax
Target:  0.00  7.00  0.92  0.71  0.37  1.00  0.73
Target:  1.00  7.00  0.11  0.81  0.36  1.00  0.47
Target:  2.00 15.00  0.10  0.24  0.38  0.28  0.45
Target:  3.00 15.00  0.10  0.35  0.64  0.46  0.78
Target:  4.00 15.00  0.09  0.81  0.59  0.84  0.70
Target:  5.00  7.00  0.09  0.13  0.00  0.25  0.11
Target:  6.00  7.00  0.08  0.82  0.39  0.99  0.55
Target:  7.00  7.00  0.08  0.69  0.41  0.90  0.51
Target:  8.00  7.00  0.08  0.69  0.37  0.90  0.46
Target:  9.00 15.00  0.08  0.70  0.40  0.76  0.48
Target: 10.00 15.00  0.08  0.68  0.42  0.72  0.48
Target: 11.00 15.00  0.08  0.39  0.38  0.43  0.46
Target: 12.00  3.00  0.08  0.36  0.66  0.44  0.74
Target: 13.00  4.00  0.07  0.46  0.01  0.60  0.04
Target: 14.00 15.00  0.07  0.71  0.39  0.75  0.46
Target: 15.00  9.00  0.07  0.72  0.51  0.82  0.65
Target: 16.00 15.00  0.07  0.94  0.28  0.99  0.38
Target: 17.00 15.00  0.07  0.38  0.37  0.44  0.43
Target: 18.00 15.00  0.07  0.25  0.39  0.29  0.45
Target: 19.00  4.00  0.07  0.31  0.02  0.46  0.06
Target: 20.00 15.00  0.07  0.68  0.29  0.73  0.39
Target: 21.00 15.00  0.07  0.67  0.39  0.77  0.49
Target: 22.00 15.00  0.07  0.36  0.67  0.45  0.88
Target: 23.00  7.00  0.07  0.62  0.40  0.79  0.52
Target: 24.00 15.00  0.06  0.75  0.18  0.80  0.29
Target: 25.00 15.00  0.06  0.42  0.39  0.47  0.47
Target: 26.00  9.00  0.06  0.72  0.00  0.98  0.25
Target: 27.00 15.00  0.06  0.85  0.38  0.91  0.47
Target: 28.00 16.00  0.06  0.92  0.00  1.01  0.13
Target: 29.00 15.00  0.06  0.69  0.26  0.74  0.33
Target: 30.00 15.00  0.06  0.44  0.20  0.47  0.28
Target: 31.00  4.00  0.06  0.48 -0.01  0.58  0.03
Target: 32.00  7.00  0.06  0.85  0.41  0.98  0.66
Target: 33.00 15.00  0.06  0.93  0.32  1.00  0.57
Target: 34.00  5.00  0.06  0.86  0.47  0.91  0.59
Target: 35.00  4.00  0.06  0.40  0.00  0.53  0.04
Target: 36.00 15.00  0.06  0.40  0.37  0.45  0.46
Target: 37.00  7.00  0.06  0.19  0.44  0.26  0.50
Target: 38.00 15.00  0.06  0.44  0.39  0.49  0.46
Target: 39.00 15.00  0.06  0.45  0.28  0.50  0.41
Target: 40.00 15.00  0.06  0.17  0.32  0.20  0.45
Target: 41.00 15.00  0.06  0.29  0.40  0.33  0.46
Target: 42.00 15.00  0.06  0.28  0.40  0.31  0.45
Target: 43.00  7.00  0.06  0.18 -0.00  0.28  0.05
Target: 44.00 15.00  0.06  0.95  0.05  1.00  0.18
Target: 45.00 15.00  0.06  0.18  0.01  0.24  0.11
Target: 46.00 15.00  0.06  0.67  0.40  0.73  0.47
Target: 47.00 15.00  0.06  0.70  0.23  0.73  0.28
Target: 48.00  7.00  0.06  0.33  0.40  0.40  0.46
Target: 49.00  4.00  0.06  0.37  0.49  0.48  0.53
Target: 50.00 15.00  0.06  0.70  0.26  0.73  0.30
Target: 51.00  7.00  0.06  0.22  0.43  0.34  0.50
Target: 52.00 15.00  0.06  0.10  0.20  0.97  0.93
Target: 53.00  4.00  0.06 -0.00  0.45  0.12  0.54
Target: 54.00  4.00  0.06  0.00  0.47  0.09  0.53
Target: 55.00  7.00  0.06  0.72  0.37  0.83  0.60
Target: 56.00 15.00  0.06  0.01  0.41  0.71  1.01
Target: 57.00 15.00  0.06  0.87  0.37  0.92  0.46
Target: 58.00 15.00  0.06  0.18  0.14  0.22  0.20
Target: 59.00 15.00  0.06  0.17  0.15  0.20  0.19
Target: 60.00 15.00  0.06  0.74  0.18  0.78  0.27
Target: 61.00 15.00  0.06  0.17  0.14  0.20  0.21
Target: 62.00  7.00  0.06  0.69  0.41  0.77  0.63
Target: 63.00 15.00  0.06  0.72  0.16  0.77  0.24
Target: 64.00 15.00  0.05  0.73  0.39  0.78  0.48
Target: 65.00 15.00  0.05  0.13  0.11  0.17  0.15
Target: 66.00  3.00  0.05  0.33  0.56  0.40  0.61
Target: 67.00  7.00  0.05  0.63  0.36  0.80  0.47
Target: 68.00 15.00  0.05  0.70  0.38  0.74  0.46
Target: 69.00 15.00  0.05  0.44  0.17  0.48  0.28
Target: 70.00 15.00  0.05  0.23  0.00  0.28  0.10
Target: 71.00 15.00  0.05  0.34  0.32  0.39  0.40
Target: 72.00  4.00  0.05  0.36  0.37  0.46  0.46
Target: 73.00 15.00  0.05  0.55  0.40  0.57  0.46
Target: 74.00 15.00  0.05  0.37  0.35  0.40  0.40
Target: 75.00 15.00  0.05  0.30  0.40  0.35  0.46
Target: 76.00 15.00  0.05  0.25  0.08  0.29  0.15
Target: 77.00 15.00  0.05  0.58  0.39  0.63  0.48
Target: 78.00 15.00  0.05  0.91  0.39  0.96  0.46
Target: 79.00 15.00  0.05  0.46  0.22  0.50  0.29
Target: 80.00 15.00  0.05  0.37  0.40  0.40  0.46
Target: 81.00  4.00  0.05  0.03  0.42  0.15  0.51
Target: 82.00 15.00  0.05  0.58  0.35  0.67  0.47
Target: 83.00 15.00  0.05  0.37  0.33  0.47  0.45
Target: 84.00 15.00  0.05  0.68  0.26  0.72  0.32
Target: 85.00  4.00  0.05  0.33  0.40  0.45  0.49
Target: 86.00 15.00  0.05  0.82  0.39  0.92  0.49
Target: 87.00 15.00  0.05  0.68  0.43  0.72  0.52
Target: 88.00 15.00  0.05  0.86  0.06  0.91  0.20
Target: 89.00  9.00  0.05 -0.00  0.37  0.11  0.49
Target: 90.00 15.00  0.05  0.45  0.31  0.54  0.43
Target: 91.00 15.00  0.05  0.30  0.40  0.32  0.44
Target: 92.00 15.00  0.05  0.44  0.32  0.49  0.42
Target: 93.00 15.00  0.05  0.43  0.22  0.45  0.27
Target: 94.00 15.00  0.05  0.43  0.22  0.47  0.34
Target: 95.00  7.00  0.05  0.94  0.39  1.00  0.48
Target: 96.00  7.00  0.05  0.18  0.00  0.29  0.12
Target: 97.00 15.00  0.05  0.31  0.30  0.42  0.42
Target: 98.00  7.00  0.05  0.51  0.36  0.93  0.73
Target: 99.00 16.00  0.05  0.77  0.20  0.92  0.35
Target: 100.00  3.00  0.05  0.32  0.58  0.39  0.63
Target: 101.00 15.00  0.05  0.18  0.15  0.21  0.19
Target: 102.00 15.00  0.05  0.87  0.02  0.92  0.14
Target: 103.00 16.00  0.05  0.94  0.03  0.99  0.11
Target: 104.00 15.00  0.05  0.56  0.39  0.59  0.45
Target: 105.00 15.00  0.05  0.67  0.41  0.70  0.48
Target: 106.00 15.00  0.05  0.89  0.36  0.94  0.46
Target: 107.00 15.00  0.05  0.53  0.40  0.56  0.46
Target: 108.00  4.00  0.05  0.38  0.02  0.51  0.05
Target: 109.00 15.00  0.05  0.91  0.28  1.01  0.40
Target: 110.00 15.00  0.05  0.47  0.31  0.51  0.40
Target: 111.00 15.00  0.05  0.70  0.21  0.73  0.27
Target: 112.00 15.00  0.05  0.56  0.21  0.59  0.25
Target: 113.00 15.00  0.05  0.47  0.22  0.51  0.30
Target: 114.00 15.00  0.05  0.72  0.38  0.81  0.49
Target: 115.00 15.00  0.05  0.85  0.28  0.91  0.38
Target: 116.00 15.00  0.05  0.20  0.37  0.25  0.46
Target: 117.00 15.00  0.05  0.67  0.02  0.72  0.09
Target: 118.00  5.00  0.05  0.83  0.46  0.92  0.59
Target: 119.00 15.00  0.05  0.93  0.39  0.97  0.46
Target: 120.00 15.00  0.05  0.49  0.19  0.53  0.25
Target: 121.00 15.00  0.05  0.70  0.40  0.79  0.52
Target: 122.00 15.00  0.05  0.82  0.60  0.87  0.70
Target: 123.00 15.00  0.05  0.83  0.37  0.97  0.47
Target: 124.00 15.00  0.05  0.43  0.20  0.45  0.25
Target: 125.00 15.00  0.05  0.47  0.40  0.51  0.45
Target: 126.00  7.00  0.05  0.20  0.39  0.29  0.45
Target: 127.00 15.00  0.05  0.31  0.40  0.34  0.44
Target: 128.00 15.00  0.05  0.42  0.19  0.51  0.43
Target: 129.00  7.00  0.05  0.96  0.20  1.01  0.27
Target: 130.00 15.00  0.05  0.63  0.11  0.67  0.18
Target: 131.00 15.00  0.05  0.68  0.27  0.73  0.35
Target: 132.00 15.00  0.05  0.26  0.05  0.31  0.12
Target: 133.00  7.00  0.05  0.74  0.50  0.95  0.68
Target: 134.00 15.00  0.05  0.42  0.68  0.52  0.91
Target: 135.00 19.00  0.05  0.02  0.28  0.38  0.57
Target: 136.00  7.00  0.05  0.10 -0.01  0.23  0.07
Target: 137.00 15.00  0.05  0.77  0.18  0.81  0.27
Target: 138.00 15.00  0.05  0.73  0.13  0.78  0.23
Target: 139.00 15.00  0.05  0.81  0.00  0.86  0.12
Target: 140.00  3.00  0.05  0.30  0.59  0.39  0.64
Target: 141.00  5.00  0.05  0.84  0.48  0.89  0.58
Target: 142.00 15.00  0.05  0.19  0.12  0.24  0.19
Target: 143.00 15.00  0.05  0.11  0.10  0.16  0.14
Target: 144.00  7.00  0.05  0.74  0.51  1.05  0.81
Target: 145.00 15.00  0.05  0.70  0.45  0.75  0.55
Target: 146.00  4.00  0.05  0.30  0.01  0.44  0.04
Target: 147.00 15.00  0.05  0.17  0.14  0.20  0.17
Target: 148.00  3.00  0.05  0.30  0.50  0.37  0.55
Target: 149.00  3.00  0.05  0.30  0.56  0.41  0.65
Target: 150.00 15.00  0.05  0.54  0.05  0.57  0.09
Target: 151.00 15.00  0.05  0.66  0.25  0.71  0.32
Target: 152.00  9.00  0.05  0.60  0.01  0.97  0.39
Target: 153.00 15.00  0.05  0.68  0.47  0.72  0.54
Target: 154.00 15.00  0.05  0.60  0.38  0.66  0.48
Target: 155.00 15.00  0.05  0.13  0.34  0.17  0.46
Target: 156.00  9.00  0.05  0.85  0.01  0.97  0.24
Target: 157.00 15.00  0.05  0.26  0.09  0.32  0.17
Target: 158.00 15.00  0.05  0.56  0.04  0.59  0.09
Target: 159.00 15.00  0.05  0.68  0.23  0.72  0.28
Target: 160.00 15.00  0.05  0.55  0.37  0.65  0.48
Target: 161.00 15.00  0.05  0.96  0.04  1.01  0.13
Target: 162.00 15.00  0.05  0.39  0.36  0.42  0.41
Target: 163.00 15.00  0.05  0.69  0.26  0.71  0.30
Target: 164.00 15.00  0.05  0.18  0.13  0.21  0.17
Target: 165.00 15.00  0.05  0.28  0.04  0.33  0.10
Target: 166.00  7.00  0.05  0.54  0.66  0.74  0.78
Target: 167.00 15.00  0.05  0.45  0.17  0.50  0.26
Target: 168.00 15.00  0.05  0.47  0.19  0.51  0.26
Target: 169.00  4.00  0.05  0.35  0.50  0.45  0.54
Target: 170.00 15.00  0.05  0.37  0.38  0.40  0.42
Target: 171.00 15.00  0.05  0.65  0.40  0.68  0.47
Target: 172.00 15.00  0.05  0.82  0.26  0.92  0.39
Target: 173.00 15.00  0.05  0.69  0.44  0.76  0.50
Target: 174.00 15.00  0.05  0.82  0.38  0.89  0.46
Target: 175.00  4.00  0.05 -0.01  0.50  0.08  0.54
Target: 176.00  7.00  0.05  0.30  0.40  0.37  0.45
Target: 177.00 15.00  0.05  0.33  0.36  0.38  0.41
Target: 178.00 15.00  0.05  0.44  0.21  0.49  0.32
Target: 179.00 15.00  0.05  0.24  0.05  0.28  0.14
Target: 180.00 15.00  0.05  0.08  0.42  0.12  0.49
Target: 181.00 15.00  0.05  0.61  0.13  0.66  0.19
Target: 182.00 15.00  0.05  0.66  0.28  0.71  0.35
Target: 183.00 15.00  0.05  0.42  0.33  0.47  0.42
Target: 184.00 15.00  0.05  0.63  0.10  0.67  0.15
Target: 185.00 15.00  0.05  0.14  0.34  0.24  0.46
Target: 186.00 15.00  0.05  0.66  0.04  0.71  0.09
Target: 187.00  7.00  0.04  0.61  0.36  1.03  0.56
Target: 188.00  7.00  0.04  0.65  0.44  0.75  0.48
Target: 189.00 15.00  0.04  0.30  0.69  0.54  0.97
Target: 190.00  4.00  0.04  0.52 -0.01  0.63  0.03
Target: 191.00  9.00  0.04  0.78  0.69  1.00  0.99
Target: 192.00 15.00  0.04  0.18  0.22  0.20  0.28
Target: 193.00  9.00  0.04  0.80 -0.00  0.91  0.24
Target: 194.00  7.00  0.04  0.90  0.47  1.01  0.68
Target: 195.00  3.00  0.04  0.71  0.80  0.77  0.87
Target: 196.00  3.00  0.04  0.30  0.61  0.38  0.68
Target: 197.00  4.00  0.04  0.53  0.01  0.65  0.04
Target: 198.00  7.00  0.04  0.83 -0.00  0.92  0.06
Target: 199.00  3.00  0.04  0.03  0.25  0.08  0.31
 #MMACs =     0.01,     0.01, Sparsity :   0.00
End of config list found !
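
For reference, below is a quick helper I put together (just a sketch of my own, not part of the TIDL tooling) that totals the per-layer "#MMACs = dense, effective" numbers printed above, so the two imports can be compared on overall compute and sparsity. The file name in it is only a placeholder for wherever the log above was saved.

import re

# Sketch only: sum the per-layer "#MMACs = dense, effective" values from a saved
# tidl_model_import.out log and report the overall sparsity implied by them.
MMAC_RE = re.compile(r"#MMACs\s*=\s*([\d.]+),\s*([\d.]+)")

def summarize(log_path):
    dense_total = effective_total = 0.0
    with open(log_path) as f:                  # placeholder path for the saved import log
        for line in f:
            m = MMAC_RE.search(line)
            if m:
                dense_total += float(m.group(1))
                effective_total += float(m.group(2))
    sparsity = 100.0 * (1.0 - effective_total / dense_total) if dense_total else 0.0
    print(f"dense MMACs: {dense_total:.2f}, effective MMACs: {effective_total:.2f}, "
          f"overall sparsity: {sparsity:.2f}%")

summarize("tidl_model_import.out")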