This thread has been locked.


What is the difference between TIDL quantization and caffe-jacinto quantization?

Other Parts Discussed in Thread: AM5728

Hi,

caffe-jacinto supports quantization in the inference phase, and the TIDL import tool quantizes the model when converting it into *.bin files. I want to know the difference between them.

I have used the TIDL API to do forward propagation. I did not want the output of the last layer (InnerProduct), but the output of the layer before the InnerProduct, which represents the feature of the input.

I am confused that all of those outputs are integers in the range 0-255. But if I do forward propagation with caffe-jacinto on the PC, the output is floating point.

So, do I need to de-quantize the TIDL output back to floating point before I compute the Euclidean distance between inputs?
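As a sketch of the idea (not TI's official procedure): assuming the board dump is raw 8-bit data that should be read as signed, and assuming the scale factor is OutQ/256 as discussed later in this thread, the de-quantization and distance computation might look like this. The values and the OutQ below are made up for illustration.

```python
import numpy as np

# Hypothetical values for illustration only: 'raw' stands in for a TIDL
# feature-map dump read as unsigned bytes, and 'out_q' for the Out Q value
# that the import tool reports for the final layer.
raw = np.array([0, 226, 1, 237, 5], dtype=np.uint8)
out_q = 604

# Step 1: reinterpret the unsigned bytes as signed 8-bit values
# (e.g. 226 becomes -30). Without this step, dividing by the scale
# factor can never produce the negative values seen in the PC float output.
signed = raw.view(np.int8).astype(np.float32)

# Step 2: de-quantize, assuming scale = OutQ / 256 (check this
# interpretation against FAQ 10 in the TIDL user guide).
scale = out_q / 256.0
feat = signed / scale

# Step 3: compare against a PC-side float feature map, for example
# with Euclidean distance.
def euclidean(a, b):
    return float(np.sqrt(np.sum((np.asarray(a) - np.asarray(b)) ** 2)))
```

If the board values really are unsigned (offset-encoded), step 1 would change accordingly; the key point is that both the sign interpretation and the scale must match what the import tool used.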

github.com/.../caffe-jacinto

thanks

best regards

  • Hi,

    Please refer to FAQ 10 in the user guide to understand the output Q format and how to convert the output to floating-point data.
    Also, you can refer to the code under "ENABLE_FLOAT_TRACE" in the tidl_tb.c file (which dumps the output in float).

    Thanks,
    Praveen
  • Hi Praveen,

    thanks for your reply, but I can't find FAQ 10. Could you please give me a link?

    thanks

    best regards

  • Hi,

    There is a user guide (TIDeepLearningLibrary_UserGuide.pdf) in the "ti_dl\docs" folder. Go to section 6 (FAQ) and refer to Q10.

    Thanks,

    Praveen  

  • Hi Praveen,

    I have seen FAQ 10, but I have some questions about it.

    I get a signed float feature map when I run forward propagation on the PC with quantize: true in deploy.prototxt, but an unsigned char feature map on the board with TIDL.

    Even after I divide by the scale factor, the feature map is not converted to signed float values.

    What's more, this is my conversion log. The Out Q of the last layer is 604, so is my scale factor 604/256 = 2.3593?
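As a quick check of that arithmetic (a sketch only; whether OutQ/256 is the right interpretation should be confirmed against FAQ 10 in the user guide):

```python
out_q = 604            # Out Q reported for the InnerProduct layer in the log below
scale = out_q / 256.0  # hypothesized scale factor
# scale is exactly 2.359375, i.e. about 2.3594
```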

    convert_log.txt
    Processing config file .\tempDir\qunat_stats_config.txt !
    0, TIDL_DataLayer , 0, -1 , 1 , x , x , x , x , x , x , x , x , 0 , 0 , 0 , 0 , 0 , 1 , 3 , 112 , 96 ,
    1, TIDL_BatchNormLayer , 1, 1 , 1 , 0 , x , x , x , x , x , x , x , 1 , 1 , 3 , 112 , 96 , 1 , 3 , 112 , 96 ,
    2, TIDL_ConvolutionLayer , 1, 1 , 1 , 1 , x , x , x , x , x , x , x , 2 , 1 , 3 , 112 , 96 , 1 , 32 , 56 , 48 ,
    3, TIDL_ConvolutionLayer , 1, 1 , 1 , 2 , x , x , x , x , x , x , x , 3 , 1 , 32 , 56 , 48 , 1 , 32 , 28 , 24 ,
    4, TIDL_ConvolutionLayer , 1, 1 , 1 , 3 , x , x , x , x , x , x , x , 4 , 1 , 32 , 28 , 24 , 1 , 64 , 28 , 24 ,
    5, TIDL_ConvolutionLayer , 1, 1 , 1 , 4 , x , x , x , x , x , x , x , 5 , 1 , 64 , 28 , 24 , 1 , 64 , 14 , 12 ,
    6, TIDL_ConvolutionLayer , 1, 1 , 1 , 5 , x , x , x , x , x , x , x , 6 , 1 , 64 , 14 , 12 , 1 , 128 , 14 , 12 ,
    7, TIDL_ConvolutionLayer , 1, 1 , 1 , 6 , x , x , x , x , x , x , x , 7 , 1 , 128 , 14 , 12 , 1 , 128 , 7 , 6 ,
    8, TIDL_ConvolutionLayer , 1, 1 , 1 , 7 , x , x , x , x , x , x , x , 8 , 1 , 128 , 7 , 6 , 1 , 256 , 7 , 6 ,
    9, TIDL_ConvolutionLayer , 1, 1 , 1 , 8 , x , x , x , x , x , x , x , 9 , 1 , 256 , 7 , 6 , 1 , 256 , 3 , 3 ,
    10, TIDL_ConvolutionLayer , 1, 1 , 1 , 9 , x , x , x , x , x , x , x , 10 , 1 , 256 , 3 , 3 , 1 , 512 , 3 , 3 ,
    11, TIDL_ConvolutionLayer , 1, 1 , 1 , 10 , x , x , x , x , x , x , x , 11 , 1 , 512 , 3 , 3 , 1 , 512 , 3 , 3 ,
    12, TIDL_PoolingLayer , 1, 1 , 1 , 11 , x , x , x , x , x , x , x , 12 , 1 , 512 , 3 , 3 , 1 , 1 , 1 , 512 ,
    13, TIDL_InnerProductLayer , 1, 1 , 1 , 12 , x , x , x , x , x , x , x , 13 , 1 , 1 , 1 , 512 , 1 , 1 , 1 , 512 ,
    14, TIDL_DataLayer , 0, 1 , -1 , 13 , x , x , x , x , x , x , x , 0 , 1 , 1 , 1 , 512 , 0 , 0 , 0 , 0 ,
    Layer ID ,inBlkWidth ,inBlkHeight ,inBlkPitch ,outBlkWidth ,outBlkHeight,outBlkPitch ,numInChs ,numOutChs ,numProcInChs,numLclInChs ,numLclOutChs,numProcItrs ,numAccItrs ,numHorBlock ,numVerBlock ,inBlkChPitch,outBlkChPitc,alignOrNot
    2 104 36 104 48 14 48 3 32 3 1 8 1 3 1 4 3744 672 1
    3 56 16 56 48 14 48 8 8 8 4 8 1 2 1 4 896 672 1
    4 40 30 40 32 28 32 32 64 32 7 8 1 5 1 1 1200 896 1
    5 40 30 40 32 28 32 16 16 16 7 8 1 3 1 1 1200 896 1
    6 24 16 24 16 14 16 64 128 64 8 8 1 8 1 1 384 224 1
    7 24 16 24 16 14 16 32 32 32 8 8 1 4 1 1 384 224 1
    8 24 9 24 16 7 16 128 256 128 8 8 1 16 1 1 216 112 1
    9 24 8 24 16 6 16 64 64 64 8 8 1 8 1 1 192 96 1
    10 24 5 24 16 3 16 256 512 256 8 8 1 32 1 1 120 48 1
    11 24 5 24 16 3 16 128 128 128 8 8 1 16 1 1 120 48 1
    Processing Frame Number : 0
    Layer 1 : Out Q : 280 , TIDL_BatchNormLayer , PASSED #MMACs = 0.03, 0.03, Sparsity : 0.00
    Layer 2 : Out Q : 44696 , TIDL_ConvolutionLayer, PASSED #MMACs = 6.45, 3.38, Sparsity : 47.67
    Layer 3 : Out Q : 25963 , TIDL_ConvolutionLayer, PASSED #MMACs = 6.19, 2.12, Sparsity : 65.80
    Layer 4 : Out Q : 57240 , TIDL_ConvolutionLayer, PASSED #MMACs = 12.39, 2.83, Sparsity : 77.15
    Layer 5 : Out Q : 62339 , TIDL_ConvolutionLayer, PASSED #MMACs = 6.19, 2.03, Sparsity : 67.27
    Layer 6 : Out Q : 75942 , TIDL_ConvolutionLayer, PASSED #MMACs = 12.39, 2.70, Sparsity : 78.16
    Layer 7 : Out Q : 69657 , TIDL_ConvolutionLayer, PASSED #MMACs = 6.19, 1.59, Sparsity : 74.28
    Layer 8 : Out Q : 84570 , TIDL_ConvolutionLayer, PASSED #MMACs = 12.39, 2.61, Sparsity : 78.94
    Layer 9 : Out Q : 83507 , TIDL_ConvolutionLayer, PASSED #MMACs = 5.31, 1.18, Sparsity : 77.72
    Layer 10 : Out Q : 34892 , TIDL_ConvolutionLayer, PASSED #MMACs = 10.62, 2.15, Sparsity : 79.76
    Layer 11 : Out Q : 1515 , TIDL_ConvolutionLayer, PASSED #MMACs = 5.31, 0.84, Sparsity : 84.15
    Layer 12 : Out Q : 2066 , TIDL_PoolingLayer, PASSED #MMACs = 0.00, 0.00, Sparsity : 0.00
    Layer 13 : Out Q : 604 , TIDL_InnerProductLayer, PASSED #MMACs = 0.00, 0.00, Sparsity : 0.00
    End of config list found !
    Caffe Network File : .\deploy.prototxt
    Caffe Model File : .\sparse_iter_260000.caffemodel
    TIDL Network File : .\tidl_net_imagenet_jacintonet11v2.bin
    TIDL Model File : .\tidl_param_imagenet_jacintonet11v2.bin
    Name of the Network : jacintonet11v2_deploy
    Num Inputs : 1
    Num of Layer Detected : 14
    0, TIDL_DataLayer , data 0, -1 , 1 , x , x , x , x , x , x , x , x , 0 , 0 , 0 , 0 , 0 , 1 , 3 , 112 , 96 , 0 ,
    1, TIDL_BatchNormLayer , data/bias 1, 1 , 1 , 0 , x , x , x , x , x , x , x , 1 , 1 , 3 , 112 , 96 , 1 , 3 , 112 , 96 , 32256 ,
    2, TIDL_ConvolutionLayer , conv1a 1, 1 , 1 , 1 , x , x , x , x , x , x , x , 2 , 1 , 3 , 112 , 96 , 1 , 32 , 56 , 48 , 6451200 ,
    3, TIDL_ConvolutionLayer , conv1b 1, 1 , 1 , 2 , x , x , x , x , x , x , x , 3 , 1 , 32 , 56 , 48 , 1 , 32 , 28 , 24 , 6193152 ,
    4, TIDL_ConvolutionLayer , res2a_branch2a 1, 1 , 1 , 3 , x , x , x , x , x , x , x , 4 , 1 , 32 , 28 , 24 , 1 , 64 , 28 , 24 , 12386304 ,
    5, TIDL_ConvolutionLayer , res2a_branch2b 1, 1 , 1 , 4 , x , x , x , x , x , x , x , 5 , 1 , 64 , 28 , 24 , 1 , 64 , 14 , 12 , 6193152 ,
    6, TIDL_ConvolutionLayer , res3a_branch2a 1, 1 , 1 , 5 , x , x , x , x , x , x , x , 6 , 1 , 64 , 14 , 12 , 1 , 128 , 14 , 12 , 12386304 ,
    7, TIDL_ConvolutionLayer , res3a_branch2b 1, 1 , 1 , 6 , x , x , x , x , x , x , x , 7 , 1 , 128 , 14 , 12 , 1 , 128 , 7 , 6 , 6193152 ,
    8, TIDL_ConvolutionLayer , res4a_branch2a 1, 1 , 1 , 7 , x , x , x , x , x , x , x , 8 , 1 , 128 , 7 , 6 , 1 , 256 , 7 , 6 , 12386304 ,
    9, TIDL_ConvolutionLayer , res4a_branch2b 1, 1 , 1 , 8 , x , x , x , x , x , x , x , 9 , 1 , 256 , 7 , 6 , 1 , 256 , 3 , 3 , 6193152 ,
    10, TIDL_ConvolutionLayer , res5a_branch2a 1, 1 , 1 , 9 , x , x , x , x , x , x , x , 10 , 1 , 256 , 3 , 3 , 1 , 512 , 3 , 3 , 10616832 ,
    11, TIDL_ConvolutionLayer , res5a_branch2b 1, 1 , 1 , 10 , x , x , x , x , x , x , x , 11 , 1 , 512 , 3 , 3 , 1 , 512 , 3 , 3 , 5308416 ,
    12, TIDL_PoolingLayer , pool5 1, 1 , 1 , 11 , x , x , x , x , x , x , x , 12 , 1 , 512 , 3 , 3 , 1 , 1 , 1 , 512 , 4608 ,
    13, TIDL_InnerProductLayer , fc512 1, 1 , 1 , 12 , x , x , x , x , x , x , x , 13 , 1 , 1 , 1 , 512 , 1 , 1 , 1 , 512 , 262144 ,

    And this is my signed float feature map (five pictures), obtained on the PC.

    feature_test_pc_float.txt
    -2.3334 -4.81361 1.60106 -9.30873 -0.493946 -4.75249 5.27968 6.08396 27.6322 -16.4587 -20.2266 1.56878 20.2298 -1.8637 4.7909 3.0148 -6.0638 2.54976 9.82974 -3.44838 -6.18992 -8.50193 -0.683365 -3.36583 -24.4481 7.60339 8.18641 9.81735 15.3494 35.649 5.32655 2.44044 -1.59215 11.2689 25.7513 18.3126 -12.3896 -13.4268 0.10453 -15.7165 -4.47136 18.3486 -3.26482 -36.8464 -0.432096 12.8037 3.39937 -8.10574 31.3493 -6.12645 2.93857 1.79671 21.3715 -2.00693 -25.1029 -3.82327 -11.8207 -8.14864 -16.9127 -3.22039 -20.7511 2.78417 -12.2627 -16.8245 0.444898 25.0366 -1.35278 -20.3338 -6.52115 1.98465 -13.8391 -14.3194 -5.48992 -14.1982 2.60098 -16.152 4.79302 20.5582 -5.2419 1.9436 -5.51969 -6.87787 1.58516 -16.8141 30.4352 -1.57302 -4.04291 23.9071 23.1974 -22.3076 2.97779 -17.3386 -56.8036 -1.66354 8.87112 -13.834 2.80118 17.4985 -4.80526 36.8068 -9.41476 3.15715 10.2094 -0.322741 -0.35931 -8.79215 0.626971 2.5425 5.0345 27.262 3.80363 -9.91583 -6.83387 -13.4228 -2.1281 -20.2668 -6.73206 20.6714 11.4042 9.29669 -3.5501 22.3547 -2.14618 7.99318 -1.97175 -3.33421 -1.76497 -12.6968 -1.63122 -2.39401 2.15151 38.4657 -11.6015 11.119 8.5653 1.63941 -18.3128 -4.18314 11.79 -0.291191 16.4825 3.20393 9.39609 2.20761 9.09408 7.74085 1.78107 -0.933219 15.5265 -1.30823 29.405 9.81641 6.36743 9.4528 -17.0515 20.42 -7.15946 -7.65383 4.63895 14.7602 -14.0502 -4.0508 -3.60522 0.491613 10.6244 -5.8734 -12.9944 4.61909 16.2477 -3.13905 28.1426 -14.9581 13.16 -2.07341 -11.7324 24.8041 -12.9533 29.2236 -6.46296 -5.34815 -13.2047 53.8642 21.269 13.2226 6.55779 -3.47197 -4.26313 -26.351 11.1354 -14.12 -8.80242 -9.54684 -3.89718 24.5648 50.116 -22.2132 -6.82456 6.02304 -17.4577 26.1397 -1.80863 1.98271 -3.38223 -6.82662 5.27251 9.80777 2.81638 -16.7506 -6.85512 -28.7844 9.48499 -2.59856 2.48267 -2.59536 0.582695 -1.10742 -38.071 -5.37845 10.2263 6.45528 3.16281 11.6214 14.1071 1.01091 39.7115 -12.8078 14.0483 33.8953 6.37947 -18.4839 -7.396 11.9408 -0.718152 13.8886 1.0739 -2.41918 8.28947 
1.30642 -0.790641 11.3302 -0.0799343 -1.6119 -6.389 1.19512 -9.4305 -5.60394 5.98932 13.7375 -1.55099 -2.45518 15.3358 -4.231 -10.1379 -3.48792 -11.2466 -2.04221 -12.5989 -6.21459 -6.46298 4.73988 8.6152 -14.1917 -9.7783 2.3839 6.7981 2.25158 -6.59558 -0.689742 -26.4607 15.7008 -5.70325 11.8557 -12.88 14.8754 -0.189722 -21.9042 5.63091 -9.23771 3.95701 0.807939 6.87044 4.8605 -10.7141 12.418 -1.86311 -0.703352 -4.38501 9.99384 -6.62351 -17.9549 1.29476 0.810474 -18.0173 11.1637 12.1175 5.89883 -3.65851 -2.26226 7.06022 2.95869 5.65398 24.076 32.1941 1.22461 9.47546 -0.40749 -3.72973 -3.11252 -12.0489 34.2253 -0.878202 23.5413 15.0571 -5.33715 2.85414 -19.7179 -6.43657 -11.8562 -2.10077 0.0604749 34.0017 -2.42134 3.24926 -2.65501 6.42227 -8.97397 15.6429 22.4047 8.75416 29.2057 -6.64846 19.4148 -18.7468 3.63491 0.828373 10.3141 -1.52132 -0.910444 -20.9111 3.52483 -0.568277 -1.60871 -3.69154 -17.7355 -3.22047 -2.17983 -2.22577 3.31486 1.53489 16.9885 16.8004 -1.0627 -10.6325 6.41251 -5.3592 16.3713 -14.8237 27.6976 7.55428 1.91238 11.4223 2.4167 -6.44436 26.0733 0.667094 8.44119 -1.33141 3.54791 7.72181 -40.7186 -1.62548 -17.5429 6.93805 29.1319 0.113849 -1.0492 21.0603 13.8555 5.46937 -10.0604 -24.2068 -1.90445 4.68412 -27.8554 11.2131 4.4025 -0.819602 -11.861 17.7316 -11.0856 -36.8061 2.12829 11.2128 5.248 15.206 14.4841 -5.60177 1.62917 10.6194 12.4003 2.05164 -7.33833 14.7466 -2.51764 -21.5782 -1.38297 3.88603 3.46104 5.9525 5.01948 7.22153 -1.06942 14.6674 -16.6386 -16.1606 29.0062 16.7019 6.70052 43.9636 -11.743 14.0494 -8.64186 -20.6555 -33.1825 9.25403 -21.0165 9.951 1.32858 -8.51196 -0.547932 -0.870848 15.479 -2.37392 -2.66762 -14.7011 35.5043 -4.79761 27.9751 -10.8951 -26.2556 28.8044 -3.23609 7.07136 -8.35432 -13.0011 -7.0073 19.0487 21.1109 -17.7706 -8.73993 -19.0763 -9.67319 8.9338 16.355 8.2177 -2.17686 31.1172 -12.9435 6.86955 -0.240397 8.89036 -0.918987 10.1819 2.47727 29.0431 17.0378 11.1685 -28.2493 21.5075 4.5157 -33.0988 -16.6525 -19.5829 0.421631 
-3.02243 27.2616 5.66694 23.6034 14.31 -10.5839 4.54701 8.10712 -0.130011 11.0431 7.58234 -7.60033 7.84467 5.7101 2.24755 0.282316 -13.809 -8.39266 0.942529 -4.82941 5.60393 30.2369 14.9752 2.66381 9.03682 4.78893 -19.4477 -7.75029 4.19981 11.1455 -13.0899 2.48952 -0.357618 46.71 26.9566 7.27536 11.9147 -6.9309
    -7.82728 3.35455 -9.50455 6.15 -11.1818 -7.26818 25.7182 1.67727 22.3636 -13.4182 -3.35455 -0 2.79546 4.47273 -2.23636 -4.47273 -2.23636 12.8591 17.8909 -5.59091 -6.15 -8.94546 1.67727 -19.0091 6.70909 1.67727 9.50455 -3.35455 19.5682 10.0636 3.91364 -1.11818 13.9773 -20.1273 -16.7727 11.1818 -5.03182 -6.15 2.23636 -10.0636 1.11818 8.94546 2.23636 5.59091 -30.75 -16.7727 5.59091 5.03182 -23.4818 2.79546 5.59091 0.559091 9.50455 -11.1818 29.0727 -8.38637 19.0091 16.2136 -8.94546 3.35455 -0.559091 17.3318 -17.3318 0.559091 15.0955 -15.0955 10.6227 -13.4182 -3.35455 -27.9546 -10.6227 15.0955 -4.47273 3.35455 -7.82728 -6.70909 -2.23636 -2.23636 -3.91364 8.94546 -2.79546 -41.3727 8.38637 -16.2136 7.82728 -10.0636 0.559091 -16.2136 -19.5682 -30.1909 -0.559091 -38.5773 29.6318 7.26818 0 -12.3 -8.94546 -1.67727 -27.9546 -46.9637 -0.559091 1.67727 1.11818 6.70909 -8.94546 4.47273 -3.91364 6.70909 12.3 24.0409 -5.59091 -17.8909 -19.5682 2.23636 1.11818 24.6 0.559091 1.67727 2.79546 -3.91364 -7.26818 30.75 -1.11818 5.59091 8.94546 -20.6864 4.47273 17.3318 -1.67727 5.59091 -6.15 30.75 -48.6409 10.6227 -6.15 5.59091 -32.4273 -8.94546 -0.559091 6.15 -21.2455 -6.15 12.8591 1.11818 -10.6227 -7.26818 -15.6546 12.3 7.82728 1.11818 -19.5682 6.15 -19.0091 -19.5682 -33.5455 -26.2773 -12.3 16.7727 2.23636 23.4818 -13.9773 20.1273 1.11818 -2.79546 -2.23636 11.1818 6.15 -3.91364 3.91364 1.67727 -22.3636 12.8591 -8.94546 -5.03182 21.8046 5.59091 -1.67727 7.82728 -19.5682 7.26818 -27.3955 5.59091 -30.75 17.8909 -22.3636 -2.79546 -10.6227 -5.03182 6.15 -25.7182 32.9864 -3.91364 11.7409 21.8046 44.1682 1.67727 16.2136 12.8591 13.9773 69.3273 -2.79546 -12.8591 -1.67727 2.79546 0 -11.7409 6.70909 -5.03182 6.15 -15.6546 10.6227 1.11818 2.23636 -13.9773 0.559091 -11.7409 -6.15 22.3636 -12.3 6.70909 1.11818 -1.11818 0 2.79546 9.50455 -34.1046 -1.67727 7.82728 -5.03182 -21.2455 9.50455 12.8591 -4.47273 20.6864 -13.4182 3.35455 23.4818 -6.70909 -25.7182 -11.7409 17.3318 -17.8909 5.03182 -0 
-12.8591 16.2136 13.9773 3.35455 -3.91364 3.35455 8.38637 -0.559091 16.2136 -0 -4.47273 -5.03182 8.38637 5.59091 -16.2136 -10.6227 3.35455 -8.94546 -33.5455 6.15 16.2136 10.6227 -11.1818 3.35455 -7.26818 -15.6546 24.6 36.9 6.15 -19.5682 3.91364 21.2455 36.9 15.6546 5.59091 -2.79546 -1.11818 35.2227 -36.3409 10.0636 -2.23636 19.0091 12.3 -2.23636 -27.9546 10.6227 8.94546 21.2455 15.0955 9.50455 8.38637 -22.3636 -10.0636 6.15 19.0091 10.0636 9.50455 28.5136 1.11818 8.94546 -7.82728 1.11818 21.8046 2.23636 3.91364 6.70909 0.559091 -13.4182 11.1818 -7.82728 1.11818 -28.5136 5.59091 3.91364 -2.79546 -0 -11.7409 6.70909 3.35455 2.79546 -1.67727 -6.70909 20.1273 17.3318 10.0636 27.3955 -6.70909 -16.2136 20.6864 -20.6864 -1.67727 -12.3 -4.47273 -2.23636 -3.35455 -5.59091 19.0091 -5.03182 -13.4182 5.59091 -3.35455 5.03182 1.11818 0.559091 -11.7409 12.8591 9.50455 -0.559091 -43.6091 1.67727 8.38637 9.50455 7.82728 -26.2773 -10.0636 2.23636 13.9773 14.5364 -4.47273 -7.82728 -9.50455 -1.67727 -2.79546 -1.11818 3.91364 -0.559091 40.8136 -15.6546 -15.0955 8.38637 15.6546 1.67727 -8.94546 24.6 6.15 -19.0091 -10.6227 2.79546 -0 -12.3 1.11818 -0 -8.38637 26.2773 -6.70909 33.5455 -24.6 -1.67727 3.91364 22.9227 10.6227 -27.9546 -4.47273 -5.03182 -8.94546 2.79546 -11.1818 5.59091 16.2136 11.1818 13.9773 -3.91364 -10.0636 -4.47273 -2.79546 19.0091 -0.559091 -4.47273 30.1909 0 -15.0955 17.3318 17.3318 7.82728 3.91364 -0.559091 4.47273 -7.26818 -17.3318 -15.6546 43.6091 11.1818 -2.79546 1.67727 -24.6 25.7182 -0.559091 12.8591 -10.6227 5.59091 34.1046 -26.2773 -3.35455 -8.38637 -2.23636 2.79546 -43.05 -10.6227 13.4182 -6.15 -8.38637 -21.2455 2.79546 -9.50455 36.3409 -17.3318 7.26818 -25.7182 11.1818 1.11818 21.8046 -9.50455 44.7273 -9.50455 -12.3 -0 -12.8591 -2.79546 -4.47273 -3.91364 56.4682 1.67727 -8.94546 -0.559091 0 -3.91364 18.45 21.2455 -5.03182 15.0955 8.94546 -6.15 -8.38637 -3.35455 -12.8591 0.559091 3.35455 -3.35455 -3.91364 -14.5364 -6.15 7.82728 7.82728 -12.3 8.94546 -12.8591 
13.4182 10.6227 -2.23636 28.5136 9.50455 -10.6227 -10.6227 -13.4182 8.94546 1.11818 28.5136 4.47273 -6.15 7.82728 32.4273 22.9227 5.03182 -4.47273 -7.82728 15.6546 10.0636 -9.50455
    -1.67486 -3.90801 1.67486 -9.49089 -0 -4.4663 4.4663 6.14116 28.4727 -16.7486 -20.0984 1.67486 20.0984 -1.67486 5.58288 3.34973 -5.58288 2.23315 10.0492 -3.90801 -6.14116 -8.37431 -0 -2.23315 -24.0064 7.81603 8.37431 10.6075 15.0738 34.0555 6.14116 2.23315 -1.67486 12.2823 25.6812 18.4235 -11.724 -12.8406 0 -15.0738 -5.02459 15.6321 -2.79144 -36.2887 0 12.8406 3.90801 -7.81603 31.8224 -5.58288 3.34973 2.23315 21.2149 -1.67486 -24.5647 -3.34973 -12.2823 -7.25774 -16.7486 -2.79144 -18.9818 2.23315 -12.2823 -16.1903 0.558288 25.1229 -0.558288 -19.5401 -6.14116 2.79144 -12.8406 -13.9572 -5.02459 -13.9572 2.79144 -15.6321 5.02459 21.2149 -5.02459 2.23315 -5.02459 -6.69945 1.67486 -17.3069 30.1475 -2.23315 -3.90801 23.4481 24.0064 -22.3315 3.34973 -17.3069 -55.8288 -1.11658 8.9326 -13.9572 2.79144 17.3069 -4.4663 36.2887 -9.49089 2.79144 10.6075 -0 -0.558288 -8.37431 0.558288 1.67486 5.02459 26.7978 3.34973 -9.49089 -6.14116 -12.8406 -1.67486 -20.0984 -6.69945 20.0984 11.724 9.49089 -3.34973 22.8898 -1.67486 8.37431 -1.67486 -3.34973 -1.67486 -13.3989 -1.11658 -2.23315 2.23315 39.0801 -11.724 11.724 8.37431 2.23315 -17.3069 -3.90801 12.2823 0.558288 16.7486 3.34973 9.49089 2.79144 8.9326 7.25774 1.67486 -0 15.6321 -1.11658 30.1475 10.6075 6.69945 9.49089 -17.3069 19.5401 -6.14116 -8.37431 4.4663 15.0738 -13.3989 -3.90801 -2.79144 0.558288 10.0492 -5.58288 -13.3989 5.02459 15.6321 -2.79144 28.4727 -14.5155 13.3989 -1.67486 -10.6075 25.1229 -12.2823 29.031 -6.69945 -5.58288 -12.8406 55.2705 21.2149 13.3989 7.25774 -2.79144 -3.90801 -25.6812 11.1658 -13.9572 -10.0492 -9.49089 -3.34973 25.1229 49.6876 -22.3315 -6.69945 6.14116 -17.3069 26.2395 -1.67486 2.79144 -3.34973 -6.69945 6.14116 9.49089 2.79144 -15.6321 -6.69945 -27.9144 10.0492 -2.79144 2.79144 -3.90801 1.11658 -1.11658 -37.9636 -5.02459 10.0492 6.69945 3.34973 11.724 14.5155 1.11658 38.5218 -12.2823 15.0738 34.6138 6.69945 -18.4235 -6.14116 11.724 0 13.9572 1.67486 -2.23315 8.9326 1.67486 -1.11658 12.2823 
-0.558288 -0.558288 -6.14116 1.67486 -8.9326 -4.4663 6.14116 12.8406 -1.11658 -2.23315 16.1903 -3.90801 -10.0492 -3.34973 -11.1658 -1.11658 -12.2823 -5.58288 -6.14116 4.4663 8.9326 -14.5155 -8.9326 2.23315 7.25774 2.79144 -6.69945 -0.558288 -26.2395 15.6321 -5.58288 11.1658 -12.8406 14.5155 0.558288 -21.7732 6.14116 -8.9326 3.90801 1.11658 7.25774 5.02459 -11.724 13.3989 -1.67486 -0 -4.4663 10.0492 -6.69945 -17.3069 1.11658 1.11658 -18.4235 11.724 12.2823 5.02459 -3.34973 -2.23315 7.25774 3.34973 6.69945 25.1229 32.3807 1.11658 10.0492 -0 -3.34973 -2.79144 -11.724 34.0555 -0.558288 24.0064 15.0738 -5.02459 3.34973 -18.9818 -6.69945 -12.2823 -1.67486 0.558288 34.0555 -1.67486 3.34973 -2.23315 6.69945 -8.37431 15.6321 21.7732 8.9326 28.4727 -5.02459 19.5401 -17.3069 3.90801 1.11658 10.0492 -1.11658 -0.558288 -20.0984 3.90801 -0.558288 -1.67486 -3.34973 -17.8652 -2.79144 -1.67486 -1.67486 3.90801 0.558288 16.7486 17.3069 -1.11658 -9.49089 6.69945 -5.02459 15.6321 -14.5155 27.9144 8.37431 2.23315 11.724 2.23315 -6.14116 25.6812 0.558288 8.37431 -1.67486 3.90801 7.81603 -40.1967 -1.11658 -17.8652 7.25774 29.5892 1.11658 -0.558288 21.2149 14.5155 5.58288 -11.1658 -24.5647 -1.67486 4.4663 -26.7978 11.724 4.4663 0.558288 -11.724 17.3069 -10.6075 -35.7304 2.23315 11.1658 5.58288 15.0738 13.9572 -5.58288 2.23315 11.724 12.2823 2.23315 -7.25774 15.6321 -2.79144 -21.2149 -1.11658 4.4663 3.34973 6.69945 5.58288 6.69945 -0.558288 15.0738 -17.3069 -16.7486 29.031 17.3069 6.69945 44.663 -11.1658 13.9572 -8.9326 -20.6566 -32.3807 8.9326 -20.6566 10.0492 1.67486 -7.25774 -0.558288 -0.558288 15.6321 -2.23315 -2.79144 -13.3989 35.7304 -4.4663 27.9144 -10.6075 -25.6812 28.4727 -2.79144 7.25774 -7.81603 -12.8406 -6.14116 18.9818 20.6566 -17.8652 -9.49089 -18.4235 -8.9326 9.49089 16.7486 8.37431 -1.67486 32.3807 -12.8406 6.69945 0 8.37431 -0.558288 11.1658 2.79144 27.9144 16.7486 11.1658 -27.9144 21.2149 5.02459 -32.939 -16.1903 -20.0984 -0.558288 -2.79144 26.7978 5.58288 24.0064 15.0738 
-10.6075 5.02459 8.37431 0 11.724 7.81603 -7.25774 7.81603 6.14116 2.79144 1.11658 -13.3989 -7.25774 1.11658 -4.4663 7.25774 30.1475 15.0738 2.23315 9.49089 5.02459 -19.5401 -7.25774 4.4663 9.49089 -12.2823 2.79144 -0 45.7796 26.7978 7.25774 12.2823 -6.14116
    -2.2328 20.0952 -3.9074 17.8624 -3.3492 -11.7222 -19.537 -3.3492 -2.791 15.0714 3.3492 4.4656 21.2116 -1.1164 6.6984 4.4656 -7.2566 37.3994 32.9338 -5.582 4.4656 -11.7222 3.3492 -37.3994 -6.6984 -3.3492 -29.5846 6.6984 -6.6984 -6.1402 -8.9312 -1.1164 9.4894 32.3756 26.7936 22.328 7.2566 -6.6984 7.2566 5.582 6.6984 -3.3492 22.8862 -17.3042 -16.746 1.1164 -1.1164 1.1164 -38.5158 1.1164 6.6984 2.791 5.0238 -13.3968 2.2328 -10.0476 10.0476 3.3492 8.9312 10.0476 17.3042 6.1402 7.8148 16.746 5.582 -4.4656 2.2328 12.2804 12.8386 5.0238 -15.0714 6.6984 -27.3518 -1.6746 -8.9312 1.1164 10.6058 1.6746 0.5582 -3.9074 -6.1402 17.3042 15.6296 -10.0476 7.8148 -13.955 1.6746 -17.3042 -8.373 -45.2142 5.582 -44.0978 19.537 -7.2566 -12.2804 6.1402 -1.6746 -28.4682 -27.91 2.2328 6.1402 7.8148 9.4894 4.4656 -10.0476 5.582 3.9074 -9.4894 0.5582 -15.6296 24.5608 -8.9312 -19.537 -34.0502 -8.9312 25.6772 -0.5582 3.9074 10.0476 -13.3968 -8.373 21.2116 -4.4656 5.582 -2.791 -21.7698 20.0952 -10.0476 -2.2328 -1.1164 2.791 2.2328 -2.791 -12.8386 -26.7936 9.4894 -8.373 -9.4894 46.8888 29.5846 6.6984 -3.9074 21.7698 7.8148 -10.0476 -4.4656 22.328 -40.1904 -6.1402 -5.582 -2.2328 -5.582 -8.373 -13.3968 -14.5132 2.791 5.582 21.2116 3.3492 27.3518 5.582 -0.5582 6.1402 1.1164 8.9312 2.2328 -8.373 -2.791 11.7222 -19.537 2.791 -1.1164 10.6058 -3.3492 -1.1164 -13.3968 5.0238 7.8148 -1.1164 -0.5582 -25.119 -7.8148 -15.0714 10.0476 31.8174 -10.0476 -17.3042 -13.955 8.373 -11.164 21.2116 0.5582 6.6984 53.5872 0 -0.5582 -17.3042 2.791 13.955 -21.2116 15.0714 -13.3968 17.8624 11.7222 0.5582 -5.0238 4.4656 -51.3544 -1.1164 -30.1428 0.5582 -5.0238 1.1164 -20.0952 0.5582 -13.955 -9.4894 17.8624 -1.1164 1.1164 -9.4894 3.9074 2.2328 3.9074 11.164 4.4656 13.955 -29.5846 7.2566 14.5132 1.1164 -22.328 2.2328 11.7222 6.6984 -1.1164 26.2354 -6.1402 -16.1878 10.6058 13.955 1.6746 3.9074 1.6746 -7.2566 -13.955 15.0714 15.6296 -2.2328 16.1878 -5.582 5.582 -7.2566 -3.3492 -34.0502 -0.5582 32.9338 -6.1402 10.6058 
13.3968 -27.91 8.373 -11.164 -3.9074 7.8148 15.0714 14.5132 -2.2328 -13.955 27.3518 54.7036 23.4444 8.373 -42.4232 -22.8862 2.2328 39.074 -18.4206 2.791 -19.537 10.6058 19.537 -16.1878 6.1402 2.2328 15.6296 8.373 20.0952 17.8624 -6.6984 1.1164 3.3492 -1.6746 17.8624 -2.791 -4.4656 -9.4894 3.3492 13.3968 29.5846 30.1428 8.9312 -2.2328 3.9074 -0.5582 2.2328 17.3042 -8.9312 34.0502 29.0264 1.1164 4.4656 2.791 12.2804 5.582 11.7222 6.6984 -14.5132 6.6984 -2.791 -9.4894 7.8148 0.5582 18.9788 5.0238 -17.8624 27.3518 -2.2328 -2.2328 35.1666 32.3756 -16.746 32.9338 -10.6058 17.8624 40.1904 1.6746 -4.4656 0.5582 5.582 -1.1164 12.2804 -0 10.0476 12.2804 20.6534 -5.582 1.1164 -21.7698 18.9788 25.6772 -5.582 -45.2142 24.5608 6.1402 -5.582 10.6058 -5.0238 11.7222 3.9074 3.3492 -46.8888 -7.2566 7.8148 6.6984 2.2328 -14.5132 -4.4656 -1.6746 20.0952 30.701 -1.1164 -1.6746 21.2116 42.9814 -17.8624 8.373 -3.3492 1.1164 37.3994 -9.4894 13.3968 18.9788 19.537 3.3492 6.6984 -27.3518 28.4682 6.6984 36.283 7.8148 -11.7222 -13.955 -21.2116 5.0238 7.2566 20.6534 -20.0952 19.537 -5.582 -6.6984 8.373 3.3492 18.9788 -19.537 -2.791 16.1878 0.5582 -16.1878 8.9312 -11.164 18.4206 1.1164 1.6746 -36.8412 5.0238 27.3518 -1.1164 -11.7222 -5.582 -16.746 -10.0476 -18.4206 10.6058 24.0026 1.1164 -7.2566 -1.1164 -27.3518 21.7698 3.9074 28.4682 -22.8862 -16.1878 30.701 -31.2592 8.373 -17.8624 -5.0238 -13.955 12.2804 -3.9074 -3.9074 10.6058 -2.2328 -39.6322 -21.2116 4.4656 29.0264 5.0238 3.3492 3.3492 -10.6058 -0 2.791 -11.164 16.746 21.2116 3.3492 0.5582 -12.2804 4.4656 5.0238 -0.5582 17.8624 15.6296 -32.3756 -11.164 -8.373 -11.7222 37.9576 16.1878 -12.8386 -28.4682 11.7222 -15.6296 2.791 7.2566 -0 -5.582 -1.1164 3.3492 -0.5582 -11.7222 -36.8412 3.9074 5.582 -16.1878 11.164 1.6746 -30.1428 9.4894 -3.3492 6.1402 -17.8624 -35.1666 26.2354 -15.0714 6.1402 1.1164 24.5608 16.746 -3.9074 30.1428 6.1402 11.164 25.119 -5.0238 2.2328 -3.9074 5.582 -20.0952
    -2.23069 -3.90371 1.67302 -9.48043 -0 -4.46138 4.46138 6.13439 27.8836 -16.7302 -19.5185 1.67302 20.0762 -1.67302 5.57672 3.34603 -5.57672 3.34603 10.0381 -3.34603 -6.13439 -8.36508 -0.557672 -1.67302 -23.9799 7.80741 8.36508 10.0381 16.1725 34.5757 5.57672 2.78836 -0.557672 11.7111 26.2106 18.4032 -12.2688 -12.8265 0.557672 -15.6148 -4.46138 17.2878 -2.78836 -37.364 0.557672 12.8265 3.90371 -7.80741 30.672 -6.13439 3.34603 2.23069 21.1915 -2.23069 -25.0953 -3.90371 -11.7111 -7.80741 -16.7302 -2.78836 -20.6339 2.23069 -12.2688 -16.1725 0.557672 23.9799 -1.11534 -19.5185 -6.13439 2.23069 -13.9418 -14.4995 -4.46138 -14.4995 2.78836 -15.6148 4.46138 20.6339 -5.01905 2.23069 -5.01905 -6.13439 1.67302 -17.8455 30.1143 -1.67302 -3.90371 23.4222 23.4222 -21.7492 3.34603 -17.8455 -56.8826 -1.11534 8.92276 -13.3841 3.34603 17.2878 -4.46138 36.8064 -9.48043 3.34603 11.1534 -0 -0.557672 -8.36508 0.557672 3.34603 5.01905 27.3259 3.90371 -9.48043 -6.69207 -13.3841 -1.67302 -19.5185 -6.69207 20.0762 11.7111 10.0381 -3.34603 22.8646 -1.67302 8.36508 -2.23069 -3.34603 -1.67302 -13.3841 -1.11534 -2.23069 2.23069 38.4794 -11.7111 11.1534 8.92276 2.23069 -17.2878 -3.34603 11.1534 0 16.7302 3.34603 10.0381 2.78836 8.92276 7.80741 1.67302 -0 15.6148 -1.11534 28.999 10.5958 6.69207 9.48043 -17.2878 20.0762 -6.13439 -8.36508 5.01905 15.0572 -13.3841 -3.34603 -2.78836 0.557672 10.5958 -5.57672 -12.8265 5.01905 16.7302 -3.34603 27.8836 -13.9418 13.3841 -1.67302 -11.1534 25.6529 -12.8265 28.4413 -6.13439 -5.57672 -13.3841 55.2096 20.6339 13.3841 7.24974 -2.78836 -3.34603 -25.6529 11.1534 -13.9418 -8.36508 -10.0381 -3.34603 24.5376 49.6328 -21.7492 -6.13439 6.69207 -16.7302 26.2106 -1.11534 2.78836 -3.34603 -6.13439 5.57672 9.48043 2.78836 -16.1725 -6.13439 -27.8836 9.48043 -1.67302 2.78836 -3.34603 1.11534 -1.11534 -37.9217 -5.01905 10.5958 6.69207 3.34603 11.7111 14.4995 1.67302 40.1524 -12.8265 14.4995 33.4603 6.69207 -18.4032 -6.13439 11.7111 -0.557672 14.4995 1.67302 -2.23069 
8.92276 2.23069 -0.557672 12.2688 0 -1.11534 -5.57672 1.67302 -9.48043 -5.01905 6.13439 13.9418 -1.11534 -2.23069 15.6148 -3.90371 -9.48043 -3.34603 -11.1534 -1.67302 -12.2688 -6.13439 -6.13439 4.46138 8.92276 -14.4995 -9.48043 2.23069 6.69207 2.78836 -6.13439 -0 -26.7683 15.6148 -5.01905 11.7111 -12.8265 14.4995 0.557672 -21.7492 6.13439 -8.92276 4.46138 0.557672 7.24974 6.69207 -11.7111 14.4995 -1.67302 -0.557672 -3.90371 10.0381 -6.69207 -17.8455 1.11534 1.11534 -18.4032 11.1534 12.8265 5.01905 -3.34603 -2.23069 7.24974 3.34603 6.69207 25.0953 32.345 1.67302 9.48043 -0 -3.34603 -2.78836 -11.1534 34.018 -0.557672 23.9799 15.6148 -5.01905 3.34603 -18.9609 -6.13439 -12.2688 -2.23069 0.557672 34.018 -2.23069 3.34603 -2.23069 6.69207 -8.92276 16.1725 22.3069 8.92276 29.5566 -5.57672 19.5185 -18.4032 3.90371 1.11534 10.5958 -1.11534 -0.557672 -20.6339 3.90371 -0.557672 -0.557672 -3.34603 -17.2878 -2.78836 -1.67302 -1.67302 4.46138 0.557672 16.7302 17.2878 -1.67302 -10.0381 6.69207 -4.46138 16.1725 -15.0572 27.8836 7.24974 2.23069 11.7111 2.23069 -6.13439 26.2106 1.11534 8.92276 -1.11534 3.90371 8.36508 -40.1524 -1.11534 -17.8455 7.24974 30.1143 0.557672 -1.11534 21.1915 15.0572 5.57672 -10.5958 -24.5376 -1.67302 4.46138 -26.7683 11.7111 5.01905 0 -11.7111 17.8455 -11.1534 -36.2487 2.23069 11.7111 5.01905 15.6148 13.9418 -5.01905 2.23069 11.1534 12.2688 2.23069 -7.24974 15.6148 -2.23069 -20.6339 -1.11534 4.46138 3.34603 6.13439 5.57672 7.24974 -1.11534 15.6148 -16.7302 -16.7302 28.999 16.1725 7.24974 45.1715 -11.7111 13.9418 -8.92276 -20.0762 -32.9027 9.48043 -20.6339 10.0381 1.67302 -7.80741 -0 -0.557672 16.7302 -2.23069 -2.78836 -13.9418 35.691 -4.46138 27.8836 -10.5958 -25.6529 28.4413 -2.78836 6.69207 -8.36508 -12.8265 -6.13439 19.5185 21.7492 -17.2878 -9.48043 -18.4032 -8.92276 9.48043 16.1725 9.48043 -1.67302 31.7873 -12.8265 6.69207 0 8.92276 -0.557672 11.1534 2.78836 27.8836 16.7302 11.1534 -27.3259 20.6339 5.01905 -32.345 -16.7302 -18.9609 0 -2.78836 26.7683 
6.13439 23.4222 14.4995 -10.5958 5.01905 8.36508 0 11.7111 7.24974 -7.24974 8.36508 6.69207 2.23069 0.557672 -13.9418 -7.24974 1.11534 -5.01905 6.13439 30.1143 14.4995 2.78836 9.48043 5.01905 -18.9609 -7.80741 4.46138 10.5958 -12.2688 2.23069 -0 45.7291 26.7683 7.80741 11.7111 -6.13439

    This is the signed char feature map (the same five pictures), obtained on the AM5728.

    feature_test_board_char.txt
    0 226 1 237 0 252 5 3 24 252 225 0 4 253 4 19 1 10 253 227 243 246 254 231 3 0 5 19 9 33 5 254 6 252 18 17 239 253 245 240 248 5 1 241 18 11 2 254 19 5 19 6 19 251 246 255 8 245 219 245 252 7 6 244 243 250 1 11 251 235 3 3 245 221 6 245 4 41 252 247 253 249 6 228 23 232 0 251 27 224 254 229 209 237 10 234 252 26 24 36 235 16 28 19 244 5 255 9 2 55 12 254 255 3 248 229 248 10 4 13 252 18 2 2 248 1 249 242 245 254 6 58 36 13 11 8 225 2 6 0 16 1 25 4 13 254 10 4 41 0 41 240 251 0 241 246 250 237 253 13 9 252 251 1 7 241 3 7 4 246 8 250 246 255 4 28 241 17 244 246 217 34 32 8 4 6 254 246 4 13 251 245 255 18 33 233 0 0 30 37 255 8 250 3 250 254 255 246 248 220 8 250 255 250 0 2 225 1 3 5 254 8 20 245 32 246 29 19 8 13 239 7 255 4 249 254 9 254 8 6 0 3 251 0 246 248 7 5 7 253 12 9 238 253 235 3 247 239 252 9 7 238 11 245 247 0 252 2 238 10 3 19 5 33 6 248 252 242 2 248 1 1 253 9 253 253 253 246 4 249 255 254 241 4 7 14 6 3 248 5 10 35 28 255 17 2 6 242 234 11 0 30 24 254 2 240 253 246 253 1 23 0 0 13 10 247 254 13 23 15 238 33 227 5 6 3 2 7 238 11 252 251 250 249 247 12 255 252 230 2 12 17 243 255 254 17 239 9 3 3 253 7 249 19 253 2 253 11 12 235 0 240 255 18 2 5 7 22 8 233 3 255 255 224 21 250 253 234 6 7 245 250 9 9 4 254 251 22 231 25 251 243 13 232 230 250 0 2 251 8 237 5 9 243 247 22 36 254 25 1 252 246 239 226 6 248 11 254 253 14 6 255 19 5 2 15 254 19 251 233 14 0 6 247 243 0 25 39 246 252 231 231 3 12 11 254 26 247 0 1 253 6 2 251 11 4 253 215 251 11 220 242 254 253 254 43 0 253 5 222 7 6 0 3 250 255 243 245 6 241 0 245 8 253 18 32 16 247 3 6 209 243 255 242 3 3 253 55 19 5 13 252
    247 238 243 12 241 251 27 254 8 10 7 2 1 246 0 246 2 4 11 240 248 251 0 223 14 255 12 5 18 251 10 255 24 218 254 11 250 2 236 243 236 26 254 3 247 1 3 3 220 1 9 253 3 248 22 244 23 39 247 5 247 27 249 6 247 233 8 251 248 245 4 7 253 240 0 254 0 14 253 252 2 225 17 242 16 248 2 219 249 224 253 231 254 8 2 241 244 2 1 237 251 250 2 12 242 12 250 7 11 36 16 251 0 8 254 2 1 250 1 0 251 19 2 4 251 244 253 25 248 3 252 36 227 8 241 4 227 254 238 15 242 240 9 250 246 252 239 3 4 5 248 253 253 232 232 211 248 3 3 25 253 26 248 254 11 253 9 1 253 0 228 25 237 255 18 23 246 5 233 2 196 5 221 11 241 248 11 7 8 249 38 5 7 0 41 253 21 12 51 55 1 244 245 5 5 246 255 8 252 247 14 240 3 4 0 242 250 15 250 4 253 250 251 249 255 212 254 0 254 247 254 1 4 8 243 4 23 0 232 249 8 243 7 0 249 8 19 11 255 8 3 2 17 3 248 250 255 4 234 248 1 236 229 16 252 10 236 2 227 248 15 20 6 244 5 9 29 13 248 254 2 45 241 28 253 10 255 244 235 250 5 22 31 9 3 231 245 0 3 1 14 16 30 247 248 2 19 247 6 5 0 10 23 250 3 222 1 0 251 2 237 8 3 0 2 10 11 18 11 26 0 248 3 239 249 219 255 7 4 7 18 245 0 8 244 5 0 245 246 5 11 2 230 0 254 23 248 225 247 0 11 255 1 254 246 253 251 11 254 10 40 245 244 11 14 254 235 32 7 216 4 5 251 251 1 244 245 8 248 39 225 251 252 14 10 237 1 21 220 7 241 6 10 240 252 249 252 254 247 15 7 253 15 10 247 253 9 2 12 3 6 251 240 252 28 8 250 2 234 12 5 11 248 9 29 241 253 237 243 254 201 249 7 239 245 1 5 10 16 245 8 246 8 248 31 251 31 242 243 0 235 247 237 250 35 0 237 248 241 254 14 30 1 20 7 254 230 228 233 229 4 2 253 231 250 9 3 248 5 239 22 10 2 20 27 8 249 240 1 251 29 10 249 251 31 24 249 243 238 7 255 252
    1 234 0 241 0 254 4 3 15 254 233 0 4 253 3 13 0 4 254 236 245 247 255 241 2 0 5 13 7 25 4 255 3 253 12 11 244 253 248 246 250 4 1 246 12 6 1 254 13 4 13 4 13 253 250 254 4 251 230 248 252 4 2 248 248 252 1 10 253 241 2 2 249 232 4 249 3 29 253 250 254 251 4 237 17 240 0 252 20 233 254 238 222 243 8 241 254 19 18 24 242 8 20 12 248 4 255 7 3 40 8 255 255 3 249 237 250 6 3 8 253 14 1 2 250 1 251 246 249 255 4 42 25 8 8 6 234 2 5 253 12 0 18 3 10 255 8 3 27 1 29 244 252 0 244 249 249 243 254 9 5 254 253 1 3 246 2 5 3 249 6 251 250 255 3 20 246 12 248 249 230 25 25 4 1 5 254 250 3 8 253 249 254 12 23 240 1 0 18 25 255 6 251 2 251 254 0 250 249 231 6 252 0 252 0 1 234 1 2 4 254 5 14 249 22 248 21 15 6 7 246 6 255 3 251 254 8 254 5 3 255 2 252 0 249 250 3 5 5 253 7 5 243 254 242 2 249 243 254 7 5 243 8 249 249 0 252 2 243 7 0 15 3 25 5 250 252 245 1 250 0 1 255 5 254 254 253 249 3 253 255 0 245 3 5 12 5 1 251 3 7 25 20 255 12 1 5 246 242 7 255 21 17 255 3 245 253 250 253 1 14 0 1 10 7 251 255 10 16 10 243 23 237 4 3 1 1 5 241 9 253 254 252 251 250 9 255 254 237 2 9 11 249 255 254 11 243 8 3 2 255 6 251 14 255 1 253 7 5 241 254 244 255 11 2 4 5 13 6 241 2 255 0 232 15 252 0 241 5 4 248 253 7 7 4 254 251 16 236 17 253 247 10 238 237 252 0 2 252 5 244 4 6 248 250 17 24 255 18 1 253 250 241 234 5 249 10 255 254 8 4 1 13 2 0 10 254 13 251 241 12 0 4 249 247 0 18 26 249 254 238 237 3 10 7 255 17 249 0 1 254 3 1 253 7 3 254 229 254 8 231 246 255 255 254 30 1 253 4 233 6 4 0 2 252 255 246 248 4 246 1 249 6 254 13 23 13 250 3 4 223 246 0 247 1 2 255 39 12 4 9 253
    0 254 253 21 251 239 247 252 2 13 248 3 4 248 252 1 250 7 41 246 0 244 253 221 7 254 239 17 251 243 254 255 20 3 10 5 242 5 251 248 0 248 22 241 14 1 254 255 229 253 7 2 15 242 252 241 21 255 24 253 2 16 254 11 251 244 4 21 7 254 8 10 241 250 2 252 9 8 3 1 255 245 12 246 11 239 7 232 250 215 255 228 15 244 10 2 251 247 250 253 247 2 19 12 240 11 255 2 254 2 9 1 250 252 251 250 254 4 4 1 249 0 2 3 237 235 0 240 248 0 3 8 8 5 245 5 242 255 34 28 247 254 4 2 243 5 8 252 14 7 8 238 253 248 241 243 6 2 252 34 9 11 246 3 9 0 5 5 14 232 244 5 255 0 249 242 1 2 252 254 226 255 1 237 26 248 252 255 3 7 25 252 6 30 1 252 254 5 37 245 11 0 5 8 9 250 252 225 254 231 0 249 251 231 255 1 247 16 1 1 236 254 6 12 17 11 22 249 5 24 0 247 7 5 254 1 11 250 255 254 251 2 4 0 246 253 8 27 252 17 4 8 247 4 221 253 11 240 255 255 235 2 254 7 253 9 18 254 236 24 45 20 254 245 247 15 25 234 0 246 9 29 237 11 0 14 10 9 13 247 253 254 254 15 6 244 253 4 252 13 17 0 21 255 253 4 14 239 16 21 0 19 13 4 2 5 1 238 4 2 10 5 255 14 1 241 8 1 13 20 13 234 7 253 13 1 1 4 17 5 253 254 2 3 8 27 4 3 248 6 8 8 225 13 8 234 3 244 0 0 237 220 254 11 0 252 241 2 0 32 25 248 5 8 26 238 5 3 5 0 4 13 254 8 11 248 239 0 252 30 10 250 245 235 255 243 14 17 250 2 251 253 5 255 239 249 3 254 242 6 2 8 6 7 242 0 27 250 5 254 240 239 233 4 5 245 254 1 250 7 3 249 250 253 24 235 6 237 243 247 0 1 10 6 250 239 246 14 4 5 254 2 254 255 252 250 255 253 2 0 239 4 254 0 7 6 226 237 247 2 29 10 13 249 10 0 253 245 246 244 255 246 0 240 236 6 11 252 9 244 14 9 250 3 25 250 8 240 247 1 0 13 249 8 248 15 254 250 252 247 251 249
    1 215 1 226 255 250 9 6 30 253 211 0 7 250 4 24 0 7 252 217 235 240 253 224 3 255 10 26 14 48 9 254 8 250 25 20 232 251 239 235 246 8 1 235 24 13 3 252 25 8 25 9 25 250 243 253 10 246 206 240 248 8 4 240 241 248 2 19 249 227 2 3 240 208 9 241 5 57 251 244 251 249 8 219 34 226 0 250 38 212 253 222 187 231 14 224 252 36 35 46 228 16 40 25 242 7 255 13 5 82 15 252 0 5 243 215 245 14 6 17 250 27 2 4 243 2 246 237 245 253 8 81 48 16 17 11 213 4 10 250 26 1 36 5 20 253 15 5 53 2 60 234 248 0 232 243 242 233 251 18 10 251 250 2 7 235 5 9 7 244 11 247 242 255 6 38 236 25 241 242 204 50 49 8 3 9 254 245 6 17 248 242 254 22 48 226 3 1 36 51 254 12 247 3 246 255 1 243 243 207 11 248 255 248 1 3 213 2 4 9 253 9 27 241 43 241 40 30 12 14 236 11 255 6 247 253 14 253 13 5 255 3 248 0 243 243 8 10 9 250 15 11 231 252 230 2 242 231 252 17 11 229 14 243 241 255 248 3 230 15 0 30 6 52 9 245 249 235 2 242 0 3 0 12 252 251 251 242 5 249 254 255 234 6 9 22 9 3 248 5 13 47 40 255 24 3 7 236 229 15 255 40 33 253 4 234 249 246 251 2 31 255 255 18 14 245 255 21 32 21 231 46 217 7 7 1 3 8 227 16 250 252 248 247 244 17 255 253 220 4 18 22 241 253 252 22 232 15 5 5 252 9 247 27 253 3 250 13 10 227 254 234 255 22 3 9 9 24 12 225 4 254 1 207 29 249 254 226 11 7 239 251 12 12 7 254 248 30 218 35 252 238 19 222 218 248 0 4 248 9 233 8 11 242 244 34 47 254 35 3 250 245 227 211 11 244 19 254 251 15 9 3 27 5 255 20 253 26 245 226 24 0 7 239 239 0 35 49 242 253 221 223 5 20 14 254 34 242 1 1 254 7 2 250 15 7 253 203 255 16 207 237 0 254 250 60 3 250 8 212 12 9 255 3 248 255 235 240 8 236 3 240 11 255 26 47 24 245 5 9 191 236 0 237 1 4 253 77 25 7 18 249

    Even when I divide by the scale factor 2.3593, the result does not correspond, so please illustrate how to convert the feature_map with an example.
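
    To make the conversion concrete, here is a minimal sketch of the dequantization being asked about, assuming the outQ of 604 from the import log above (scale factor outQ/256 = 2.3593) and assuming the buffer bytes are actually signed 8-bit values even though the API exposes them as unsigned char. The function name and the sample values are illustrative, not from TIDL itself.

    ```c
    #include <stdio.h>

    /* Hypothetical sketch: convert one 8-bit quantized TIDL output value
     * to float. Assumes the layer's outQ is 604 (from the import log),
     * so floatValue = signedValue / (outQ / 256). */
    float tidl_dequantize(unsigned char raw, int outQ)
    {
        signed char s = (signed char)raw;       /* reinterpret byte as signed */
        return (float)s * 256.0f / (float)outQ; /* divide by outQ/256 */
    }

    int main(void)
    {
        /* 251 as unsigned char is -5 as signed char; with outQ = 604
         * it maps to roughly -5 / 2.3593 in float. */
        printf("%f\n", tidl_dequantize(251, 604));
        printf("%f\n", tidl_dequantize(12, 604));
        return 0;
    }
    ```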

    thanks

    best regards

  • Can you check whether the input and the first layer match (trace 0 and trace 1), to make sure your input and the first convolution are matching?
  • Thanks for your help, I have solved my problem.
    I used a signed char* to point to the output buffer, and the result is now reasonable.
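
    The fix described above can be sketched as follows: treat the TIDL output buffer as signed char before dequantizing, then compute the Euclidean distance between two feature vectors in float. The function name, outQ value (604), and buffer contents are illustrative assumptions, not output from a real run.

    ```c
    #include <stdio.h>
    #include <math.h>

    /* Hypothetical sketch: dequantize two TIDL feature buffers (treating
     * each byte as signed char) and compute their Euclidean distance. */
    float feature_distance(const unsigned char *a, const unsigned char *b,
                           int len, int outQ)
    {
        float scale = (float)outQ / 256.0f;  /* e.g. 604/256 = 2.3593 */
        float sum = 0.0f;
        for (int i = 0; i < len; i++) {
            /* reinterpret each byte as signed, then convert to float */
            float fa = (float)((signed char)a[i]) / scale;
            float fb = (float)((signed char)b[i]) / scale;
            float d = fa - fb;
            sum += d * d;
        }
        return sqrtf(sum);
    }

    int main(void)
    {
        unsigned char f1[4] = { 251, 12, 0, 240 };  /* -5, 12, 0, -16 signed */
        unsigned char f2[4] = { 251, 12, 0, 240 };
        printf("distance = %f\n", feature_distance(f1, f2, 4, 604));
        return 0;
    }
    ```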
  • Thanks for confirming. Closing the thread.