
Applying a regression model with the TIDL library

Hi,

I have some questions about applying the TIDL library.

I applied a regression model on TIDL.

The outputs are quantized by the TIDL library.

However, to make predictions I need to know the relationship between the TIDL results (quantized to 8 bits) and the caffe-jacinto results (floating-point operations).

Can you give me some advice on how to determine this relationship?

Ahan


  • Ahan,

    The output is an 8-bit fixed-point representation with a scaling factor. The 8-bit value can be divided by the scaling factor to get the floating-point value. The scaling factor is available in the dataQ field of sTIDL_DataParams_t.
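    As a minimal sketch of this conversion (the helper name is mine, not a TIDL API; the scale value is an assumed example, in practice it comes from the dataQ field of sTIDL_DataParams_t), recovering the floating-point value is just a division:

    ```c
    #include <stdio.h>

    /* Hypothetical helper: recover the floating-point value from TIDL's
     * 8-bit fixed-point output, given the scaling factor that TIDL
     * reports in dataQ of sTIDL_DataParams_t. */
    static float dequantize(signed char qVal, float scale)
    {
        return (float)qVal / scale;  /* 8-bit value divided by the scale */
    }

    int main(void)
    {
        /* e.g. an 8-bit output of 64 with an assumed scale of 128
         * maps back to 64 / 128 = 0.5 */
        printf("%f\n", dequantize(64, 128.0f));
        return 0;
    }
    ```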

    Thanks and Regards,

    Kumar.D

  • Hi, Kumar

    Now I am using the TIDL sample model (jsegnet) to validate the scaling factor.

     Layer    1 : Max PASS :    19211 :    15301 Out Q :      254 ,    43861, TIDL_BatchNormLayer, PASSED  #MMACs =     1.57,     0.00,     1.57, Sparsity :   0.00, 100.00
     Layer    2 : Max PASS :   106321 :   108432 Out Q :     7934 ,   108857, TIDL_ConvolutionLayer, PASSED  #MMACs =   314.57,   261.23,   278.92, Sparsity :  11.33,  16.96
     Layer    3 : Max PASS :    19248 :    22842 Out Q :     8500 ,    22932, TIDL_ConvolutionLayer, PASSED  #MMACs =   301.99,   257.43,   267.39, Sparsity :  11.46,  14.76

    1. Can I assume the "scaling factor" is the Out Q value?

    2. If yes: for jsegnet, the first layer is a bias layer. Its outputs are -128 to 127 (quantized to 8 bits), so dividing them by the scaling factor would give -0.5 to 0.5.

    However, the output from caffe-jacinto was -128 to 127. How can I explain this result?

    3. If not, can you tell me how to get the scaling factor?

    Ahan

  • Hi Ahan,

     The Out Q (scale factor) is in Q8 fixed-point format. It needs to be divided by 256 to get the scale factor in floating point.
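    A small sketch of the full conversion (the function name is mine, not a TIDL API): first turn the Q8 Out Q value into a floating-point scale, then divide the quantized output by it. With the layer-1 Out Q of 254 from the log above, the scale is 254/256 ≈ 0.992, so the 8-bit range -128..127 maps back to roughly -129..128, which is consistent with the -128..127 range seen from caffe-jacinto.

    ```c
    #include <stdio.h>

    /* Hypothetical helper: dequantize a TIDL 8-bit output when the
     * reported Out Q scale factor is itself in Q8 fixed-point format. */
    static float tidlDequantize(signed char qVal, int outQ)
    {
        float scale = (float)outQ / 256.0f;  /* Q8 -> floating-point scale */
        return (float)qVal / scale;
    }

    int main(void)
    {
        /* jsegnet layer 1 reports Out Q = 254, i.e. scale = 254/256,
         * so an 8-bit output of 127 maps back to 127 * 256 / 254 = 128.0 */
        printf("%f\n", tidlDequantize(127, 254));
        return 0;
    }
    ```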

    Thanks and Regards,

    Kumar.D