
CCS/PROCESSOR-SDK-DRA8X-TDA4X: tidlModelImport output is not the same

Part Number: PROCESSOR-SDK-DRA8X-TDA4X

Tool/software: Code Composer Studio

Hi,

I am using the psdk_rtos_auto_j7_06_02_00_21 SDK.

Following ti_dl/docs/user_guide_html/md_tidl_user_model_deployment.html ("Getting Started with TI Deep Learning (TIDL) ecosystem"), I converted the MobileNetV2 TensorFlow model and ran the test program on both the PC and the target. The output in ti_dl\test\testvecs\output is correct.

But my problem is this: I run

./out/tidl_model_import.out ${TIDL_INSTALL_PATH}/ti_dl/test/testvecs/config/import/public/tensorflow/tidl_import_mobileNetv2.txt --numParamBits 15

many times, and the output model files are not the same, even though the results from ./PC_dsp_test_dl_algo.out are all correct.

Thanks!

Shuai

  • Hi Shuai,

    Your issue description is not clear.

    Step 1: Import

      ./out/tidl_model_import.out ${TIDL_INSTALL_PATH}/ti_dl/test/testvecs/config/import/public/tensorflow/tidl_import_mobileNetv2.txt --numParamBits 15

    Step 2: Inference on PC -- is this step working?

    ./PC_dsp_test_dl_algo.out

    Step 3: Inference on EVM -- are you observing the issue in this step?

    ./TI_DEVICE_dsp_test_dl_algo.out

  • Hi Kumar,

    I run Step 1 (Import) and get output_model_1.

    I run Step 1 (Import) again and get output_model_2.

    output_model_1 and output_model_2 are different, but I think they should be the same.

    Then I run Step 2 (Inference on PC) with each model:

    Inference with output_model_1 gives result1.

    Inference with output_model_2 gives result2.

    result1 and result2 are the same.

    So my question is: why are output_model_1 and output_model_2 different?

    Thanks!

    Shuai
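    The comparison described above can be reproduced with a quick byte-level check. This is only a sketch: the file names are hypothetical stand-ins for the binaries produced by two runs of tidl_model_import.out, and stand-in files are created here so the snippet runs end to end.

    ```shell
    # Hypothetical names for the network binaries from two import runs.
    MODEL1=tidl_net_run1.bin
    MODEL2=tidl_net_run2.bin

    # Stand-in files for this sketch; in practice these come from
    # two invocations of tidl_model_import.out.
    printf 'shared-header-A' > "$MODEL1"
    printf 'shared-header-B' > "$MODEL2"

    if cmp -s "$MODEL1" "$MODEL2"; then
        echo "models are byte-identical"
    else
        echo "models differ"
        cmp -l "$MODEL1" "$MODEL2" | head   # offsets of the differing bytes
    fi
    ```

    `cmp -l` lists each differing byte offset, which helps show whether the differences are isolated (e.g. a few header or unused bytes) or spread across the whole file.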

  • Hi Shuai,

    The expectation is that the inference results from multiple runs match. From your observation, this expectation is met.

    The model files may differ byte-for-byte because of random initialization of unused parameters.
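    To illustrate the point, here is a minimal sketch of how two files can differ byte-for-byte while every byte that a consumer actually reads is identical. The file layout below is invented for illustration, not the actual TIDL model format: each file is a "used" region followed by an "unused" region filled with arbitrary bytes.

    ```shell
    # Two hypothetical model files: identical "used" parameters,
    # different garbage in an unused trailing region.
    printf 'params:7,4096;' >  model_run1.bin
    printf 'AAAA'           >> model_run1.bin   # unused bytes, run 1

    printf 'params:7,4096;' >  model_run2.bin
    printf 'BBBB'           >> model_run2.bin   # unused bytes, run 2

    # Byte-for-byte, the files differ.
    cmp -s model_run1.bin model_run2.bin || echo "files differ"

    # But the region a consumer reads (first 14 bytes here) matches,
    # so inference results from the two files would be the same.
    head -c 14 model_run1.bin > used1
    head -c 14 model_run2.bin > used2
    cmp -s used1 used2 && echo "used parameters identical"
    ```

    This is why a checksum over the whole file is not a reliable way to verify that two imported models are functionally equivalent; comparing inference outputs, as you did, is the meaningful check.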