
AM5729: MobileNetV2 convert via tidl_model_import.out and inference on BBAI

Part Number: AM5729

Hello,
First, I exported the pretrained MobileNetV2 model to .tflite format (on my Windows PC) as follows:

import tensorflow as tf

# Pretrained Keras MobileNetV2 (ImageNet weights, 224x224 input)
mobilenet_model = tf.keras.applications.MobileNetV2()
converter = tf.lite.TFLiteConverter.from_keras_model(mobilenet_model)
converter.experimental_new_converter = True  # use the new (MLIR-based) converter
tflite_model = converter.convert()
with open('model_new.tflite', 'wb') as f:
    f.write(tflite_model)
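As a quick sanity check before handing the exported file to the importer (a sketch; the helper name `looks_like_tflite` is mine), the FlatBuffer file identifier can be verified: a TFLite model stores the 4-byte identifier `TFL3` at byte offset 4.

```python
def looks_like_tflite(data: bytes) -> bool:
    # A TFLite model is a FlatBuffer whose 4-byte file identifier,
    # stored at byte offset 4, is b"TFL3".
    return len(data) >= 8 and data[4:8] == b"TFL3"

# Usage against the exported model:
# with open('model_new.tflite', 'rb') as f:
#     assert looks_like_tflite(f.read())
```

This does not prove the model is importable, but it quickly rules out a truncated or mis-written file.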

I then tried to convert this model to TIDL format (I installed and ran the import tool from SDK ti-processor-sdk-linux-am57xx-evm-06.03.00.106-Linux-x86-Install.bin on an Ubuntu virtual machine) as follows:

sudo "/home/nmvr/ti-processor-sdk-linux-am57xx-evm-06.03.00.106/linux-devkit/sysroots/x86_64-arago-linux/usr/bin/tidl_model_import.out" ./test/testvecs/config/import/tflite/tidl_import_mobileNetv2.txt

=============================== TIDL import - parsing ===============================

TFLite Model (Flatbuf) File  : ./test/testvecs/config/tflite_models/model_new.tflite  
TIDL Network File      : ./test/testvecs/config/tidl_models/tflite/tidl_net_tflite_mobilenet_v2_1.0_224_new.bin  
TIDL IO Info File      : ./test/testvecs/config/tidl_models/tflite/tidl_param_tflite_mobilenet_v2_1.0_224_new.bin  
TFLite node size: 69
Segmentation fault

The conversion ended with a segmentation fault.
I ran the conversion with the original config file tidl_import_mobileNetv2.txt for MobileNetV2:

# Default - 0
randParams         = 0

# 0: Caffe, 1: TensorFlow, 2: ONNX, 3: TensorFlow Lite, Default - 0
modelType          = 3

# 0: Fixed quantization by training framework, 1: Dynamic quantization by TIDL, Default - 1
quantizationStyle  = 1

# quantRoundAdd/100 will be added while rounding to integer, Default - 50
quantRoundAdd      = 50

numParamBits       = 12

inputNetFile       = "./test/testvecs/config/tflite_models/model_new.tflite"
inputParamsFile    = "NA"
outputNetFile      = "./test/testvecs/config/tidl_models/tflite/tidl_net_tflite_mobilenet_v2_1.0_224_new.bin"
outputParamsFile   = "./test/testvecs/config/tidl_models/tflite/tidl_param_tflite_mobilenet_v2_1.0_224_new.bin"

inWidth  = 224
inHeight = 224
inNumChannels = 3
inElementType = 1
rawSampleInData = 1
sampleInData = "./test/testvecs/input/preproc_2_224x224.y"
tidlStatsTool = "/home/nmvr/ti-processor-sdk-linux-am57xx-evm-06.03.00.106/linux-devkit/sysroots/x86_64-arago-linux/usr/bin/eve_test_dl_algo_ref.out"

I also tried the Mobilenet_V2_1.0_224 tflite model downloaded from tensorflow.org and converted it to a TIDL model with the same procedure. The command ran fine, and the net and param .bin files were created for inference.

When I ran inference on a BBAI device using these two files, I found that the model gives bad labels (the top label should be class 1, goldfish, which is what I get when I run inference with tflite_runtime),

but the result was as follows:

sudo PYTHONPATH=/usr/share/ti/tidl/tidl_api python3 imagenet.py
Network needs 64.0 + 9.0 mb heap
Input: n01443537_11099_goldfish.jpg
Running network across 1 EVEs, 0 DSPs
TIDL API: performing one time initialization ...
TIDL API: processing 1 input frames ...
k_heap:  [(6, 116), (6, 563), (31, 612), (7, 795), (79, 724)]
k_sorted:  [(79, 724), (31, 612), (7, 795), (6, 563), (6, 116)]
output_array:  [0 0 2 ... 0 0 0]
1: pirate,   prob = 30.98%
2: jinrikisha,   prob = 12.16%
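For reference, the top-k step that imagenet.py performs on `output_array` can be sketched as follows (a minimal sketch using the scores visible in the log above; the helper name `top_k` is mine). Note that the printed probabilities are consistent with the 8-bit output scores divided by 255 (79/255 ≈ 30.98%, 31/255 ≈ 12.16%).

```python
import heapq

def top_k(scores, k=5):
    # Pair each score with its class index and keep the k largest,
    # mirroring the k_heap / k_sorted printout above.
    return heapq.nlargest(k, ((s, i) for i, s in enumerate(scores)))

# Reconstruct the three largest entries seen in the log:
scores = [0] * 1001
scores[724], scores[612], scores[795] = 79, 31, 7
print(top_k(scores, 3))  # → [(79, 724), (31, 612), (7, 795)]
```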

Inference was run with the Python script imagenet.py and the original config file tidl_config_mobileNet2.txt for MobileNetV2:

numFrames   = 1
preProcType = 2
inData   = "preproc_2_224x224.y"
outData   = "stats_tool_out.bin"
netBinFile      = "./tidl_models_convert/tidl_net_tflite_mobilenet_v2_1.0_224.bin"
paramsBinFile   = "./tidl_models_convert/tidl_param_tflite_mobilenet_v2_1.0_224.bin"
inWidth = 224
inHeight = 224
inNumChannels = 3
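On the bad-labels question, one thing I would like to rule out is a preprocessing mismatch: tf.keras.applications.MobileNetV2 expects inputs scaled to [-1, 1], while a raw .y file holds unsigned 8-bit pixels. A minimal sketch of the expected scaling (the function name is mine; whether preProcType = 2 applies an equivalent mean/scale is exactly what I am unsure about):

```python
def mobilenet_v2_preprocess(pixel):
    # Keras MobileNetV2 preprocessing maps uint8 [0, 255] to [-1.0, 1.0]:
    # x / 127.5 - 1.0
    return pixel / 127.5 - 1.0

print(mobilenet_v2_preprocess(0))    # -1.0
print(mobilenet_v2_preprocess(255))  # 1.0
```

If TIDL feeds the network differently scaled data than the model was trained (or quantized) with, wrong top labels like the ones above would be expected.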

Please help me with two questions:

  1. Why did my tflite model converted on Windows end up with a segmentation fault in the import tool?
  2. Why does the tflite model downloaded from the tensorflow site, which converted successfully to TIDL .bin files, give bad labels at inference?

Thanks