Tool/software:
I am planning to develop for the AM62Ax, and I am currently running the Model Analyzer.
I tried to compile a custom model. The model I am using is "face_detection_front_128_integer_quant.tflite", which can be obtained from the following URL:
https://s3.ap-northeast-2.wasabisys.com/pinto-model-zoo/030_BlazeFace/resources.tar.gz
I have checked that this model works fine on the CPU. I have tried compiling it several times based on the example notebook, but the compilation failed with the following errors:
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[3], line 1
----> 1 interpreter = tflite.Interpreter(model_path=tflite_model_path, experimental_delegates=tidl_delegate)
      2 #interpreter = tflite.Interpreter(model_path=tflite_model_path)

File /usr/local/lib/python3.10/dist-packages/tflite_runtime/interpreter.py:489, in Interpreter.__init__(self, model_path, model_content, experimental_delegates, num_threads, experimental_op_resolver_type, experimental_preserve_all_tensors)
    487 self._delegates = experimental_delegates
    488 for delegate in self._delegates:
--> 489     self._interpreter.ModifyGraphWithDelegate(
    490         delegate._get_native_delegate_pointer())  # pylint: disable=protected-access
    491     #self._signature_defs = self.get_signature_list() #PC-- commented for now. Workaround. Needs to be added to interpreter_wrapper2
    493 self._metrics = metrics.TFLiteMetrics()

ValueError: basic_string::_M_create
or
---------------------------------------------------------------------------
MemoryError                               Traceback (most recent call last)
Cell In[6], line 1
----> 1 interpreter = tflite.Interpreter(model_path=tflite_model_path, experimental_delegates=tidl_delegate)
      2 #interpreter = tflite.Interpreter(model_path=tflite_model_path)

File /usr/local/lib/python3.10/dist-packages/tflite_runtime/interpreter.py:489, in Interpreter.__init__(self, model_path, model_content, experimental_delegates, num_threads, experimental_op_resolver_type, experimental_preserve_all_tensors)
    487 self._delegates = experimental_delegates
    488 for delegate in self._delegates:
--> 489     self._interpreter.ModifyGraphWithDelegate(
    490         delegate._get_native_delegate_pointer())  # pylint: disable=protected-access
    491     #self._signature_defs = self.get_signature_list() #PC-- commented for now. Workaround. Needs to be added to interpreter_wrapper2
    493 self._metrics = metrics.TFLiteMetrics()

MemoryError: std::bad_alloc
Log: 74862.custon-model-tfl_out.log
On the other hand, I was able to compile and run this model using EdgeAI-TIDL-tools 10_01 on Ubuntu 22.04.
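In case it is useful, the way I ran the model there after compilation was roughly as follows. This is only a simplified sketch: the inference delegate name (libtidl_tfl_delegate.so) and the 'artifacts_folder' option key are taken from the edgeai-tidl-tools osrt_python examples as I understand them, and the paths and the dummy input are from my own setup.

# Rough sketch of running the compiled model with edgeai-tidl-tools 10_01
# (host emulation; delegate name and option keys follow the osrt_python
#  examples as I understand them, so treat this as an illustration).
import os
import numpy as np
import tflite_runtime.interpreter as tflite

artifacts = './face_detection_quant'   # artifacts_folder produced by the compile step
infer_delegate = [tflite.load_delegate('libtidl_tfl_delegate.so',
                                       {'artifacts_folder': artifacts})]
interpreter = tflite.Interpreter(model_path='./face_detection_front_128_integer_quant.tflite',
                                 experimental_delegates=infer_delegate)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
dummy = np.zeros(inp['shape'], dtype=inp['dtype'])   # a real 128x128 RGB face image in practice
interpreter.set_tensor(inp['index'], dummy)
interpreter.invoke()
outputs = [interpreter.get_tensor(o['index']) for o in interpreter.get_output_details()]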
Why does switching to a newer version of EdgeAI-TIDL-tools make the compilation succeed? (My understanding is that the Model Analyzer uses EdgeAI-TIDL-tools 09_02 or 10_00, since the SDK version is 9.2.)
Is there any way to compile this model successfully on the Model Analyzer?
Python code:
import sys
import time
import os
import cv2
import numpy as np
import tflite_runtime.interpreter as tflite
from utils import loggerWriter, plot_TI_performance_data, get_benchmark_output
from PIL import Image
import matplotlib.pyplot as plt

output_dir = 'face_detection_quant'
tflite_model_path = './face_detection_front_128_integer_quant.tflite'
debug_level = 0
num_bits = 8
accuracy = 1

compile_options = {
    'tidl_tools_path' : os.environ['TIDL_TOOLS_PATH'],
    'artifacts_folder' : output_dir,
    'tensor_bits' : num_bits,
    'accuracy_level' : accuracy,
    'debug_level' : debug_level,
    'advanced_options:calibration_frames' : 1,
    'advanced_options:calibration_iterations' : 3,
    'advanced_options:add_data_convert_ops' : 1,
}

# Clean out any previous artifacts before compiling
os.makedirs(output_dir, exist_ok=True)
for root, dirs, files in os.walk(output_dir, topdown=False):
    [os.remove(os.path.join(root, f)) for f in files]
    [os.rmdir(os.path.join(root, d)) for d in dirs]

# Load the TIDL import (compilation) delegate and create the interpreter;
# the error above is raised in this last line
tidl_delegate = [tflite.load_delegate(os.path.join(os.environ['TIDL_TOOLS_PATH'], 'tidl_model_import_tflite.so'), compile_options)]
interpreter = tflite.Interpreter(model_path=tflite_model_path, experimental_delegates=tidl_delegate)
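For completeness, this is the calibration step I intended to run after the interpreter is created (it never gets that far, since the error is raised in the constructor). The calibration input here is simplified to a dummy frame matching the model's input details; in the actual notebook I would feed a real preprocessed image.

# Calibration step that should follow interpreter creation (simplified:
# a dummy frame is used here instead of a real calibration image)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()[0]
calib_frame = np.zeros(input_details['shape'], dtype=input_details['dtype'])
interpreter.set_tensor(input_details['index'], calib_frame)
interpreter.invoke()   # with the import delegate, this runs TIDL calibration and writes the artifacts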
Thanks
Fumiya