Hi all,
I used the Docker-based setup on a Linux PC to compile the yolov5l model provided by the TI Model Zoo. The compilation succeeded, and I am able to run the model with the TIDL artifacts inside the Docker-based setup for the AM69A device on my Linux PC.
I then copied the artifacts folder to the cloud and tried to run the model with these artifacts on the AM69A device using the cloud service.
This is the main part of the code I am running:
import onnxruntime as rt

onnx_model_path = '/home/root/notebooks/custom_models/yolov5l6_640_ti_lite_47p1_65p6.onnx'

delegate_options = {}
so = rt.SessionOptions()
delegate_options['artifacts_folder'] = '/home/root/notebooks/custom-artifacts/yolov5l/'
delegate_options.update(optional_options)

EP_list = ['TIDLExecutionProvider', 'CPUExecutionProvider']
sess = rt.InferenceSession(onnx_model_path, providers=EP_list,
                           provider_options=[delegate_options, {}], sess_options=so)

input_details = sess.get_inputs()
output_details = sess.get_outputs()
I get the following errors when I try to run the model in the onnxrt environment:
RuntimeErrorTraceback (most recent call last)
/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py in __init__(self, path_or_bytes, sess_options, providers, provider_options)
    282         try:
--> 283             self._create_inference_session(providers, provider_options)
    284         except RuntimeError:

/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py in _create_inference_session(self, providers, provider_options)
    314         # initialize the C++ InferenceSession
--> 315         sess.initialize_session(providers, provider_options)
    316 

RuntimeError: std::exception

During handling of the above exception, another exception occurred:

AttributeErrorTraceback (most recent call last)
<ipython-input-9-4e829141b539> in <module>
      1 EP_list = ['TIDLExecutionProvider','CPUExecutionProvider']
----> 2 sess = rt.InferenceSession(onnx_model_path ,providers=EP_list, provider_options=[delegate_options, {}], sess_options=so)
      3 
      4 input_details = sess.get_inputs()
      5 output_details = sess.get_outputs()

/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py in __init__(self, path_or_bytes, sess_options, providers, provider_options)
    284         except RuntimeError:
    285             if self._enable_fallback:
--> 286                 print("EP Error using {}".format(self._providers))
    287                 print("Falling back to {} and retrying.".format(self._fallback_providers))
    288                 self._create_inference_session(self._fallback_providers, None)

AttributeError: 'InferenceSession' object has no attribute '_providers'
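For reference, here is a minimal sanity check (a sketch, not an official TI procedure) that can be run on the cloud device to confirm the artifacts folder was copied over correctly, since a missing or empty artifacts_folder is one possible cause of the TIDL provider failing during session creation. The path is taken from my snippet above; I am only checking that the folder exists and contains files, not validating their contents:

```python
import os

# Path from the session setup above.
artifacts_folder = '/home/root/notebooks/custom-artifacts/yolov5l/'

def list_artifacts(folder):
    """Return a sorted list of files in the artifacts folder, or [] if it
    is missing or empty."""
    if not os.path.isdir(folder):
        return []
    return sorted(os.listdir(folder))

files = list_artifacts(artifacts_folder)
if not files:
    print('Artifacts folder missing or empty:', artifacts_folder)
else:
    print('Found artifacts:')
    for name in files:
        print(' ', name)
```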
The model I picked is from here: https://github.com/TexasInstruments/edgeai-yolov5/blob/master/pretrained_models/models/detection/coco/edgeai-yolov5/yolov5l6_640_ti_lite_47p1_65p6.onnx.link
Thanks
Akhilesh