Part Number: PROCESSOR-SDK-AM62A
Hi TI Experts,
I am trying to compile the original DeeplabV3_resnet101 model from PyTorch after converting it to ONNX format with opset version 11, but I get the error shown in the log below. Could you please help me resolve it?
Regards,
-Abhy
abhy@JPN-1CZ247006Z:~/edgeai-tidl-tools/examples/osrt_python/ort$ python onnxrt_ep.py -c
Available execution providers : ['TIDLExecutionProvider', 'TIDLCompilationProvider', 'CPUExecutionProvider']
Running 1 Models - ['ss-ort-deeplabv3_v11']
Running_Model : ss-ort-deeplabv3_v11
Running shape inference on model ../../../models/public/DeeplabV3_v11.onnx
Process Process-1:
Traceback (most recent call last):
File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
self.run()
File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/home/abhy/edgeai-tidl-tools/examples/osrt_python/ort/onnxrt_ep.py", line 190, in run_model
sess = rt.InferenceSession(config['model_path'] ,providers=EP_list, provider_options=[delegate_options, {}], sess_options=so)
File "/home/abhy/.local/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 283, in __init__
self._create_inference_session(providers, provider_options)
File "/home/abhy/.local/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 310, in _create_inference_session
sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from ../../../models/public/DeeplabV3_v11.onnx failed:/home/a0496663/work/edgeaitidltools/rel90/onnx/onnxruntime_bit/onnxruntime/onnxruntime/core/graph/model.cc:111 onnxruntime::Model::Model(onnx::ModelProto&&, const PathString&, const IOnnxRuntimeOpSchemaRegistryList*, const onnxruntime::logging::Logger&) Unknown model file format version.