
Running `python3 onnxrt_ep.py -d` produces the following error:

Traceback (most recent call last):
  File "onnxrt_ep.py", line 295, in <module>
    run_model(model, mIdx)
  File "onnxrt_ep.py", line 190, in run_model
    sess = rt.InferenceSession(config['model_path'], providers=EP_list, sess_options=so)
  File "/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 283, in __init__
    self._create_inference_session(providers, provider_options)
  File "/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 315, in _create_inference_session
    sess.initialize_session(providers, provider_options)
TypeError: initialize_session(): incompatible function arguments. The following argument types are supported:
    1. (self: onnxruntime.capi.onnxruntime_pybind11_state.InferenceSession, arg0: List[str], arg1: List[Dict[str, str]], arg2: Set[str]) -> None
Invoked with: , ['CPUExecutionProvider'], [{}]
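The TypeError says the native `initialize_session` binding expects three arguments (a list of provider names, a list of provider-option dicts, and a set of disabled optimizers), while the Python wrapper invoked it with only two. That pattern usually indicates the onnxruntime Python front end and the compiled pybind module come from different versions (on TI devices, mixing a stock pip wheel with the TIDL-patched onnxruntime can cause this). A minimal stdlib-only sketch of the mismatch, with illustrative names:

```python
# Stand-in for the native pybind binding: it requires three arguments,
# matching the supported signature shown in the TypeError message.
def initialize_session(providers, provider_options, disabled_optimizers):
    return providers, provider_options, disabled_optimizers

# A Python wrapper from a different onnxruntime version calls it with
# only two arguments, raising the same kind of TypeError as the traceback.
try:
    initialize_session(['CPUExecutionProvider'], [{}])
except TypeError as exc:
    print('TypeError:', exc)
```

If this is the cause, a reasonable first check is whether the onnxruntime wheel that shipped with the TIDL SDK is the one actually installed, rather than a stock wheel pulled in later by pip.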

if args.disable_offload:
    EP_list = ['CPUExecutionProvider']
    pdb.set_trace()
    sess = rt.InferenceSession(config['model_path'], providers=EP_list, sess_options=so)
    #pdb.set_trace()
elif args.compile:
    EP_list = ['TIDLCompilationProvider', 'CPUExecutionProvider']
    sess = rt.InferenceSession(config['model_path'], providers=EP_list, provider_options=[delegate_options, {}], sess_options=so)
else:
    EP_list = ['TIDLExecutionProvider', 'CPUExecutionProvider']
    sess = rt.InferenceSession(config['model_path'], providers=EP_list, provider_options=[delegate_options, {}], sess_options=so)