Tool/software:
I'm trying to run a custom ONNX model with a program built by adapting the examples available at https://github.com/TexasInstruments/edgeai-tidl-tools/tree/master/examples/osrt_cpp/ort.
This runs fine with no acceleration ("-a 0"), but when running with "-a 1" it throws the following exception:
"
terminate called after throwing an instance of 'Ort::Exception'
what(): /root/onnxruntime/onnxruntime/core/providers/tidl/tidl_execution_provider.cc:94 onnxruntime::TidlExecutionProvider::TidlExecutionProvider(const onnxruntime::TidlExecutionProviderInfo&) status == true was false.
Aborted (core dumped)
"
This is the same behavior I encountered when running on x86 (I believe I didn't pass the flag there, so it defaulted to 1).
I also tried exporting this variable before running the application: export TIDL_RT_ONNX_VARDIM=1
I was, however, able to run this same model with acceleration using the Python API.
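For context, the working Python run sets up the session roughly along the lines of the edgeai-tidl-tools Python examples; the sketch below uses placeholder paths and only a minimal set of delegate options (additional options such as tidl_tools_path may be needed depending on the setup), so it is an approximation rather than the exact code:

import numpy as np
import onnxruntime as rt

# TIDL delegate options; artifacts_folder points at the compiled model
# artifacts produced during model compilation (placeholder path).
delegate_options = {
    "artifacts_folder": "/path/to/model-artifacts",  # placeholder
    "debug_level": 0,
}

so = rt.SessionOptions()

# TIDLExecutionProvider handles the offloaded subgraphs,
# CPUExecutionProvider is the fallback for unsupported layers.
sess = rt.InferenceSession(
    "/path/to/model.onnx",  # placeholder
    providers=["TIDLExecutionProvider", "CPUExecutionProvider"],
    provider_options=[delegate_options, {}],
    sess_options=so,
)

# Input name/shape depend on the model; dummy data used here for illustration.
inp = sess.get_inputs()[0]
dummy = np.zeros([d if isinstance(d, int) else 1 for d in inp.shape], dtype=np.float32)
outputs = sess.run(None, {inp.name: dummy})

This runs with acceleration, which is why I suspect the model artifacts themselves are fine and the problem is on the C++ side.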