Hello,
I am trying to perform inference on a TI SoC using the OSRT approach with a YOLOX model. I have used a model from yolox-ti-edgeai, and I have also tried models from the model zoo. I was able to compile the ONNX models with edgeai-tidl-tools on my host PC. However, when I execute the models on the target, I get the following error:
2024-06-11 16:15:49.917614036 [E:onnxruntime:, sequential_executor.cc:494 ExecuteKernel] Non-zero status code returned while running TIDL_0 node. Name:'TIDLExecutionProvider_TIDL_0_0' Status Message: /onnx/onnxruntime/onnxruntime/core/framework/execution_frame.cc:170 onnxruntime::common::Status onnxruntime::IExecutionFrame::GetOrCreateNodeOutputMLValue(int, int, const onnxruntime::TensorShape*, OrtValue*&, const onnxruntime::Node&) shape && tensor.Shape() == *shape was false. OrtValue shape verification failed. Current shape:{1,3549,6} Requested shape:{1,1,1,1,3549,6}
TIDL_RT_OVX: ERROR: Verifying TIDL graph ... Failed !!!
TIDL_RT_OVX: ERROR: Verify OpenVX graph failed
************ TIDL_subgraphRtCreate done ************
[ERROR] Unable to run session: Non-zero status code returned while running TIDL_0 node. Name:'TIDLExecutionProvider_TIDL_0_0' Status Message: /onnx/onnxruntime/onnxruntime/core/framework/execution_frame.cc:170 onnxruntime::common::Status onnxruntime::IExecutionFrame::GetOrCreateNodeOutputMLValue(int, int, const onnxruntime::TensorShape*, OrtValue*&, const onnxruntime::Node&) shape && tensor.Shape() == *shape was false. OrtValue shape verification failed. Current shape:{1,3549,6} Requested shape:{1,1,1,1,3549,6}
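One thing I noticed: the two shapes in the error differ only by leading singleton dimensions, so the element counts are identical. A minimal numpy sketch (shapes copied from the log above) illustrating this:

```python
import numpy as np

# Shapes copied from the error message above.
current_shape = (1, 3549, 6)             # shape the TIDL subgraph actually returns
requested_shape = (1, 1, 1, 1, 3549, 6)  # shape onnxruntime expects for the node output

# Element counts match, so the tensors differ only in rank
# (three extra leading singleton dimensions), not in data size.
assert int(np.prod(current_shape)) == int(np.prod(requested_shape))

# A dummy tensor of the produced shape can be reshaped losslessly
# to the expected one, which suggests a shape-metadata mismatch
# between the compiled artifacts and the model graph rather than
# wrong output data.
dummy = np.zeros(current_shape, dtype=np.float32)
print(dummy.reshape(requested_shape).shape)  # (1, 1, 1, 1, 3549, 6)
```

So it looks like a rank mismatch between what the compiled artifacts declare and what the ONNX graph expects, not a data-size problem.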
Could someone point out the potential causes, or suggest steps to troubleshoot this problem?
Thanks