
TDA4VH-Q1: Errors reported when deploying an ONNX model on TDA4VH

Part Number: TDA4VH-Q1

We followed the official documentation in GitHub - TexasInstruments/edgeai-tidl-tools (Edgeai TIDL Tools and Examples - tools and examples for the Deep Learning Runtime (DLRT) offering in TI's edge AI solutions) for installation, but encountered the following issues during the process:

Please refer to the attachment for installation and detailed execution steps: output_log.txt

The following is our operating environment:

  Question 1: In the edgeai-tidl-tools directory, when we run source ./scripts/run_python_examples.sh, we get the following errors:

google.protobuf.message.DecodeError: Error parsing message with type 'onnx.ModelProto'

onnxruntime.capi.onnxruntime_pybind11_state.InvalidProtobuf: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from ../../../models/public/resnet18_opset9.onnx failed:Protobuf parsing failed.

Error : Error Code = <ERR_UNSUPPORTED_DATA_TYPE>

Could not open /home/ubuntu/Documents/edgeai-tidl-tools/edgeai-tidl-tools/model-artifacts/cl-dlr-tflite_inceptionnetv3/tempDir/subgraph0_net/perfSimInfo.bin
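The INVALID_PROTOBUF error usually means the .onnx file on disk is not a valid model at all: a truncated download, or a Git LFS pointer/HTML error page saved in place of the real binary. A minimal stdlib-only sanity check (the path from the log above is only an example; adjust it for your setup):

```python
# Flag the two most common causes of "Error parsing message with type
# 'onnx.ModelProto'": a Git LFS pointer file or a saved HTML error page
# sitting where the real binary model should be.
from pathlib import Path

def looks_like_real_onnx(path: str) -> bool:
    data = Path(path).read_bytes()[:200]
    if data.startswith(b"version https://git-lfs"):
        return False  # Git LFS pointer text, not the actual model bytes
    if data.lstrip().startswith(b"<"):
        return False  # HTML error page downloaded instead of the model
    return len(data) > 0  # empty file is also invalid

# Example (path taken from the error log above):
# looks_like_real_onnx("models/public/resnet18_opset9.onnx")
```

If this returns False, re-download the model (the setup script in edgeai-tidl-tools fetches the example models) before re-running the compilation step.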

Question 2: We copied model-artifacts/models/ to the TDA4VH-Q1 development board. When we run examples/osrt_python/ort/onnxrt_ep.py, we encounter these errors:

 output_log(1).txt
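Since the host-side log already shows a missing perfSimInfo.bin, it is worth verifying that the artifacts copied to the board are complete before debugging the runtime itself. A stdlib-only sketch (the directory layout is an assumption based on the default model-artifacts structure; adjust the path for your model):

```python
# Walk a copied model-artifacts directory and report files that are
# empty, which typically indicates an interrupted copy or a failed
# compilation step on the host.
from pathlib import Path

def missing_artifacts(artifacts_dir: str) -> list[str]:
    root = Path(artifacts_dir)
    if not root.is_dir():
        return [artifacts_dir]  # directory itself is absent
    problems = []
    for p in sorted(root.rglob("*")):
        if p.is_file() and p.stat().st_size == 0:
            problems.append(str(p))  # zero-byte file: incomplete copy
    return problems

# Example (hypothetical path on the target board):
# missing_artifacts("model-artifacts/cl-ort-resnet18-v1")
```

An empty result means the copied files at least have content; any listed path should be regenerated on the host and copied again.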