This thread has been locked.

"Unknown model file format version" error for a valid ONNX model file

Other Parts Discussed in Thread: TDA4VM

Hi,

I am using TDA4VM (J721E) board with SDK version 09_00_00_00.

While compiling a custom model (ResNet-18-like), I got the following error:

sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from /home/root/examples/osrt_python/ort/model_r_18_edited.onnx failed:/home/a0496663/work/edgeaitidltools/rel90/onnx/onnxruntime_bit/onnxruntime/onnxruntime/core/graph/model.cc:111 onnxruntime::Model::Model(onnx::ModelProto&&, const PathString&, const IOnnxRuntimeOpSchemaRegistryList*, const onnxruntime::logging::Logger&) Unknown model file format version.

Some background: the original ONNX model's last layer is a BatchNorm1d (following a linear layer, which is exported as Gemm in ONNX). I tried to compile that model, but BatchNorm1d is not supported (according to the documentation, only BatchNorm2d is), so it does not produce output of the required dimension (expected 1x512, but I get only a single value). I have only the ONNX file, so I do not have the liberty to change the architecture and reshape the linear layer's output to make it consumable by BatchNorm2d, which is supported for conversion. To work around this, I used ONNX GraphSurgeon to remove the last BatchNorm layer and then tried to compile the edited model; that is when I got the error above. The edited ONNX model works fine on other systems with the CPU execution provider, so I suspect there is some issue when the ONNX Runtime session is created with TIDLCompilationProvider.

Here is a Google Drive link to the ONNX files:
https://drive.google.com/drive/folders/1cekVdADFAKJVcGsTYmy-yKY79z15X4wZ?usp=sharing

Kindly let me know where the issue might be and any possible workaround for it.

Thanks,
Sourabh