Environment:
SDK 8.0 and SDK 8.2
ONNX model: up_512.onnx (MD5: 83053e0035e9eeeaf0b47c82ced81865)
TIDL import config: tidl_import.txt (MD5: aaef728805795a7d1ea498263d37d901)
Data: demo.tar.xz
Problem:
An output branch is lost after quantization. The original model has 4 output tensors (out00, out0, out10, and out1), but the quantized model has only 3 (out0, out10, and out1).
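For reference, a minimal sketch of how the original model's declared outputs were checked (assuming the onnx Python package is installed; the expected names are taken from the description above):

import onnx

# Load the original, pre-quantization model and list its declared graph outputs.
model = onnx.load("up_512.onnx")
output_names = [o.name for o in model.graph.output]
print(output_names)  # expected: ['out00', 'out0', 'out10', 'out1']

The original up_512.onnx reports all four outputs this way; only the model produced by the TIDL import/quantization step is missing out00.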