Hi,
We are trying to convert an SSD ONNX model with the model import tool using the SSD meta arch configuration, but we are running into the following issue:
Warning : Couldn't find corresponding ioBuf tensor for onnx tensor with matching name
Segmentation fault (core dumped)
We successfully converted and ran the model without the meta arch. When we add the meta arch to the importer configuration, the detection output layer is apparently not configured properly: it does not seem to be connected to a valid output tensor. The segmentation fault presumably occurs because the output tensor is empty and invalid.
Here is the end of the net_log.txt file:
57|TIDL_ConvolutionLayer |BoxPredictor_5/ClassPredictor/BiasAdd | 0| 1| 1| 55 x x x x x x x | 57 | 1 128 3 4 | 1 42 3 4 | 1612800 |
58|TIDL_DetectionOutputLayer |tidl_ssd_detection_output_layer | 0| 12| 1| 56 53 45 36 27 18 57 54 46 37 28 19 | 58 | 1 24 3 4 | 1 1 1 704 | 0 |
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Total Giga Macs : 18.6632
We followed the instructions on this page:
We couldn't find any examples in the SDK (ti-processor-sdk-rtos-j721e-evm-08_01_00_13/tidl_j7_08_01_00_05) containing a meta arch prototxt for ONNX SSD.
Could you provide an example meta arch prototxt for ONNX SSD, along with the corresponding .onnx model file and importer config, so we can see how the detection output layer is connected to its input and output layers? For instance, you could give us the config files required to import the ONR-OD-8020-ssd-lite-mobv2-coco-512x512 model from the model zoo.
I have attached the config file we are using.
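For reference, this is the general shape of our importer config as it relates to the meta arch (paths shortened; the metaArchType value is our reading of the documentation and may well be where we are going wrong):

```
# TIDL import config fragment (sketch of what we are using)
modelType           = 2                        # 2 = ONNX, per the import docs
inputNetFile        = "ssd_model.onnx"
outputNetFile       = "ssd_model_net.bin"
outputParamsFile    = "ssd_model_params.bin"
metaArchType        = 3                        # value we assume maps to TIDL SSD
metaLayersNamesList = "ssd_meta_arch.prototxt" # our meta arch prototxt
```

Without the metaArchType/metaLayersNamesList lines, the import and inference work; with them, we get the ioBuf warning and the crash, which is why we suspect the tensor names referenced in our prototxt do not match the ONNX graph.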
Thanks