Hi,
I am having an issue with https://dev.ti.com/edgeaisession/ and the benchmarker.
I have developed a quantization method for neural networks using ONNX, and I wanted to test it with the TIDLExecutionProvider since I need to deploy it for one of my customers.
I used the custom-model-onnx notebook and uploaded the following model, based on MobileNet.
As soon as I try to load it with TI's inference engine, the Jupyter kernel dies.
I then tried a non-quantized model, with no more success: the kernel dies as soon as I use the TIDLExecutionProvider.
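For reference, this is roughly what I run in the notebook before the kernel dies. The model path and the provider-selection helper are placeholders from my own setup, not official TIDL settings:

```python
def choose_providers(available):
    """Prefer the TIDL EP when the runtime exposes it, else plain CPU."""
    if "TIDLExecutionProvider" in available:
        return ["TIDLExecutionProvider", "CPUExecutionProvider"]
    return ["CPUExecutionProvider"]

def make_session(model_path):
    # TI's build of onnxruntime is what the EdgeAI Cloud notebook provides.
    import onnxruntime as ort
    providers = choose_providers(ort.get_available_providers())
    # The kernel dies on this call whenever TIDLExecutionProvider is selected;
    # with CPUExecutionProvider alone, the session loads fine.
    return ort.InferenceSession(model_path, providers=providers)
```

With `providers=["CPUExecutionProvider"]` only, the same model loads and runs, so the crash seems specific to the TIDL execution provider.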
I don't really know how to move forward.
I can share the model for anyone to test, but I am not sure how to do it here.
Thanks