TDA4VM: Quantized PyTorch MobileNetV2 Model Support

Part Number: TDA4VM

Hi,

When I try to run a quantized PyTorch MobileNetV2 model on the TI edgeAI cloud, the kernel dies. I tried attaching the ONNX file to this post, but it can't be uploaded for some reason...

Does TIDL support models that have been converted to INT8 using the PyTorch Quantization API (https://pytorch.org/docs/stable/quantization.html#general-quantization-flow)?
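For reference, here is a minimal sketch of the eager-mode post-training static quantization flow described in that documentation, which is the flow I followed. To keep it short, this uses a small stand-in model (the `TinyNet` class is hypothetical, only for illustration) rather than the full MobileNetV2:

```python
import torch
import torch.nn as nn


class TinyNet(nn.Module):
    """Toy stand-in for MobileNetV2 with the QuantStub/DeQuantStub
    wrappers that eager-mode static quantization requires."""

    def __init__(self):
        super().__init__()
        self.quant = torch.ao.quantization.QuantStub()
        self.conv = nn.Conv2d(3, 8, kernel_size=3)
        self.relu = nn.ReLU()
        self.dequant = torch.ao.quantization.DeQuantStub()

    def forward(self, x):
        x = self.quant(x)        # float -> quantized at the input
        x = self.relu(self.conv(x))
        return self.dequant(x)   # quantized -> float at the output


model = TinyNet().eval()

# Fuse conv+relu so they quantize as a single module.
torch.ao.quantization.fuse_modules(model, [["conv", "relu"]], inplace=True)

# Standard x86 backend config; "qnnpack" would be the ARM choice.
model.qconfig = torch.ao.quantization.get_default_qconfig("fbgemm")
torch.ao.quantization.prepare(model, inplace=True)

# Calibration pass so the observers collect activation ranges.
with torch.no_grad():
    model(torch.randn(1, 3, 224, 224))

# Replace float modules with their INT8 counterparts.
torch.ao.quantization.convert(model, inplace=True)

out = model(torch.randn(1, 3, 224, 224))

# The subsequent step, which produced the ONNX file I tried to run, was:
# torch.onnx.export(model, torch.randn(1, 3, 224, 224),
#                   "model_int8.onnx", opset_version=13)
```

The exported ONNX from this flow is what I loaded on the edgeAI cloud when the kernel died, so I would like to confirm whether TIDL accepts models quantized this way at all.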

Thank you,

Isidora Radovanovic