I want to know whether we can run inference on a custom, downloaded YOLOv3-Tiny model.
Hi,
You can compile the model and run inference using our edgeai-tidl-tools repository here: https://github.com/TexasInstruments/edgeai-tidl-tools
Please go through the documentation in that repository.
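For reference, a minimal sketch of the typical compile-and-run flow with edgeai-tidl-tools, based on the repo's documented setup. The exact script names, the `SOC` value, and the example paths shown here are assumptions that can differ between releases, so verify them against the repo's README before running:

```shell
# Clone the TI edge AI TIDL tools repository
git clone https://github.com/TexasInstruments/edgeai-tidl-tools.git
cd edgeai-tidl-tools

# Select the target device (value varies by SoC; check the README for yours)
export SOC=am68pa

# Install dependencies, runtimes, and tools
source ./setup.sh

# Example open-source-runtime (ONNX Runtime) scripts; the path and the
# -c flag for model compilation are indicative and may differ by release
cd examples/osrt_python/ort
python3 onnxrt_ep.py -c   # compile/calibrate the model into TIDL artifacts
python3 onnxrt_ep.py      # run inference with the compiled artifacts
```

For a custom YOLOv3-Tiny, you would typically also register the model (input resolution, preprocessing, and detection post-processing settings) in the examples' model configuration before compiling; see the repo documentation for the exact mechanism in your release.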
Thank You