Hello Team,
In reference to the related question (e2e.ti.com/.../tda4vm-mnist-model-import-error-tidl), I was able to successfully import and infer the MNIST model after upgrading my SDK to version 8.
Now, when I try to run inference using the TIOVX framework, I observe that the inference results do not match: the predicted class label is the same for all digits.
Please find the inference results from TIDL below.
Kindly check whether they are correct and advise me where I am going wrong.
Also, please connect with me regarding TI representative support for my organisation.
Thanks and Regards,
Padmasree N.
Can you please share the application that you are using for the "TIOVX framework based test"?
You can also try TFLite/ONNX Runtime for running this model by referring to the repository below:
https://github.com/TexasInstruments/edgeai-tidl-tools
Note: please check out the branch/tag of this repo matching the SDK version that you are using on the EVM.
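For reference, running a model through the TFLite runtime with TIDL offload typically looks like the sketch below, loosely following the examples in edgeai-tidl-tools. The delegate option keys (e.g. "artifacts_folder") and the delegate library name are assumptions here; check the repo's examples for the exact keys used by your SDK version.

```python
import numpy as np

def build_delegate_options(artifacts_dir):
    # Options handed to the TIDL TFLite delegate; the key name below is
    # illustrative -- consult edgeai-tidl-tools for your SDK version.
    return {"artifacts_folder": artifacts_dir}

def run_model(model_path, input_data, artifacts_dir=None):
    # Lazy import so this sketch can be read on a host without tflite_runtime.
    import tflite_runtime.interpreter as tflite

    delegates = []
    if artifacts_dir is not None:
        # On the EVM, offload supported layers to TIDL via the delegate.
        delegates = [tflite.load_delegate("libtidl_tfl_delegate.so",
                                          build_delegate_options(artifacts_dir))]

    interpreter = tflite.Interpreter(model_path=model_path,
                                     experimental_delegates=delegates)
    interpreter.allocate_tensors()

    inp = interpreter.get_input_details()[0]
    interpreter.set_tensor(inp["index"], input_data.astype(np.float32))
    interpreter.invoke()

    out = interpreter.get_output_details()[0]
    return interpreter.get_tensor(out["index"])
```

Running the same model once with and once without the delegate (artifacts_dir=None) is a quick way to see whether the accelerated path is where the wrong labels come from.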
Hello Kumar,
Thanks for your reply!
I used the TIDL import and inference tools to validate my network; I followed the links below for the same.
I have a few questions regarding TIDL-OSRT.
1) TIDL-RT uses the OpenVX framework, which takes care of hardware acceleration. How does the same happen with TIDL-OSRT?
2) Do you recommend TIDL-OSRT as the final, complete embedded solution for a production product?
Kindly resolve my queries.
Regards,
Padmasree N.
TIDL-OSRT internally uses TIDL-RT to accelerate all supported layers and exposes open-standard APIs for ease of use. It also supports running unsupported layers on the Arm core.
Refer to the table on the page below for more details.
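The heterogeneous execution described above can be pictured as a simple graph split: layer types TIDL supports go to the accelerator, everything else falls back to the Arm core. This is a toy sketch of the idea, NOT TIDL's actual partitioning logic, and the supported-layer set below is invented for illustration.

```python
# Hypothetical set of layer types the accelerator handles; the real list
# comes from the TIDL supported-layers table referenced above.
SUPPORTED_ON_TIDL = {"Conv", "Relu", "MaxPool", "Gemm", "Softmax"}

def partition(layers):
    """Split a list of layer types into TIDL-offloaded and Arm-fallback groups."""
    tidl = [l for l in layers if l in SUPPORTED_ON_TIDL]
    arm = [l for l in layers if l not in SUPPORTED_ON_TIDL]
    return tidl, arm
```

The practical consequence is that a model containing unsupported layers still runs end to end under OSRT, just with those layers executing more slowly on the CPU.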
Hello Kumar,
Thanks for your reply! I have referred to the docs you shared, but I am still not clear about my query.
Please do answer my question (2) above.
Regards,
Padmasree N.
Yes, we recommend users use OSRT for a better user experience and richer neural-network operator coverage.
If your application's safety requirements can NOT be met with this, then we recommend using the TIDL-OSRT Python interface for model compilation and validation, and then using the compiled artifacts with TIDL-RT for final deployment on the device.
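The compile-on-host step of that flow is usually done with ONNX Runtime and the TIDL compilation provider, along the lines of the sketch below. The provider names and option keys mirror the edgeai-tidl-tools examples but can differ between SDK versions, so treat them as assumptions to verify against the repo.

```python
def make_providers(tidl_tools_path, artifacts_dir):
    # Provider list and per-provider options for the compilation session.
    # Option keys here ("tidl_tools_path", "artifacts_folder") are the ones
    # used in edgeai-tidl-tools examples; confirm them for your SDK version.
    compile_options = {
        "tidl_tools_path": tidl_tools_path,
        "artifacts_folder": artifacts_dir,
    }
    providers = ["TIDLCompilationProvider", "CPUExecutionProvider"]
    return providers, [compile_options, {}]

def compile_model(model_path, tidl_tools_path, artifacts_dir):
    # Requires onnxruntime built with TIDL support (host PC setup from
    # edgeai-tidl-tools); running one calibration inference through this
    # session produces the deployable artifacts in artifacts_dir.
    import onnxruntime as ort
    providers, provider_options = make_providers(tidl_tools_path, artifacts_dir)
    return ort.InferenceSession(model_path,
                                providers=providers,
                                provider_options=provider_options)
```

The artifacts written to artifacts_dir are then what TIDL-RT loads on the device for the final production deployment.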
Hello Kumar,
Thanks for your reply!
For model compilation and validation, I used the TIDL import and inference tools to validate my network; I followed the links below for the same.
Kindly let me know if this is correct, or whether I need to use only TIDL-OSRT for this.
Regards,
Padmasree N.
It is not mandatory to use TIDL-OSRT, but debugging functional issues would be easier with TIDL-OSRT.