Hello Team,
Please resolve my query below.
Pre- and post-processing are done in TIDL-RT using OpenVX.
How is the same done in TIDL-OSRT? Will the EdgeAI tools take care of these things?
Thanks in Advance!
Regards,
Padmasree N.
Hi Padmasree,
Yes, edgeai-tidl-tools does take care of pre- and post-processing for both the Python and C APIs. This support is added for a few network types as part of the examples provided; however, it can be updated to support other types of pre/post-processing as required.
Let me know if there are any further questions.
Regards,
Anand
Hello Anand,
Thanks for your reply!
Please share the pre- and post-processing examples in the EdgeAI tools.
Kindly answer the queries below.
1) Does TIDL-OSRT require a separate SDK and a separate board? Can it be executed on a TDA4x board?
2) Are YOLOv5, Yolact, and EAST models supported in TIDL-RT (provided the model has only TIDL-supported layers)?
3) Will the tivx_utils_bmp_file_read() function work on grayscale BMP images?
Regards,
Padmasree N.
Hi Padmasree,
As an example of Python post-processing, you can refer to the following function: https://github.com/TexasInstruments/edgeai-tidl-tools/blob/1942c3f611839b38fc0fbf83b3eb22e7064eeb3f/examples/osrt_python/common_utils.py#L295
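Just to give a flavour of what such post-processing boils down to, here is a minimal, purely illustrative sketch for a classification output. This is not the code from common_utils.py; a detection model would instead need box decoding and similar steps:

```python
# Illustrative classification post-processing sketch (not the repository code):
# softmax over the raw output tensor, then the top-k class indices.
import numpy as np

def top_k_classes(raw_output, k=5):
    scores = np.asarray(raw_output, dtype=np.float32).squeeze()
    probs = np.exp(scores - scores.max())   # numerically stable softmax
    probs /= probs.sum()
    top = np.argsort(probs)[::-1][:k]
    return [(int(i), float(probs[i])) for i in top]
```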
Python pre-processing is done as part of the following function (it handles image resizing, mean/std manipulation, etc.): https://github.com/TexasInstruments/edgeai-tidl-tools/blob/1942c3f611839b38fc0fbf83b3eb22e7064eeb3f/examples/osrt_python/ort/onnxrt_ep.py#L89
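A minimal sketch of that kind of pre-processing could look like the following; the mean/scale values and the function name are illustrative placeholders, not the exact values used in the example:

```python
# Illustrative pre-processing sketch (not the repository code):
# resize, RGB mean/scale normalization, and NCHW layout with a batch dim.
import numpy as np
from PIL import Image

def preprocess(image_path, size=(224, 224),
               mean=(123.675, 116.28, 103.53),
               scale=(0.017125, 0.017507, 0.017429)):
    img = Image.open(image_path).convert('RGB').resize(size)
    data = np.asarray(img, dtype=np.float32)                       # H x W x C, RGB
    data = (data - np.asarray(mean, dtype=np.float32)) * np.asarray(scale, dtype=np.float32)
    return data.transpose(2, 0, 1)[np.newaxis, ...]                # 1 x C x H x W
```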
For C++ examples, you may refer to the pre-process and post-process folders here: https://github.com/TexasInstruments/edgeai-tidl-tools/tree/master/examples/osrt_cpp
Please note that these are just some examples of pre- and post-processing provided as part of the demos; you may have to refer to them and update them to suit your own pre/post-processing requirements.
Regarding follow-up questions:
1) TIDL-OSRT can be executed on TDA4x boards, with the same SDK.
2) Yes, any model that has all TIDL-supported layers will be supported on TIDL-OSRT. Even if the model has some unsupported layers, it will still work on TIDL-OSRT: the TIDL-supported layers are delegated to the accelerator, and the unsupported layers are delegated to the Arm core using the native runtime inference libraries for execution (see the first sketch after this list).
3) I did not quite get this question. In case you are using the Python examples outlined in https://github.com/TexasInstruments/edgeai-tidl-tools, you can implement the file read in the Python/C++ examples and should not need to add processing in TIOVX (see the second sketch below).
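Regarding point 2, below is a rough sketch of how this delegation is set up in the ONNX Runtime flow. The provider names follow the edgeai-tidl-tools examples, but the model path, artifacts folder, and option keys here are placeholders you would adapt to your own setup and SDK version:

```python
# Sketch of OSRT inference with TIDL offload (based on the edgeai-tidl-tools
# ONNX Runtime examples; paths and options below are placeholders).
import onnxruntime as rt

delegate_options = {
    'artifacts_folder': './model-artifacts/my_model',  # compiled TIDL artifacts (assumed path)
}

# Layers supported by TIDL run on the accelerator via TIDLExecutionProvider;
# anything unsupported falls back to CPUExecutionProvider on the Arm core.
sess = rt.InferenceSession(
    'model.onnx',
    providers=['TIDLExecutionProvider', 'CPUExecutionProvider'],
    provider_options=[delegate_options, {}],
    sess_options=rt.SessionOptions())

input_name = sess.get_inputs()[0].name
# outputs = sess.run(None, {input_name: preprocessed_input})  # see the pre-processing sketch above
```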
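Regarding point 3, reading a grayscale BMP on the Python side is straightforward; here is a hypothetical example using Pillow (the filename is a placeholder):

```python
# Hypothetical grayscale BMP read on the Python side (no TIOVX call needed);
# 'input.bmp' is a placeholder filename.
import numpy as np
from PIL import Image

gray = np.asarray(Image.open('input.bmp').convert('L'), dtype=np.uint8)  # H x W, single channel
print(gray.shape, gray.dtype)
```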
Regards,
Anand
Hello Anand,
Thank you for your detailed reply!
I have a few more queries.
1) TIDL-RT uses the OpenVX framework, which takes care of hardware acceleration. How does the same happen with TIDL-OSRT?
2) Do you recommend TIDL-OSRT for a final, complete embedded product to be used in production?
Kindly resolve the above queries.
Thanks in Advance!
Regards,
Padmasree N.
Hi Padmasree,
I see Kumar has already responded to this query here: https://e2e.ti.com/support/processors-group/processors/f/processors-forum/1197274/tda4vm-mnist-model---wrong-inference-with-openvx/4533101#4533101
Hope the response helps.
Regards,
Anand