
J742S2XH01EVM: Cpp Inference Examples

Part Number: J742S2XH01EVM
Other Parts Discussed in Thread: AM69A

Hi Team,

Posting on behalf of our customer.

I've been compiling the C++ examples provided by edge-ai-tools, but I can't get them to run on the board. When I run `/tidl_classification`, for example, I get an OpenCV error in the `resize` function.

I have also tried to create my own inference examples. I was able to do this in Python because the `onnxruntime-tidl` SDK includes the `TIDLExecutionProvider`. The C++ package, however, does not include the provider, which is why the C++ examples use the `itidl_rt.h` library directly instead of going through `onnxruntime` as in Python. I would like to know whether it is possible to add the `TIDLExecutionProvider` to the C++ `onnxruntime` as well.
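For reference, this is roughly the Python pattern that works for us (a minimal sketch: `MODEL_PATH`, `ARTIFACTS_DIR`, and the dummy input are placeholders for our actual setup, and the `artifacts_folder` provider option reflects how we compiled the model artifacts):

```python
import numpy as np
import onnxruntime as rt

# Placeholders for our actual model and the artifacts produced during model compilation
MODEL_PATH = "model.onnx"
ARTIFACTS_DIR = "./model-artifacts"

# Offload supported subgraphs through TIDL, fall back to CPU for anything else
providers = ["TIDLExecutionProvider", "CPUExecutionProvider"]
provider_options = [{"artifacts_folder": ARTIFACTS_DIR}, {}]

session = rt.InferenceSession(
    MODEL_PATH,
    providers=providers,
    provider_options=provider_options,
)

# Dummy input just to illustrate the call; our real code feeds a preprocessed image
input_meta = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in input_meta.shape]
dummy_input = np.zeros(shape, dtype=np.float32)

outputs = session.run(None, {input_meta.name: dummy_input})
print([o.shape for o in outputs])
```

The goal would be to do the equivalent from the C++ `onnxruntime` API, i.e. create a session with the TIDL provider registered, rather than calling `itidl_rt.h` directly.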

Regards,

Danilo