Part Number: PROCESSOR-SDK-AM68A
Tool/software:
I am trying to launch the example Edge AI GStreamer applications in SDK 10 and am running into a problem. With the default pre-built image from https://www.ti.com/tool/PROCESSOR-SDK-AM68A, I can boot the system and run the demo applications by going to /opt/edgeai-gst-apps/apps_python and running python3 app_edgeai.py ../configs/<some_config>.yaml, and everything works as expected.
However, when I build the prepared Docker image (by running ./docker_build.sh in /opt/edgeai-gst-apps/docker) and then run that container, not all of the models work. For example, I can run TFL-CL-0000-mobileNetV1-mlperf, but not ONR-CL-6360-regNetx-200mf. The problem seems to affect all models that run through onnxruntime; with each of them I get the following output:
[docker] root@am68-sk:/opt/edgeai-gst-apps/apps_python# python3 app_edgeai.py ../configs/image_classification.yaml
libtidl_onnxrt_EP loaded 0x154c95a0
Segmentation fault (core dumped)
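As a first diagnostic step inside the container, one could check whether onnxruntime itself imports cleanly and which execution providers it registers. This is only a sketch: the provider name "TIDLExecutionProvider" is my assumption about how TI's onnxruntime fork registers the TIDL EP, not something confirmed by the log above.

```python
# Minimal onnxruntime sanity check (sketch; intended to be run inside the
# container). "TIDLExecutionProvider" is an assumed name for the TIDL
# execution provider registered by TI's onnxruntime fork.
try:
    import onnxruntime as ort

    providers = ort.get_available_providers()
    print("onnxruntime version:", ort.__version__)
    print("available providers:", providers)
    if "TIDLExecutionProvider" not in providers:
        print("TIDL EP not registered; ONR models could not be offloaded")
except ImportError as exc:
    print("onnxruntime not importable:", exc)
```

If the import or provider listing already crashes, that would narrow the segfault down to the onnxruntime installation in the container rather than to the individual models.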
I also observe the same behavior when using apps_cpp instead of apps_python, and on both SDK versions 10.00 and 10.01. Is there a workaround that would allow me to launch ONR models from inside the Docker container?