
PROCESSOR-SDK-AM62A: Setup error in Docker container

Part Number: PROCESSOR-SDK-AM62A

Hello,
I'm trying to set up an environment to build a custom model for the AM62A.
https://github.com/TexasInstruments/edgeai-tidl-tools/tree/09_00_00_06

When I run "source ./setup.sh" in the Docker container, I get the following error.

ubuntu@ubuntu-virtual-machine:~/edgeai-tidl-tools$ sudo docker run -it --shm-size=4096m --mount source=$(pwd),target=/home/root,type=bind x86_ubuntu_22
root@3a3ab7fe5081:/# cd /home/root/
root@3a3ab7fe5081:/home/root# export SOC=am62a
root@3a3ab7fe5081:/home/root#  source ./setup.sh 
X64 Architecture
Installing python packages...
Collecting pybind11[global]
  Downloading pybind11-2.11.1-py3-none-any.whl (227 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 227.7/227.7 KB 1.7 MB/s eta 0:00:00
Collecting pybind11-global==2.11.1
  Downloading pybind11_global-2.11.1-py3-none-any.whl (412 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 412.2/412.2 KB 5.0 MB/s eta 0:00:00
Installing collected packages: pybind11-global, pybind11
Successfully installed pybind11-2.11.1 pybind11-global-2.11.1
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Collecting https://github.com/TexasInstruments/onnx/archive/tidl-j7.zip (from -r ./requirements_pc.txt (line 4))
  Downloading https://github.com/TexasInstruments/onnx/archive/tidl-j7.zip
     - 11.4 MB 14.5 MB/s 0:00:01
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error
  
  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [18 lines of output]
      Traceback (most recent call last):
        File "/usr/lib/python3/dist-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in <module>
          main()
        File "/usr/lib/python3/dist-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/usr/lib/python3/dist-packages/pip/_vendor/pep517/in_process/_in_process.py", line 130, in get_requires_for_build_wheel
          return hook(config_settings)
        File "/usr/lib/python3/dist-packages/setuptools/build_meta.py", line 162, in get_requires_for_build_wheel
          return self._get_build_requires(
        File "/usr/lib/python3/dist-packages/setuptools/build_meta.py", line 143, in _get_build_requires
          self.run_setup()
        File "/usr/lib/python3/dist-packages/setuptools/build_meta.py", line 267, in run_setup
          super(_BuildMetaLegacyBackend,
        File "/usr/lib/python3/dist-packages/setuptools/build_meta.py", line 158, in run_setup
          exec(compile(code, __file__, 'exec'), locals())
        File "setup.py", line 86, in <module>
          assert CMAKE, 'Could not find "cmake" executable!'
      AssertionError: Could not find "cmake" executable!
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.
Requirement already satisfied: pybind11[global] in /usr/local/lib/python3.10/dist-packages (2.11.1)
Requirement already satisfied: pybind11-global==2.11.1 in /usr/local/lib/python3.10/dist-packages (from pybind11[global]) (2.11.1)
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Installing python osrt packages...
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Downloading tidl tools for AM62A SOC ...
bash: wget: command not found
tar (child): tidl_tools.tar.gz: Cannot open: No such file or directory
tar (child): Error is not recoverable: exiting now
tar: Child returned status 2
tar: Error is not recoverable: exiting now
bash: cd: tidl_tools: No such file or directory
bash: wget: command not found
tar: gcc-arm-9.2-2019.12-x86_64-aarch64-none-linux-gnu.tar.xz: Cannot open: No such file or directory
tar: Error is not recoverable: exiting now
bash: wget: command not found
chmod: cannot access 'ti_cgt_c7000_3.1.0.LTS_linux-x64_installer.bin': No such file or directory
bash: ./ti_cgt_c7000_3.1.0.LTS_linux-x64_installer.bin: No such file or directory
Installing:onnxruntime
bash: wget: command not found
tar: onnx_1.7.0_x86_u22.tar.gz: Cannot open: No such file or directory
tar: Error is not recoverable: exiting now
bash: cd: onnx_1.7.0_x86_u22: No such file or directory
rm: cannot remove 'onnx_1.7.0_x86_u22.tar.gz': No such file or directory
skipping tensorflow setup: found /home/root/osrt_deps/tensorflow
To redo the setup delete:/home/root/osrt_deps/tensorflow and run this script again
skipping opencv-4.2.0 setup: found /home/root/osrt_deps/opencv-4.2.0_x86_u22
To redo the setup delete:/home/root/osrt_deps/opencv-4.2.0_x86_u22 and run this script again
skipping neo-ai-dlr setup: found /home/root/osrt_deps/neo-ai-dlr
To redo the setup delete:/home/root/osrt_deps/neo-ai-dlr and run this script again

  • From your error message, the first (and fatal) issue appears to be this one...

    bash: wget: command not found

    ...which should be fixable by installing the missing package using apt install wget within the container environment. Then once the downloads start working, many of the other issues will probably go away.
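    As a quick sanity check before rerunning the script, you can verify the container has everything it needs (a minimal sketch; the tool list is just what the log above shows setup.sh invoking — wget, tar, cmake):

    ```shell
    # Report any of the tools setup.sh is seen invoking in the log above
    # that are missing from the container, then suggest the install command.
    missing=""
    for tool in wget tar cmake; do
      command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
    done
    if [ -n "$missing" ]; then
      echo "Missing:$missing"
      echo "Fix with: apt-get update && apt-get install -y$missing"
    else
      echo "All required tools present"
    fi
    ```

    Run this inside the container before "source ./setup.sh"; once the downloads start working, the tar/chmod errors that follow should clear up on their own.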

    Regards, Andreas

  • After installing wget, I got:

    root@aa3a660145c9:/home/root# source ./setup.sh 
    X64 Architecture
    Installing python packages...
    Requirement already satisfied: pybind11[global] in /usr/local/lib/python3.10/dist-packages (2.11.1)
    Requirement already satisfied: pybind11-global==2.11.1 in /usr/local/lib/python3.10/dist-packages (from pybind11[global]) (2.11.1)
    WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
    Collecting https://github.com/TexasInstruments/onnx/archive/tidl-j7.zip (from -r ./requirements_pc.txt (line 4))
      Using cached https://github.com/TexasInstruments/onnx/archive/tidl-j7.zip
      Installing build dependencies ... done
      Getting requirements to build wheel ... error
      error: subprocess-exited-with-error
      
      × Getting requirements to build wheel did not run successfully.
      │ exit code: 1
      ╰─> [18 lines of output]
          Traceback (most recent call last):
            File "/usr/lib/python3/dist-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in <module>
              main()
            File "/usr/lib/python3/dist-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
              json_out['return_val'] = hook(**hook_input['kwargs'])
            File "/usr/lib/python3/dist-packages/pip/_vendor/pep517/in_process/_in_process.py", line 130, in get_requires_for_build_wheel
              return hook(config_settings)
            File "/usr/lib/python3/dist-packages/setuptools/build_meta.py", line 162, in get_requires_for_build_wheel
              return self._get_build_requires(
            File "/usr/lib/python3/dist-packages/setuptools/build_meta.py", line 143, in _get_build_requires
              self.run_setup()
            File "/usr/lib/python3/dist-packages/setuptools/build_meta.py", line 267, in run_setup
              super(_BuildMetaLegacyBackend,
            File "/usr/lib/python3/dist-packages/setuptools/build_meta.py", line 158, in run_setup
              exec(compile(code, __file__, 'exec'), locals())
            File "setup.py", line 86, in <module>
              assert CMAKE, 'Could not find "cmake" executable!'
          AssertionError: Could not find "cmake" executable!
          [end of output]
      
      note: This error originates from a subprocess, and is likely not a problem with pip.
    error: subprocess-exited-with-error
    
    × Getting requirements to build wheel did not run successfully.
    │ exit code: 1
    ╰─> See above for output.
    
    note: This error originates from a subprocess, and is likely not a problem with pip.
    Requirement already satisfied: pybind11[global] in /usr/local/lib/python3.10/dist-packages (2.11.1)
    Requirement already satisfied: pybind11-global==2.11.1 in /usr/local/lib/python3.10/dist-packages (from pybind11[global]) (2.11.1)
    WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
    Installing python osrt packages...
    WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
    WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
    WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
    WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
    skipping gcc-arm-9.2-2019.12-x86_64-aarch64-none-linux-gnu download: found /home/root/gcc-arm-9.2-2019.12-x86_64-aarch64-none-linux-gnu
    skipping ti-cgt-c7000_3.1.0.LTS download: found /home/root/ti-cgt-c7000_3.1.0.LTS
    Installing:onnxruntime
    Installing:tflite_2.8
    Installing:opencv
    Installing:dlr
    

  • Hello,

    It looks like one of the components failed to build because cmake is not installed in the container. Could you please install it as well and try again?

    You can also safely ignore the warnings from pip about the global environment.
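    As a quick check (a minimal sketch, assuming an apt-based container like the x86_ubuntu_22 image above), you can confirm whether cmake resolves before rerunning setup.sh — pip builds the TI onnx fork from source, so cmake must be on PATH:

    ```shell
    # Check whether cmake is on PATH; the "Could not find cmake" assertion
    # in the log comes from the onnx fork's setup.py during the pip build.
    if ! command -v cmake >/dev/null 2>&1; then
      echo "cmake not found; run: apt-get update && apt-get install -y cmake"
    else
      echo "cmake found: $(cmake --version | head -n 1)"
    fi
    ```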

    Best,
    Reese

  • I solved that issue, but when I run "cmake ../examples && make -j && cd .." after running "setup.sh" in the Docker container, I get the following error.

    (base) ubuntu@ubuntu-virtual-machine:~/edgeai-tidl-tools/build$ cmake ../examples && make -j && cd ..
    CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
      Compatibility with CMake < 3.5 will be removed from a future version of
      CMake.
    
      Update the VERSION argument <min> value or use a ...<max> suffix to tell
      CMake that the project does not need compatibility with older versions.
    
    
    -- Detected processor: x86_64
    -- TARGET_DEVICE setting to: am62a
    -- TARGET_CPU not specicfied using x86 
    CMake Deprecation Warning at osrt_cpp/post_process/CMakeLists.txt:1 (cmake_minimum_required):
      Compatibility with CMake < 3.5 will be removed from a future version of
      CMake.
    
      Update the VERSION argument <min> value or use a ...<max> suffix to tell
      CMake that the project does not need compatibility with older versions.
    
    
    -- CMAKE_BUILD_TYPE = Release PROJECT_NAME = edgeai_tidl_examples
    -- setting TENSORFLOW_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/tflite_2.8_x86_u22/
    -- setting DLR_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/dlr_1.10.0_x86_u22/
    -- setting OPENCV_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/opencv_4.2.0_x86_u22/
    -- Compiling for x86 with am62a config
    CMake Deprecation Warning at osrt_cpp/pre_process/CMakeLists.txt:1 (cmake_minimum_required):
      Compatibility with CMake < 3.5 will be removed from a future version of
      CMake.
    
      Update the VERSION argument <min> value or use a ...<max> suffix to tell
      CMake that the project does not need compatibility with older versions.
    
    
    -- CMAKE_BUILD_TYPE = Release PROJECT_NAME = edgeai_tidl_examples
    -- setting TENSORFLOW_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/tflite_2.8_x86_u22/
    -- setting DLR_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/dlr_1.10.0_x86_u22/
    -- setting OPENCV_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/opencv_4.2.0_x86_u22/
    -- Compiling for x86 with am62a config
    CMake Deprecation Warning at osrt_cpp/utils/CMakeLists.txt:1 (cmake_minimum_required):
      Compatibility with CMake < 3.5 will be removed from a future version of
      CMake.
    
      Update the VERSION argument <min> value or use a ...<max> suffix to tell
      CMake that the project does not need compatibility with older versions.
    
    
    -- CMAKE_BUILD_TYPE = Release PROJECT_NAME = edgeai_tidl_examples
    -- setting TENSORFLOW_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/tflite_2.8_x86_u22/
    -- setting DLR_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/dlr_1.10.0_x86_u22/
    -- setting OPENCV_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/opencv_4.2.0_x86_u22/
    -- Compiling for x86 with am62a config
    CMake Deprecation Warning at osrt_cpp/tfl/CMakeLists.txt:1 (cmake_minimum_required):
      Compatibility with CMake < 3.5 will be removed from a future version of
      CMake.
    
      Update the VERSION argument <min> value or use a ...<max> suffix to tell
      CMake that the project does not need compatibility with older versions.
    
    
    -- CMAKE_BUILD_TYPE = Release PROJECT_NAME = tfl_main
    -- setting TENSORFLOW_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/tflite_2.8_x86_u22/
    -- setting DLR_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/dlr_1.10.0_x86_u22/
    -- setting OPENCV_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/opencv_4.2.0_x86_u22/
    -- Compiling for x86 with am62a config
    CMake Deprecation Warning at osrt_cpp/advanced_examples/tfl/CMakeLists.txt:1 (cmake_minimum_required):
      Compatibility with CMake < 3.5 will be removed from a future version of
      CMake.
    
      Update the VERSION argument <min> value or use a ...<max> suffix to tell
      CMake that the project does not need compatibility with older versions.
    
    
    -- CMAKE_BUILD_TYPE = Release PROJECT_NAME = tfl_priority_scheduling
    -- setting TENSORFLOW_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/tflite_2.8_x86_u22/
    -- setting DLR_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/dlr_1.10.0_x86_u22/
    -- setting OPENCV_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/opencv_4.2.0_x86_u22/
    -- Compiling for x86 with am62a config
    CMake Deprecation Warning at osrt_cpp/advanced_examples/ort/CMakeLists.txt:1 (cmake_minimum_required):
      Compatibility with CMake < 3.5 will be removed from a future version of
      CMake.
    
      Update the VERSION argument <min> value or use a ...<max> suffix to tell
      CMake that the project does not need compatibility with older versions.
    
    
    -- CMAKE_BUILD_TYPE = Release PROJECT_NAME = ort_priority_scheduling
    -- setting TENSORFLOW_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/tflite_2.8_x86_u22/
    -- setting DLR_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/dlr_1.10.0_x86_u22/
    -- setting OPENCV_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/opencv_4.2.0_x86_u22/
    -- Compiling for x86 with am62a config
    CMake Deprecation Warning at osrt_cpp/advanced_examples/utils/CMakeLists.txt:1 (cmake_minimum_required):
      Compatibility with CMake < 3.5 will be removed from a future version of
      CMake.
    
      Update the VERSION argument <min> value or use a ...<max> suffix to tell
      CMake that the project does not need compatibility with older versions.
    
    
    -- CMAKE_BUILD_TYPE = Release PROJECT_NAME = edgeai_tidl_examples
    -- setting TENSORFLOW_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/tflite_2.8_x86_u22/
    -- setting DLR_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/dlr_1.10.0_x86_u22/
    -- setting OPENCV_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/opencv_4.2.0_x86_u22/
    -- Compiling for x86 with am62a config
    CMake Deprecation Warning at tidlrt_cpp/CMakeLists.txt:1 (cmake_minimum_required):
      Compatibility with CMake < 3.5 will be removed from a future version of
      CMake.
    
      Update the VERSION argument <min> value or use a ...<max> suffix to tell
      CMake that the project does not need compatibility with older versions.
    
    
    -- CMAKE_BUILD_TYPE = Release PROJECT_NAME = tidlrt_clasification
    -- setting TENSORFLOW_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/tflite_2.8_x86_u22/
    -- setting DLR_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/dlr_1.10.0_x86_u22/
    -- setting OPENCV_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/opencv_4.2.0_x86_u22/
    -- Compiling for x86 with am62a config
    CMake Deprecation Warning at osrt_cpp/dlr/CMakeLists.txt:1 (cmake_minimum_required):
      Compatibility with CMake < 3.5 will be removed from a future version of
      CMake.
    
      Update the VERSION argument <min> value or use a ...<max> suffix to tell
      CMake that the project does not need compatibility with older versions.
    
    
    -- CMAKE_BUILD_TYPE = Release PROJECT_NAME = dlr_main
    -- setting TENSORFLOW_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/tflite_2.8_x86_u22/
    -- setting DLR_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/dlr_1.10.0_x86_u22/
    -- setting OPENCV_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/opencv_4.2.0_x86_u22/
    -- Compiling for x86 with am62a config
    CMake Deprecation Warning at osrt_cpp/ort/CMakeLists.txt:1 (cmake_minimum_required):
      Compatibility with CMake < 3.5 will be removed from a future version of
      CMake.
    
      Update the VERSION argument <min> value or use a ...<max> suffix to tell
      CMake that the project does not need compatibility with older versions.
    
    
    -- CMAKE_BUILD_TYPE = Release PROJECT_NAME = ort_main
    -- setting TENSORFLOW_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/tflite_2.8_x86_u22/
    -- setting DLR_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/dlr_1.10.0_x86_u22/
    -- setting OPENCV_INSTALL_DIR path:/home/ubuntu/edgeai-tidl-tools/tidl_tools/osrt_deps/opencv_4.2.0_x86_u22/
    -- Compiling for x86 with am62a config
    -- Configuring done (0.1s)
    -- Generating done (0.0s)
    -- Build files have been written to: /home/ubuntu/edgeai-tidl-tools/build
    osrt_cpp/post_process/CMakeFiles/post_process.dir/flags.make:10: *** missing separator.  Stop.
    osrt_cpp/utils/CMakeFiles/utils.dir/flags.make:10: *** missing separator.  Stop.
    osrt_cpp/pre_process/CMakeFiles/pre_process.dir/flags.make:10: *** missing separator.  Stop.
    make[1]: *** [CMakeFiles/Makefile2:311: osrt_cpp/utils/CMakeFiles/utils.dir/all] Error 2
    make[1]: *** Waiting for unfinished jobs....
    make[1]: *** [CMakeFiles/Makefile2:259: osrt_cpp/post_process/CMakeFiles/post_process.dir/all] Error 2
    make[1]: *** [CMakeFiles/Makefile2:285: osrt_cpp/pre_process/CMakeFiles/pre_process.dir/all] Error 2
    osrt_cpp/advanced_examples/utils/CMakeFiles/utils_adv.dir/flags.make:10: *** missing separator.  Stop.
    make[1]: *** [CMakeFiles/Makefile2:424: osrt_cpp/advanced_examples/utils/CMakeFiles/utils_adv.dir/all] Error 2
    make: *** [Makefile:136: all] Error 2
    

  • Hello,

    What version of cmake are you using? I cannot reproduce this error on my side (cmake 3.22.1 on Ubuntu 22.04 LTS). The compatibility warnings make me suspect that the cmake version may be the culprit here.
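    You can check the version actually in use like this (trivial sketch; run it in the same shell where you invoke "cmake ../examples", since a locally installed cmake may shadow the system one):

    ```shell
    # Print the cmake version actually resolved on PATH,
    # or say so if none is found.
    if command -v cmake >/dev/null 2>&1; then
      cmake --version | head -n 1
    else
      echo "cmake not found on PATH"
    fi
    ```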

    Best,
    Reese

  • I am using cmake 3.27.6.

  • I'm sorry, I solved the cmake issue by myself. But when I run "source ./scripts/run_python_examples.sh", I get the following error.

    ubuntu@ubuntu-CoffeeLake:~/edgeai-tidl-tools$ source ./scripts/run_python_examples.sh
    X64 Architecture
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tfl/tflrt_delegate.py", line 17, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 24, in <module>
        from caffe2onnx.src.load_save_model import loadcaffemodel, saveonnxmodel
    ModuleNotFoundError: No module named 'caffe2onnx'
    run python3 tflrt_delegate.py
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tfl/tflrt_delegate.py", line 17, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 24, in <module>
        from caffe2onnx.src.load_save_model import loadcaffemodel, saveonnxmodel
    ModuleNotFoundError: No module named 'caffe2onnx'
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/ort/onnxrt_ep.py", line 18, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 24, in <module>
        from caffe2onnx.src.load_save_model import loadcaffemodel, saveonnxmodel
    ModuleNotFoundError: No module named 'caffe2onnx'
    run python3 onnxrt_ep.py
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/ort/onnxrt_ep.py", line 18, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 24, in <module>
        from caffe2onnx.src.load_save_model import loadcaffemodel, saveonnxmodel
    ModuleNotFoundError: No module named 'caffe2onnx'
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tvm_dlr/tvm_compilation_onnx_example.py", line 9, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 24, in <module>
        from caffe2onnx.src.load_save_model import loadcaffemodel, saveonnxmodel
    ModuleNotFoundError: No module named 'caffe2onnx'
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tvm_dlr/tvm_compilation_tflite_example.py", line 9, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 24, in <module>
        from caffe2onnx.src.load_save_model import loadcaffemodel, saveonnxmodel
    ModuleNotFoundError: No module named 'caffe2onnx'
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tvm_dlr/tvm_compilation_onnx_example.py", line 9, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 24, in <module>
        from caffe2onnx.src.load_save_model import loadcaffemodel, saveonnxmodel
    ModuleNotFoundError: No module named 'caffe2onnx'
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tvm_dlr/tvm_compilation_tflite_example.py", line 9, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 24, in <module>
        from caffe2onnx.src.load_save_model import loadcaffemodel, saveonnxmodel
    ModuleNotFoundError: No module named 'caffe2onnx'
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tvm_dlr/tvm_compilation_timm_example.py", line 9, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 24, in <module>
        from caffe2onnx.src.load_save_model import loadcaffemodel, saveonnxmodel
    ModuleNotFoundError: No module named 'caffe2onnx'
    run python3  dlr_inference_example.py 
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tvm_dlr/dlr_inference_example.py", line 17, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 24, in <module>
        from caffe2onnx.src.load_save_model import loadcaffemodel, saveonnxmodel
    ModuleNotFoundError: No module named 'caffe2onnx'
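    A small pre-check (hypothetical; it only asks the interpreter whether the failing import would succeed, without running the full example scripts) confirms which module is missing:

    ```shell
    # Ask python3 directly whether the modules the examples import
    # are resolvable in the current environment.
    python3 - <<'EOF'
    import importlib.util
    for mod in ("caffe2onnx", "onnx"):
        print(mod, "available" if importlib.util.find_spec(mod) else "MISSING")
    EOF
    ```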
    
    

    After running "pip install caffe2onnx" I got the following error:
    ubuntu@ubuntu-CoffeeLake:~/edgeai-tidl-tools$ pip install caffe2onnx
    Defaulting to user installation because normal site-packages is not writeable
    Collecting caffe2onnx
      Using cached caffe2onnx-2.0.1-py3-none-any.whl
    Collecting onnx==1.6.0 (from caffe2onnx)
      Using cached onnx-1.6.0.tar.gz (3.1 MB)
      Preparing metadata (setup.py) ... done
    Requirement already satisfied: protobuf in /home/ubuntu/.local/lib/python3.10/site-packages (from caffe2onnx) (3.16.0)
    Requirement already satisfied: numpy in /home/ubuntu/.local/lib/python3.10/site-packages (from onnx==1.6.0->caffe2onnx) (1.23.0)
    Requirement already satisfied: six in /usr/lib/python3/dist-packages (from onnx==1.6.0->caffe2onnx) (1.16.0)
    Requirement already satisfied: typing-extensions>=3.6.2.1 in /home/ubuntu/.local/lib/python3.10/site-packages (from onnx==1.6.0->caffe2onnx) (4.8.0)
    Building wheels for collected packages: onnx
      Building wheel for onnx (setup.py) ... error
      error: subprocess-exited-with-error
      
      × python setup.py bdist_wheel did not run successfully.
      │ exit code: 1
      ╰─> [147 lines of output]
          fatal: not a git repository (or any of the parent directories): .git
          /usr/lib/python3/dist-packages/setuptools/dist.py:723: UserWarning: Usage of dash-separated 'license-file' will not be supported in future versions. Please use the underscore name 'license_file' instead
            warnings.warn(
          /usr/lib/python3/dist-packages/pkg_resources/__init__.py:116: PkgResourcesDeprecationWarning: 0.1.43ubuntu1 is an invalid version and will not be supported in a future release
            warnings.warn(
          /usr/lib/python3/dist-packages/pkg_resources/__init__.py:116: PkgResourcesDeprecationWarning: 1.1build1 is an invalid version and will not be supported in a future release
            warnings.warn(
          /usr/lib/python3/dist-packages/setuptools/installer.py:27: SetuptoolsDeprecationWarning: setuptools.installer is deprecated. Requirements should be satisfied by a PEP 517 installer.
            warnings.warn(
          /usr/lib/python3/dist-packages/setuptools/dist.py:723: UserWarning: Usage of dash-separated 'license-file' will not be supported in future versions. Please use the underscore name 'license_file' instead
            warnings.warn(
          running bdist_wheel
          running build
          running build_py
          running create_version
          running cmake_build
          Extra cmake args: ['-DONNX_USE_PROTOBUF_SHARED_LIBS=ON']
          -- The C compiler identification is GNU 11.4.0
          -- The CXX compiler identification is GNU 11.4.0
          -- Detecting C compiler ABI info
          -- Detecting C compiler ABI info - done
          -- Check for working C compiler: /usr/bin/cc - skipped
          -- Detecting C compile features
          -- Detecting C compile features - done
          -- Detecting CXX compiler ABI info
          -- Detecting CXX compiler ABI info - done
          -- Check for working CXX compiler: /usr/bin/c++ - skipped
          -- Detecting CXX compile features
          -- Detecting CXX compile features - done
          -- Found Protobuf: /usr/lib/x86_64-linux-gnu/libprotobuf.so (found version "3.12.4")
          Generated: /tmp/pip-install-d5wnpzds/onnx_f501035f153d4b10b2907903d83e8550/.setuptools-cmake-build/onnx/onnx-ml.proto
          Generated: /tmp/pip-install-d5wnpzds/onnx_f501035f153d4b10b2907903d83e8550/.setuptools-cmake-build/onnx/onnx-operators-ml.proto
          CMake Warning at CMakeLists.txt:394 (find_package):
            By not providing "Findpybind11.cmake" in CMAKE_MODULE_PATH this project has
            asked CMake to find a package configuration file provided by "pybind11",
            but CMake did not find one.
          
            Could not find a package configuration file provided by "pybind11"
            (requested version 2.2) with any of the following names:
          
              pybind11Config.cmake
              pybind11-config.cmake
          
            Add the installation prefix of "pybind11" to CMAKE_PREFIX_PATH or set
            "pybind11_DIR" to a directory containing one of the above files.  If
            "pybind11" provides a separate development package or SDK, be sure it has
            been installed.
          
          
          --
          -- ******** Summary ********
          --   CMake version         : 3.22.1
          --   CMake command         : /opt/cmake-3.22.1/bin/cmake
          --   System                : Linux
          --   C++ compiler          : /usr/bin/c++
          --   C++ compiler version  : 11.4.0
          --   CXX flags             :  -Wnon-virtual-dtor
          --   Build type            : Release
          --   Compile definitions   :
          --   CMAKE_PREFIX_PATH     :
          --   CMAKE_INSTALL_PREFIX  : /usr/local
          --   CMAKE_MODULE_PATH     :
          --
          --   ONNX version          : 1.6.0
          --   ONNX NAMESPACE        : onnx
          --   ONNX_BUILD_TESTS      : OFF
          --   ONNX_BUILD_BENCHMARKS : OFF
          --   ONNX_USE_LITE_PROTO   : OFF
          --   ONNXIFI_DUMMY_BACKEND : OFF
          --   ONNXIFI_ENABLE_EXT    : OFF
          --
          --   Protobuf compiler     : /usr/bin/protoc
          --   Protobuf includes     : /usr/include
          --   Protobuf libraries    : /usr/lib/x86_64-linux-gnu/libprotobuf.so
          --   BUILD_ONNX_PYTHON     : ON
          --     Python version      :
          --     Python executable   : /usr/bin/python3
          --     Python includes     : /usr/include/python3.10
          -- Configuring done
          -- Generating done
          CMake Warning:
            Manually-specified variables were not used by the project:
          
              ONNX_USE_PROTOBUF_SHARED_LIBS
          
          
          -- Build files have been written to: /tmp/pip-install-d5wnpzds/onnx_f501035f153d4b10b2907903d83e8550/.setuptools-cmake-build
          [  1%] Running gen_proto.py on onnx/onnx.in.proto
          [  3%] Building C object CMakeFiles/onnxifi_loader.dir/onnx/onnxifi_loader.c.o
          [  4%] Building C object CMakeFiles/onnxifi_dummy.dir/onnx/onnxifi_dummy.c.o
          /tmp/pip-install-d5wnpzds/onnx_f501035f153d4b10b2907903d83e8550/onnx/onnxifi_dummy.c: In function ‘onnxGetExtensionFunctionAddress’:
          /tmp/pip-install-d5wnpzds/onnx_f501035f153d4b10b2907903d83e8550/onnx/onnxifi_dummy.c:173:21: warning: assignment to ‘onnxExtensionFunctionPointer’ {aka ‘int (*)(void)’} from incompatible pointer type ‘onnxStatus (*)(void *, const char *, onnxStatus (**)(void))’ {aka ‘int (*)(void *, const char *, int (**)(void))’} [-Wincompatible-pointer-types]
            173 |           *function = &onnxGetExtensionFunctionAddress;
                |                     ^
          /tmp/pip-install-d5wnpzds/onnx_f501035f153d4b10b2907903d83e8550/onnx/onnxifi_dummy.c:176:21: warning: assignment to ‘onnxExtensionFunctionPointer’ {aka ‘int (*)(void)’} from incompatible pointer type ‘onnxStatus (*)(void *, uint32_t,  const onnxTensorDescriptorV1 *, uint32_t,  const onnxTensorDescriptorV1 *, onnxMemoryFenceV1 *)’ {aka ‘int (*)(void *, unsigned int,  const onnxTensorDescriptorV1 *, unsigned int,  const onnxTensorDescriptorV1 *, onnxMemoryFenceV1 *)’} [-Wincompatible-pointer-types]
            176 |           *function = &onnxSetIOAndRunGraph;
                |                     ^
          Processing /tmp/pip-install-d5wnpzds/onnx_f501035f153d4b10b2907903d83e8550/onnx/onnx.in.proto
          Writing /tmp/pip-install-d5wnpzds/onnx_f501035f153d4b10b2907903d83e8550/.setuptools-cmake-build/onnx/onnx-ml.proto
          Writing /tmp/pip-install-d5wnpzds/onnx_f501035f153d4b10b2907903d83e8550/.setuptools-cmake-build/onnx/onnx-ml.proto3
          generating /tmp/pip-install-d5wnpzds/onnx_f501035f153d4b10b2907903d83e8550/.setuptools-cmake-build/onnx/onnx_pb.py
          [  6%] Running C++ protocol buffer compiler on /tmp/pip-install-d5wnpzds/onnx_f501035f153d4b10b2907903d83e8550/.setuptools-cmake-build/onnx/onnx-ml.proto
          [  8%] Linking C static library libonnxifi_loader.a
          [  9%] Linking C shared library libonnxifi_dummy.so
          --python_out: onnx/onnx-ml.proto: Unknown generator option: dllexport_decl
          gmake[2]: *** [CMakeFiles/gen_onnx_proto.dir/build.make:74: onnx/onnx-ml.pb.cc] Error 1
          gmake[1]: *** [CMakeFiles/Makefile2:95: CMakeFiles/gen_onnx_proto.dir/all] Error 2
          gmake[1]: *** Waiting for unfinished jobs....
          [  9%] Built target onnxifi_loader
          [  9%] Built target onnxifi_dummy
          gmake: *** [Makefile:136: all] Error 2
          Traceback (most recent call last):
            File "<string>", line 2, in <module>
            File "<pip-setuptools-caller>", line 34, in <module>
            File "/tmp/pip-install-d5wnpzds/onnx_f501035f153d4b10b2907903d83e8550/setup.py", line 315, in <module>
              setuptools.setup(
            File "/usr/lib/python3/dist-packages/setuptools/__init__.py", line 153, in setup
              return distutils.core.setup(**attrs)
            File "/usr/lib/python3.10/distutils/core.py", line 148, in setup
              dist.run_commands()
            File "/usr/lib/python3.10/distutils/dist.py", line 966, in run_commands
              self.run_command(cmd)
            File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
              cmd_obj.run()
            File "/usr/lib/python3/dist-packages/wheel/bdist_wheel.py", line 299, in run
              self.run_command('build')
            File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
              self.distribution.run_command(command)
            File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
              cmd_obj.run()
            File "/usr/lib/python3.10/distutils/command/build.py", line 135, in run
              self.run_command(cmd_name)
            File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
              self.distribution.run_command(command)
            File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
              cmd_obj.run()
            File "/tmp/pip-install-d5wnpzds/onnx_f501035f153d4b10b2907903d83e8550/setup.py", line 209, in run
              self.run_command('cmake_build')
            File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
              self.distribution.run_command(command)
            File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
              cmd_obj.run()
            File "/tmp/pip-install-d5wnpzds/onnx_f501035f153d4b10b2907903d83e8550/setup.py", line 203, in run
              subprocess.check_call(build_args)
            File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
              raise CalledProcessError(retcode, cmd)
          subprocess.CalledProcessError: Command '['/usr/bin/cmake', '--build', '.', '--', '-j', '12']' returned non-zero exit status 2.
          [end of output]
      
      note: This error originates from a subprocess, and is likely not a problem with pip.
      ERROR: Failed building wheel for onnx
      Running setup.py clean for onnx
    Failed to build onnx
    ERROR: Could not build wheels for onnx, which is required to install pyproject.toml-based projects
    
    

  • Hi Luke,

    I'm not sure which Linux distribution you are using, but can you please try the steps from the docs on a standard Ubuntu 22.04 install? That is what we support, and what is assumed for our current TI v9.x SDKs. Since your cmake version is much newer than what a standard Ubuntu 22.04 install ships with, it makes me wonder what you are trying to run this on.

    Regards, Andreas

  • I'm using Ubuntu 22.04.2 LTS.

  • Can you please try setting up everything from scratch on a fresh install of Ubuntu 22.04? I do not understand why you have such a different version of cmake.

    Regards, Andreas

  • I re-ran "source ./scripts/run_python_examples.sh" on a fresh Ubuntu 22.04 install, where the cmake version is 3.22.1, but I got:

    (base) ubuntu@ubuntu-CoffeeLake:~/edgeai-tidl-tools$ source ./scripts/run_python_examples.sh
    X64 Architecture
    Traceback (most recent call last):
      File "tflrt_delegate.py", line 17, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 21, in <module>
        from scripts.osrt_model_tools.tflite_tools import tflite_model_opt as tflOpt
    ModuleNotFoundError: No module named 'scripts.osrt_model_tools'
    run python3 tflrt_delegate.py
    Traceback (most recent call last):
      File "tflrt_delegate.py", line 17, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 21, in <module>
        from scripts.osrt_model_tools.tflite_tools import tflite_model_opt as tflOpt
    ModuleNotFoundError: No module named 'scripts.osrt_model_tools'
    Traceback (most recent call last):
      File "onnxrt_ep.py", line 18, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 21, in <module>
        from scripts.osrt_model_tools.tflite_tools import tflite_model_opt as tflOpt
    ModuleNotFoundError: No module named 'scripts.osrt_model_tools'
    run python3 onnxrt_ep.py
    Traceback (most recent call last):
      File "onnxrt_ep.py", line 18, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 21, in <module>
        from scripts.osrt_model_tools.tflite_tools import tflite_model_opt as tflOpt
    ModuleNotFoundError: No module named 'scripts.osrt_model_tools'
    Traceback (most recent call last):
      File "tvm_compilation_onnx_example.py", line 9, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 21, in <module>
        from scripts.osrt_model_tools.tflite_tools import tflite_model_opt as tflOpt
    ModuleNotFoundError: No module named 'scripts.osrt_model_tools'
    Traceback (most recent call last):
      File "tvm_compilation_tflite_example.py", line 9, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 21, in <module>
        from scripts.osrt_model_tools.tflite_tools import tflite_model_opt as tflOpt
    ModuleNotFoundError: No module named 'scripts.osrt_model_tools'
    Traceback (most recent call last):
      File "tvm_compilation_onnx_example.py", line 9, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 21, in <module>
        from scripts.osrt_model_tools.tflite_tools import tflite_model_opt as tflOpt
    ModuleNotFoundError: No module named 'scripts.osrt_model_tools'
    Traceback (most recent call last):
      File "tvm_compilation_tflite_example.py", line 9, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 21, in <module>
        from scripts.osrt_model_tools.tflite_tools import tflite_model_opt as tflOpt
    ModuleNotFoundError: No module named 'scripts.osrt_model_tools'
    Traceback (most recent call last):
      File "tvm_compilation_timm_example.py", line 9, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 21, in <module>
        from scripts.osrt_model_tools.tflite_tools import tflite_model_opt as tflOpt
    ModuleNotFoundError: No module named 'scripts.osrt_model_tools'
    run python3  dlr_inference_example.py 
    Traceback (most recent call last):
      File "dlr_inference_example.py", line 17, in <module>
        from common_utils import *
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/common_utils.py", line 21, in <module>
        from scripts.osrt_model_tools.tflite_tools import tflite_model_opt as tflOpt
    ModuleNotFoundError: No module named 'scripts.osrt_model_tools'
    
    

    After trying the workaround from
    https://github.com/TexasInstruments/edgeai-benchmark/issues/8

    I got the same error.
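    For what it's worth, the `ModuleNotFoundError: No module named 'scripts.osrt_model_tools'` suggests the examples expect the `scripts/` package to be importable from the repository root. A minimal workaround sketch (assuming a checkout at `~/edgeai-tidl-tools`; adjust the path to your setup) is to put the repo root on the Python path before the example scripts import `common_utils`:

    ```python
    import os
    import sys

    # Hypothetical checkout location; point this at your actual clone.
    repo_root = os.path.expanduser("~/edgeai-tidl-tools")

    # Prepend the repo root so 'scripts.osrt_model_tools' resolves when the
    # example scripts are run from examples/osrt_python/.
    if repo_root not in sys.path:
        sys.path.insert(0, repo_root)
    ```

    Equivalently, `export PYTHONPATH=$PYTHONPATH:~/edgeai-tidl-tools` before launching the scripts has the same effect without editing any code.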

  • Hi!

    Thanks for re-trying on a fresh Ubuntu 22.04 installation. I'm not sure why this doesn't work out of the box; all our examples should just work. I've re-assigned this thread to the person responsible for the script, and I hope they can shed some light on this. Perhaps there is something about your setup we haven't fully analyzed/understood.

    Regards, Andreas

  • Thanks for your reply. I look forward to hearing from you soon.

  • I re-tried again and it worked, but then I got:
     

      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/backend/execute.py", line 81, in run_check
        proc = subprocess.run(cmd, **kwargs)
      File "/usr/lib/python3.10/subprocess.py", line 503, in run
        with Popen(*popenargs, **kwargs) as process:
      File "/usr/lib/python3.10/subprocess.py", line 971, in __init__
        self._execute_child(args, executable, preexec_fn, close_fds,
      File "/usr/lib/python3.10/subprocess.py", line 1863, in _execute_child
        raise child_exception_type(errno_num, err_msg, err_filename)
    FileNotFoundError: [Errno 2] No such file or directory: PosixPath('dot')
    
    The above exception was the direct cause of the following exception:
    
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tvm_dlr/tvm_compilation_onnx_example.py", line 163, in <module>
        mod, status = compiler.enable(mod, params, calib_input_list)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/tvm/relay/backend/contrib/tidl/tidl.py", line 2298, in enable
        num_imported_sgs = tidl_import.import_relay_ir(mod, params, subgraph_tensors_list,
      File "/home/ubuntu/venv/lib/python3.10/site-packages/tvm/relay/backend/contrib/tidl/tidl.py", line 1505, in import_relay_ir
        visualize_relay_graph(module=mod, filename=self.temp_folder+'/relay.gv')
      File "/home/ubuntu/venv/lib/python3.10/site-packages/tvm/relay/backend/contrib/tidl/visualize.py", line 121, in visualize_relay_graph
        dot.render(filename)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/_tools.py", line 171, in wrapper
        return func(*args, **kwargs)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/rendering.py", line 122, in render
        rendered = self._render(*args, **kwargs)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/_tools.py", line 171, in wrapper
        return func(*args, **kwargs)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/backend/rendering.py", line 324, in render
        execute.run_check(cmd,
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/backend/execute.py", line 84, in run_check
        raise ExecutableNotFound(cmd) from e
    graphviz.backend.execute.ExecutableNotFound: failed to execute PosixPath('dot'), make sure the Graphviz executables are on your systems' PATH
    Generating subgraph boundary tensors for calibration...
    Building graph on host for tensor data collection...
    conv2d NHWC layout is not optimized for x86 with autotvm.
    (... previous line repeated 44 times ...)
    Running graph on host for tensor data collection...
    Importing subgraph into TIDL...
    Traceback (most recent call last):
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/backend/execute.py", line 81, in run_check
        proc = subprocess.run(cmd, **kwargs)
      File "/usr/lib/python3.10/subprocess.py", line 503, in run
        with Popen(*popenargs, **kwargs) as process:
      File "/usr/lib/python3.10/subprocess.py", line 971, in __init__
        self._execute_child(args, executable, preexec_fn, close_fds,
      File "/usr/lib/python3.10/subprocess.py", line 1863, in _execute_child
        raise child_exception_type(errno_num, err_msg, err_filename)
    FileNotFoundError: [Errno 2] No such file or directory: PosixPath('dot')
    
    The above exception was the direct cause of the following exception:
    
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tvm_dlr/tvm_compilation_tflite_example.py", line 164, in <module>
        mod, status = compiler.enable(mod, params, calib_input_list)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/tvm/relay/backend/contrib/tidl/tidl.py", line 2298, in enable
        num_imported_sgs = tidl_import.import_relay_ir(mod, params, subgraph_tensors_list,
      File "/home/ubuntu/venv/lib/python3.10/site-packages/tvm/relay/backend/contrib/tidl/tidl.py", line 1505, in import_relay_ir
        visualize_relay_graph(module=mod, filename=self.temp_folder+'/relay.gv')
      File "/home/ubuntu/venv/lib/python3.10/site-packages/tvm/relay/backend/contrib/tidl/visualize.py", line 121, in visualize_relay_graph
        dot.render(filename)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/_tools.py", line 171, in wrapper
        return func(*args, **kwargs)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/rendering.py", line 122, in render
        rendered = self._render(*args, **kwargs)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/_tools.py", line 171, in wrapper
        return func(*args, **kwargs)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/backend/rendering.py", line 324, in render
        execute.run_check(cmd,
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/backend/execute.py", line 84, in run_check
        raise ExecutableNotFound(cmd) from e
    graphviz.backend.execute.ExecutableNotFound: failed to execute PosixPath('dot'), make sure the Graphviz executables are on your systems' PATH
    ../../../models/public/mobilenetv2-1.0.onnx
    Generating subgraph boundary tensors for calibration...
    Building graph on host for tensor data collection...
    Running graph on host for tensor data collection...
    Importing subgraph into TIDL...
    Traceback (most recent call last):
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/backend/execute.py", line 81, in run_check
        proc = subprocess.run(cmd, **kwargs)
      File "/usr/lib/python3.10/subprocess.py", line 503, in run
        with Popen(*popenargs, **kwargs) as process:
      File "/usr/lib/python3.10/subprocess.py", line 971, in __init__
        self._execute_child(args, executable, preexec_fn, close_fds,
      File "/usr/lib/python3.10/subprocess.py", line 1863, in _execute_child
        raise child_exception_type(errno_num, err_msg, err_filename)
    FileNotFoundError: [Errno 2] No such file or directory: PosixPath('dot')
    
    The above exception was the direct cause of the following exception:
    
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tvm_dlr/tvm_compilation_onnx_example.py", line 163, in <module>
        mod, status = compiler.enable(mod, params, calib_input_list)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/tvm/relay/backend/contrib/tidl/tidl.py", line 2298, in enable
        num_imported_sgs = tidl_import.import_relay_ir(mod, params, subgraph_tensors_list,
      File "/home/ubuntu/venv/lib/python3.10/site-packages/tvm/relay/backend/contrib/tidl/tidl.py", line 1505, in import_relay_ir
        visualize_relay_graph(module=mod, filename=self.temp_folder+'/relay.gv')
      File "/home/ubuntu/venv/lib/python3.10/site-packages/tvm/relay/backend/contrib/tidl/visualize.py", line 121, in visualize_relay_graph
        dot.render(filename)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/_tools.py", line 171, in wrapper
        return func(*args, **kwargs)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/rendering.py", line 122, in render
        rendered = self._render(*args, **kwargs)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/_tools.py", line 171, in wrapper
        return func(*args, **kwargs)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/backend/rendering.py", line 324, in render
        execute.run_check(cmd,
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/backend/execute.py", line 84, in run_check
        raise ExecutableNotFound(cmd) from e
    graphviz.backend.execute.ExecutableNotFound: failed to execute PosixPath('dot'), make sure the Graphviz executables are on your systems' PATH
    Generating subgraph boundary tensors for calibration...
    Building graph on host for tensor data collection...
    conv2d NHWC layout is not optimized for x86 with autotvm.
    (... previous line repeated 44 times ...)
    Running graph on host for tensor data collection...
    Importing subgraph into TIDL...
    Traceback (most recent call last):
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/backend/execute.py", line 81, in run_check
        proc = subprocess.run(cmd, **kwargs)
      File "/usr/lib/python3.10/subprocess.py", line 503, in run
        with Popen(*popenargs, **kwargs) as process:
      File "/usr/lib/python3.10/subprocess.py", line 971, in __init__
        self._execute_child(args, executable, preexec_fn, close_fds,
      File "/usr/lib/python3.10/subprocess.py", line 1863, in _execute_child
        raise child_exception_type(errno_num, err_msg, err_filename)
    FileNotFoundError: [Errno 2] No such file or directory: PosixPath('dot')
    
    The above exception was the direct cause of the following exception:
    
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tvm_dlr/tvm_compilation_tflite_example.py", line 164, in <module>
        mod, status = compiler.enable(mod, params, calib_input_list)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/tvm/relay/backend/contrib/tidl/tidl.py", line 2298, in enable
        num_imported_sgs = tidl_import.import_relay_ir(mod, params, subgraph_tensors_list,
      File "/home/ubuntu/venv/lib/python3.10/site-packages/tvm/relay/backend/contrib/tidl/tidl.py", line 1505, in import_relay_ir
        visualize_relay_graph(module=mod, filename=self.temp_folder+'/relay.gv')
      File "/home/ubuntu/venv/lib/python3.10/site-packages/tvm/relay/backend/contrib/tidl/visualize.py", line 121, in visualize_relay_graph
        dot.render(filename)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/_tools.py", line 171, in wrapper
        return func(*args, **kwargs)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/rendering.py", line 122, in render
        rendered = self._render(*args, **kwargs)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/_tools.py", line 171, in wrapper
        return func(*args, **kwargs)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/backend/rendering.py", line 324, in render
        execute.run_check(cmd,
      File "/home/ubuntu/venv/lib/python3.10/site-packages/graphviz/backend/execute.py", line 84, in run_check
        raise ExecutableNotFound(cmd) from e
    graphviz.backend.execute.ExecutableNotFound: failed to execute PosixPath('dot'), make sure the Graphviz executables are on your systems' PATH
    ../../../models/public/mobilenetv3_large_100.onnx
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tvm_dlr/tvm_compilation_timm_example.py", line 56, in <module>
        cgt7x_bin = os.path.join(os.environ['CGT7X_ROOT'], 'bin', 'cl7x')
      File "/usr/lib/python3.10/os.py", line 680, in __getitem__
        raise KeyError(key) from None
    KeyError: 'CGT7X_ROOT'
    run python3  dlr_inference_example.py 
    
    
    Running Inference on Model -  ../../../model-artifacts/cl-dlr-tflite_inceptionnetv3
    
    2023-10-13 15:13:47,423 INFO Could not find libdlr.so in model artifact. Using dlr from /home/ubuntu/venv/lib/python3.10/site-packages/dlr/libdlr.so
    [15:13:47] /home/a0323918/tvm-j7/tvm_package_build/neo-ai-dlr/src/dlr.cc:343: Error: Unable to determine backend from path: '../../../model-artifacts/cl-dlr-tflite_inceptionnetv3'.
    2023-10-13 15:13:47,429 ERROR error in DLRModel instantiation 
    Traceback (most recent call last):
      File "/home/ubuntu/venv/lib/python3.10/site-packages/dlr/api.py", line 89, in __init__
        self._impl = DLRModelImpl(model_path, dev_type, dev_id, error_log_file, use_default_dlr)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/dlr/dlr_model.py", line 79, in __init__
        self._check_call(self._lib.CreateDLRModel(byref(self.handle),
      File "/home/ubuntu/venv/lib/python3.10/site-packages/dlr/dlr_model.py", line 160, in _check_call
        raise DLRError(self._lib.DLRGetLastError().decode('ascii'))
    dlr.dlr_model.DLRError
    Traceback (most recent call last):
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tvm_dlr/dlr_inference_example.py", line 199, in <module>
        model_create_and_run(model_output_directory, 'input',
      File "/home/ubuntu/edgeai-tidl-tools/examples/osrt_python/tvm_dlr/dlr_inference_example.py", line 164, in model_create_and_run
        model = DLRModel(model_dir, 'cpu')
      File "/home/ubuntu/venv/lib/python3.10/site-packages/dlr/api.py", line 92, in __init__
        raise ex
      File "/home/ubuntu/venv/lib/python3.10/site-packages/dlr/api.py", line 89, in __init__
        self._impl = DLRModelImpl(model_path, dev_type, dev_id, error_log_file, use_default_dlr)
      File "/home/ubuntu/venv/lib/python3.10/site-packages/dlr/dlr_model.py", line 79, in __init__
        self._check_call(self._lib.CreateDLRModel(byref(self.handle),
      File "/home/ubuntu/venv/lib/python3.10/site-packages/dlr/dlr_model.py", line 160, in _check_call
        raise DLRError(self._lib.DLRGetLastError().decode('ascii'))
    dlr.dlr_model.DLRError
    
    

  • It looks like there are two issues.

    1) FileNotFoundError: [Errno 2] No such file or directory: PosixPath('dot')

    Can you try the solution from here: github.com/.../36
    "sudo apt install python-pydot python-pydot-ng graphviz" 
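    After installing, a quick way to confirm the fix is to check from Python that the `dot` executable is visible on PATH, since the graphviz Python package shells out to it when rendering:

    ```python
    import shutil

    # shutil.which returns the full path of 'dot' if it is on PATH,
    # or None if the Graphviz system package is still missing.
    dot_path = shutil.which("dot")
    if dot_path is None:
        print("dot not found; install the graphviz system package")
    else:
        print("dot found at:", dot_path)
    ```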

    2) KeyError: 'CGT7X_ROOT'

    You need to install the TI C7000 compiler and set CGT7X_ROOT to its installation path. See here: https://github.com/TexasInstruments/edgeai-tidl-tools/blob/master/setup.sh#L374
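    As a sketch (the install directory and compiler version below are placeholders; use the path where the C7000 code generation tools actually landed on your machine):

    ```shell
    # Hypothetical install location; point CGT7X_ROOT at your actual
    # TI C7000 code generation tools directory.
    export CGT7X_ROOT="$HOME/ti/ti-cgt-c7000_x.y.z"

    # The timm example expects to find the compiler binary here:
    echo "$CGT7X_ROOT/bin/cl7x"
    ```

    Adding the export to your shell profile (or to the script that launches the examples) keeps it set across sessions.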

    -Yuan