PROCESSOR-SDK-AM62A: edgeai-tidl-tool compile error

Part Number: PROCESSOR-SDK-AM62A

Tool/software:

Hi,

I'm trying to compile a custom YOLOv8 model.
I can generate the ONNX and prototxt files, and I'm now trying to compile these models with edgeai-tidl-tools.
However, I'm facing the following error.

1. Environment
SDK : 9.02.00.05
Ubuntu : 22.04
edgeai-tidl-tools : Tag "09_02_09_00"

2. Error info

Could you provide information to help resolve this issue?
(I have tried installing "onnxruntime" independently and recompiling, but a different error occurred.)

Best Regards,

  • Hello Machida-san,

    This is not a typical error during edgeai-tidl-tools usage. Based on your note, it is probably a versioning issue with ONNX Runtime.

    Please supply the result of the following command so I can see your dependency versions on the Ubuntu 22.04 machine:

    pip3 freeze | grep -i "onnx"

    For example, my Python 3.10 virtual environment for TIDL 9.2 looks like the following:

    caffe2onnx==1.0.2
    onnx==1.13.0
    onnx-opcounter==0.0.3
    onnx_graphsurgeon @ git+https://github.com/NVIDIA/TensorRT@68b5072fdb9df6b6edab1392b02a705394b2e906#subdirectory=tools/onnx-graphsurgeon
    onnxruntime-tidl @ file:///home/reese/1-edgeai/1-ti-tools/1-tidl-tools/10.0-tidl-tools/onnxruntime_tidl-1.14.0%2B10000000-cp310-cp310-linux_x86_64.whl#sha256=5efb894e39d3ca988e0644a1d0e9e34eab34c1a1f374d0085b9900febbb9724d
    onnxsim==0.4.35
    -e git+https://github.com/TexasInstruments/edgeai-tidl-tools@b7b07738bcd9afc7f74580217e81c307668a84ed#egg=tidl_onnx_model_optimizer&subdirectory=scripts/osrt_model_tools/onnx_tools/tidl-onnx-model-optimizer
    
    

    Yours should list only onnxruntime-tidl, not an ordinary onnxruntime.

    You can also try running the offending command in a python REPL:

    >>> import onnxruntime
    
    >>> onnxruntime.SessionOptions
    
    <class 'onnxruntime.capi.onnxruntime_pybind11_state.SessionOptions'>
    
    >>> onnxruntime.SessionOptions()
    
    <onnxruntime.capi.onnxruntime_pybind11_state.SessionOptions object at 0x78beae5e7fb0>
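    As a cross-check that does not depend on importing the package at all, you can list which onnx-related distributions pip sees. This is a hedged sketch using only the Python standard library; the exact package names on your machine may differ, and `onnx_distributions` is a hypothetical helper, not part of the TIDL tooling:

```python
# Sketch: list installed distributions whose names mention "onnx",
# to spot a stray upstream onnxruntime sitting alongside onnxruntime-tidl.
from importlib import metadata

def onnx_distributions():
    """Return the sorted names of installed onnx-related distributions."""
    names = set()
    for dist in metadata.distributions():
        name = dist.metadata["Name"] or ""
        if "onnx" in name.lower():
            names.add(name)
    return sorted(names)

print(onnx_distributions())
# An entry named plain "onnxruntime" here would mean the upstream wheel is
# shadowing onnxruntime-tidl and should be uninstalled.
```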
    
    
    

  • Hi,

    Thank you for your reply.
    I tried the grep command. Here is the result.
    ---
    root@a5bf73fb698c:/home/root# pip3 freeze | grep -i "onnx"
    caffe2onnx @ github.com/.../tidl.zip
    onnx==1.13.0
    onnx_graphsurgeon @ git+github.com/.../TensorRT@68b5072fdb9df6b6edab1392b02a705394b2e906
    onnx_opcounter==0.0.4
    onnxruntime-tidl @ software-dl.ti.com/.../onnxruntime_tidl-1.14.0-cp310-cp310-linux_x86_64.whl
    onnxsim==0.4.35
    # Editable install with no version control (tidl_onnx_model_optimizer==9.2.0)
    -e /home/root/scripts/osrt_model_tools/onnx_tools/tidl-onnx-model-optimizer
    ---
    Note: "onnx_opcounter" is NOT installed by default, so I added it with "pip install onnx-opcounter==0.0.4".

    Also, here is the result of the Python command.

    ---
    root@a5bf73fb698c:/home/root# python3
    Python 3.10.12 (main, Nov 6 2024, 20:22:13) [GCC 11.4.0] on linux
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import onnxruntime
    >>> onnxruntime.SessionOptions
    Traceback (most recent call last):
    File "<stdin>", line 1, in <module>
    AttributeError: module 'onnxruntime' has no attribute 'SessionOptions'
    >>> exit()
    ---

    BR,

  • Hi,
    Here is additional information.
    When I install "onnxruntime" individually (version 1.20.1), I can avoid the previous error, but I get the following EP error.

    ---
    .
    .
    .
    ************ in ~tidlDelegate ************
     ************ in TIDL_subgraphRtDelete ************
     TIDL_deactivate is called with handle : b208e000
    PREEMPTION: Removing priroty object with handle = 0x7aaeb208e000 and targetPriority = 0,      Number of obejcts left are = 0, removed object with base  = 0x7aaeb268e000 and size =128
    MEM: Deinit ... !!!
    MEM: Alloc's: 25 alloc's of 128638014 bytes
    MEM: Free's : 25 free's  of 128638014 bytes
    MEM: Open's : 0 allocs  of 0 bytes
    MEM: Deinit ... Done !!!
    Available execution providers :  ['AzureExecutionProvider', 'CPUExecutionProvider']

    Running 1 Models - ['yolov8_s_syncbn_fast_8xb16-100e_coco']


    Running_Model :  yolov8_s_syncbn_fast_8xb16-100e_coco  


    Running shape inference on model ../../../models/public/best_coco_bbox_mAP_epoch_97.onnx

    /usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py:115: UserWarning: Specified provider 'TIDLCompilationProvider' is not in available provider names.Available providers: 'AzureExecutionProvider, CPUExecutionProvider'
      warnings.warn(
    *************** EP Error ***************
    EP Error Unknown Provider Type: TIDLCompilationProvider when using ['TIDLCompilationProvider', 'CPUExecutionProvider']
    Falling back to ['CPUExecutionProvider'] and retrying.
    ****************************************
    Process Process-1:
    Traceback (most recent call last):
      File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
        self.run()
      File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
        self._target(*self._args, **self._kwargs)
      File "/home/root/examples/osrt_python/ort/onnxrt_ep.py", line 239, in run_model
        imgs, output, proc_time, sub_graph_time, height, width  = infer_image(sess, input_images, config)
      File "/home/root/examples/osrt_python/ort/onnxrt_ep.py", line 135, in infer_image
        copy_time, sub_graphs_proc_time, totaltime = get_benchmark_output(sess)
      File "/home/root/examples/osrt_python/ort/onnxrt_ep.py", line 84, in get_benchmark_output
        benchmark_dict = interpreter.get_TI_benchmark_data()
    AttributeError: 'InferenceSession' object has no attribute 'get_TI_benchmark_data'
    ---

    BR,

  • Hello,

    AttributeError: 'InferenceSession' object has no attribute 'get_TI_benchmark_data'

    Yes, that is expected with the upstream / main pip install target of onnxruntime. Your installation needs to use onnxruntime-tidl instead.
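    One way to surface this failure mode explicitly, rather than crashing deep inside the benchmark helper, is a small guard. This is only an illustrative sketch; `get_ti_benchmarks` is a hypothetical wrapper name, not part of the TIDL examples, and the only fact it encodes is that `get_TI_benchmark_data` exists on the onnxruntime-tidl fork's `InferenceSession` but not on upstream onnxruntime:

```python
# Sketch: fail with a clear message when the TI-specific benchmark API is
# missing, which indicates upstream onnxruntime rather than onnxruntime-tidl.
def get_ti_benchmarks(session):
    fn = getattr(session, "get_TI_benchmark_data", None)
    if fn is None:
        raise RuntimeError(
            "InferenceSession lacks get_TI_benchmark_data: you are likely "
            "running upstream onnxruntime instead of onnxruntime-tidl"
        )
    return fn()
```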

    Can you try deleting the tidl_tools directory in your edgeai-tidl-tools checkout, uninstalling all versions of onnx / onnxruntime, and re-running the "setup.sh" installation script? Please also ensure that the TIDL_TOOLS_PATH environment variable is not set, since setup.sh may skip the install if it is already set.
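    Before re-running setup.sh, a quick pre-flight check can confirm the shell will not short-circuit the install. This is a hedged sketch; `tidl_env_is_clean` is a hypothetical helper, and the only fact it encodes from the advice above is that a set TIDL_TOOLS_PATH may cause setup.sh to skip the tools install:

```python
import os

def tidl_env_is_clean(environ=None):
    """True if no stale TIDL_TOOLS_PATH would make setup.sh skip the install."""
    if environ is None:
        environ = os.environ
    return "TIDL_TOOLS_PATH" not in environ

# Run this in the same shell you will launch setup.sh from.
print(tidl_env_is_clean())
```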

    > Note: "onnx_opcounter" is NOT installed in default, so I added this by using "pip install "onnx-opcounter==0.0.4"

    No issue there, this is a package I must have installed for my own testing. It is not needed by TIDL.

    Otherwise, I cannot reproduce your error. I set up a clean virtual environment with Python and downloaded the .whl file from the link in your 'pip freeze' output. When I install it, the onnxruntime.SessionOptions attribute is present. I will suggest again doing a clean installation of these packages in a virtual/docker environment.

    • I note in your logs that you are the 'root' user on an odd hostname. Perhaps this is already Docker? If so, please describe how you set up Docker for this project.

    BR,
    Reese

  • Hi Reese-san,

    I tried uninstalling the onnx packages and re-running setup.sh. However, I got an error when I ran the "cmake ../examples && make -j && cd .." command.
    So I cleaned my environment and re-created the Docker environment.
    After that, I was able to generate the files.

    >Perhaps this is docker already? If so, please describe how you setup the docker for this project.
    Yes. My environment is in a Docker container. I followed the guide below to set up the Docker environment.
    https://github.com/TexasInstruments/edgeai-tidl-tools/blob/09_02_09_00/docs/advanced_setup.md#docker-based-setup-for-x86_pc

    For now, I was able to generate the files, so I will close this thread. However, I am facing another issue when running the models on the AM62A EVM,
    so please continue to follow up in the following thread.
    https://e2e.ti.com/support/processors-group/processors/f/processors-forum/1459062/processor-sdk-am62a-yolov8-deployment-issue

    Best Regards,