This thread has been locked.


TDA4VM: YOLOv8 ONNX Compilation in TDA4VM Evaluation Board:

Part Number: TDA4VM



I am following this guide https://github.com/TexasInstruments/edgeai-tidl-tools/blob/master/docs/custom_model_evaluation.md#custom-model-evaluation to compile my trained YOLOv8 model and make it ready for inference on the TDA4VM board with a RealSense camera. When I run onnxrt_ep.py after making the required changes for the trained ONNX model in model_configs.py, I am facing the following issue:


jayant_14@Jayant14:~/edgeai-tidl-tools/examples/osrt_python/ort$ python3 onnxrt_ep.py
Available execution providers : ['TIDLExecutionProvider', 'TIDLCompilationProvider', 'CPUExecutionProvider']

Running 5 Models - ['cl-ort-resnet18-v1', 'cl-ort-caffe_squeezenet_v1_1', 'ss-ort-deeplabv3lite_mobilenetv2', 'od-ort-ssd-lite_mobilenetv2_fpn', 'yolov8s6_640_onnx']


Running_Model : cl-ort-resnet18-v1


Running_Model : cl-ort-caffe_squeezenet_v1_1


Running_Model : ss-ort-deeplabv3lite_mobilenetv2


Running_Model : od-ort-ssd-lite_mobilenetv2_fpn


Running_Model : yolov8s6_640_onnx

libtidl_onnxrt_EP loaded 0x557c2fc3c440
2023-11-03 17:08:52.947156290 [E:onnxruntime:, inference_session.cc:1311 operator()] Exception during initialization: basic_string::_M_construct null not valid
Process Process-2:
Traceback (most recent call last):
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
self.run()
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/home/jayant_14/edgeai-tidl-tools/examples/osrt_python/ort/onnxrt_ep.py", line 194, in run_model
sess = rt.InferenceSession(config['model_path'] ,providers=EP_list, provider_options=[delegate_options, {}], sess_options=so)
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 283, in __init__
self._create_inference_session(providers, provider_options)
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 315, in _create_inference_session
sess.initialize_session(providers, provider_options)
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: basic_string::_M_construct null not valid
libtidl_onnxrt_EP loaded 0x557c2febe920
2023-11-03 17:08:53.135978865 [E:onnxruntime:, inference_session.cc:1311 operator()] Exception during initialization: std::bad_alloc
Process Process-3:
Traceback (most recent call last):
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
self.run()
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/home/jayant_14/edgeai-tidl-tools/examples/osrt_python/ort/onnxrt_ep.py", line 194, in run_model
sess = rt.InferenceSession(config['model_path'] ,providers=EP_list, provider_options=[delegate_options, {}], sess_options=so)
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 283, in __init__
self._create_inference_session(providers, provider_options)
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 315, in _create_inference_session
sess.initialize_session(providers, provider_options)
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: std::bad_alloc
libtidl_onnxrt_EP loaded 0x557c305611c0
Process Process-5:
Traceback (most recent call last):
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
self.run()
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/home/jayant_14/edgeai-tidl-tools/examples/osrt_python/ort/onnxrt_ep.py", line 194, in run_model
sess = rt.InferenceSession(config['model_path'] ,providers=EP_list, provider_options=[delegate_options, {}], sess_options=so)
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 283, in __init__
self._create_inference_session(providers, provider_options)
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 310, in _create_inference_session
sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from /home/jayant_14/edgeai-tidl-tools/models/public/openvino.onnx failed:/home/a0496663/work/edgeaitidltools/rel90/onnx/onnxruntime_bit/onnxruntime/onnxruntime/core/graph/model.cc:111 onnxruntime::Model::Model(onnx::ModelProto&&, const PathString&, const IOnnxRuntimeOpSchemaRegistryList*, const onnxruntime::logging::Logger&) Unknown model file format version.

2023-11-03 17:08:53.463571139 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer4.1.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.463782148 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer4.0.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.463793189 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer3.0.downsample.1.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.463800192 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer4.0.downsample.1.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.463806905 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer3.0.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.463816844 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer3.1.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.463829828 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer2.0.downsample.1.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.463841480 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer3.1.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.463851069 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer1.0.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.463862190 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer1.1.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.463875184 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.463887978 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer1.0.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.463898037 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer2.1.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.463962299 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer1.1.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.464002786 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer4.1.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.464043753 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer4.0.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.464102735 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer2.0.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.464171224 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer3.0.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.464211981 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer2.0.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
2023-11-03 17:08:53.464253500 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer2.1.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
libtidl_onnxrt_EP loaded 0x557c2fcda7a0
2023-11-03 17:08:53.581464433 [E:onnxruntime:, inference_session.cc:1311 operator()] Exception during initialization: basic_string::_M_construct null not valid
Process Process-4:
Traceback (most recent call last):
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
self.run()
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/home/jayant_14/edgeai-tidl-tools/examples/osrt_python/ort/onnxrt_ep.py", line 194, in run_model
sess = rt.InferenceSession(config['model_path'] ,providers=EP_list, provider_options=[delegate_options, {}], sess_options=so)
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 283, in __init__
self._create_inference_session(providers, provider_options)
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 315, in _create_inference_session
sess.initialize_session(providers, provider_options)
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: basic_string::_M_construct null not valid
2023-11-03 17:08:53.711376396 [E:onnxruntime:, inference_session.cc:1311 operator()] Exception during initialization: basic_string::_M_construct null not valid
Process Process-1:
Traceback (most recent call last):
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
self.run()
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/home/jayant_14/edgeai-tidl-tools/examples/osrt_python/ort/onnxrt_ep.py", line 194, in run_model
sess = rt.InferenceSession(config['model_path'] ,providers=EP_list, provider_options=[delegate_options, {}], sess_options=so)
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 283, in __init__
self._create_inference_session(providers, provider_options)
File "/home/jayant_14/.pyenv/versions/3.10.13/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 315, in _create_inference_session
sess.initialize_session(providers, provider_options)
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: basic_string::_M_construct null not valid
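
For context, the model_configs.py change for the custom model followed the pattern of the stock entries. The sketch below is illustrative only: the key names mirror existing entries in examples/osrt_python/model_configs.py, but the exact fields can differ between edgeai-tidl-tools releases, and the values shown are assumptions, not a verified recipe.

```python
# Hypothetical model_configs.py entry for a custom YOLOv8 model.
# Key names follow the pattern of the stock entries in
# examples/osrt_python/model_configs.py; exact fields may vary
# between edgeai-tidl-tools releases.
models_configs = {
    "yolov8s6_640_onnx": {
        "model_path": "../../../models/public/yolov8s6_640.onnx",  # trained ONNX file
        "mean": [0, 0, 0],                 # per-channel mean subtracted from the input
        "scale": [0.003921568627] * 3,     # 1/255 scaling applied to each channel
        "num_images": 100,                 # calibration images used during compilation
        "model_type": "od",                # object-detection post-processing
        "session_name": "onnxrt",          # run through the ONNX Runtime session
    }
}
```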


  • Hi,

    Can you share the model compilation logs?

  • I have added the changes for the trained YOLOv8 model in model_configs.py, then ran onnxrt_ep.py and got the logs shown in my previous post.

  • jayant_14@Jayant14:~/edgeai-tidl-tools/examples/osrt_python/ort$ python3 onnxrt_ep.py

    From the above command, it appears that you are invoking a call to the inference session.

    For inference to happen, the model artifacts should already be present so that they can be consumed as part of the inference process.

    So could you please help me with the model compilation logs?

    Please share the logs in a text file.

    Thank you
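
    As a quick self-check before attempting inference, a minimal sketch like the one below can confirm whether the artifacts folder is populated. The model-artifacts/<model-name>/ layout is an assumption about where a compilation run writes its output; adjust the path to your setup.

```python
import os

def artifacts_ready(artifacts_dir):
    """Return True only if the artifacts folder exists and is non-empty.

    The directory layout is an assumption: compiled TIDL artifacts are
    normally written under model-artifacts/<model-name>/ in the
    edgeai-tidl-tools tree after a compilation run.
    """
    return os.path.isdir(artifacts_dir) and bool(os.listdir(artifacts_dir))
```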

  • I am starting the whole process over. I am now facing this issue when running onnxrt_ep.py with the stock model_configs.py and the already-compiled model zoo models of the base repository, without adding the new custom model to the model artifacts.

    root@bba75ef9e626:/home/root/examples/osrt_python/ort# python3 onnxrt_ep.py 
    Available execution providers :  ['TIDLExecutionProvider', 'TIDLCompilationProvider', 'CPUExecutionProvider']
    
    Running 4 Models - ['cl-ort-resnet18-v1', 'cl-ort-caffe_squeezenet_v1_1', 'ss-ort-deeplabv3lite_mobilenetv2', 'od-ort-ssd-lite_mobilenetv2_fpn']
    
    
    Running_Model :  cl-ort-resnet18-v1  
    
    Downloading   ../../../models/public/resnet18_opset9.onnx
    
    Running_Model :  cl-ort-caffe_squeezenet_v1_1  
    
    Downloading   ../../../models/public/caffe_squeezenet_v1_1.prototxt
    
    Running_Model :  ss-ort-deeplabv3lite_mobilenetv2  
    
    Downloading   ../../../models/public/deeplabv3lite_mobilenetv2.onnx
    
    Running_Model :  od-ort-ssd-lite_mobilenetv2_fpn  
    
    Downloading   ../../../models/public/ssd-lite_mobilenetv2_fpn.onnx
    Downloading   ../../../models/public/caffe_squeezenet_v1_1.caffemodel
    Converted model is valid!
    Converted model is valid!
    Downloading   ../../../models/public/ssd-lite_mobilenetv2_fpn.prototxt
    libtidl_onnxrt_EP loaded 0x557b6b94c800 
    2023-12-05 19:30:01.996399452 [E:onnxruntime:, inference_session.cc:1311 operator()] Exception during initialization: std::bad_alloc
    Process Process-3:
    Traceback (most recent call last):
      File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
        self.run()
      File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
        self._target(*self._args, **self._kwargs)
      File "/home/root/examples/osrt_python/ort/onnxrt_ep.py", line 194, in run_model
        sess = rt.InferenceSession(config['model_path'] ,providers=EP_list, provider_options=[delegate_options, {}], sess_options=so)
      File "/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 283, in __init__
        self._create_inference_session(providers, provider_options)
      File "/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 315, in _create_inference_session
        sess.initialize_session(providers, provider_options)
    onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: std::bad_alloc
    libtidl_onnxrt_EP loaded 0x557b6d2aa7c0 
    2023-12-05 19:30:02.573986764 [E:onnxruntime:, inference_session.cc:1311 operator()] Exception during initialization: basic_string::_M_create
    Process Process-4:
    Traceback (most recent call last):
      File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
        self.run()
      File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
        self._target(*self._args, **self._kwargs)
      File "/home/root/examples/osrt_python/ort/onnxrt_ep.py", line 194, in run_model
        sess = rt.InferenceSession(config['model_path'] ,providers=EP_list, provider_options=[delegate_options, {}], sess_options=so)
      File "/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 283, in __init__
        self._create_inference_session(providers, provider_options)
      File "/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 315, in _create_inference_session
        sess.initialize_session(providers, provider_options)
    onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: basic_string::_M_create
    Converted model is valid!
    2023-12-05 19:30:03.992955447 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer4.1.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993018067 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer4.0.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993025845 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer3.0.downsample.1.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993032642 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer4.0.downsample.1.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993037354 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer3.0.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993043573 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer3.1.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993054154 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer2.0.downsample.1.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993058735 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer3.1.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993064982 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer1.0.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993071226 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer1.1.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993076145 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993081227 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer1.0.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993087948 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer2.1.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993092371 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer1.1.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993098166 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer4.1.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993102311 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer4.0.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993106752 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer2.0.bn1.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993113245 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer3.0.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993118883 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer2.0.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
    2023-12-05 19:30:03.993125108 [W:onnxruntime:, graph.cc:3106 CleanUnusedInitializers] Removing initializer 'layer2.1.bn2.num_batches_tracked'. It is not used by any node and should be removed from the model.
    libtidl_onnxrt_EP loaded 0x557b6b4a9860 
    2023-12-05 19:30:04.126811855 [E:onnxruntime:, inference_session.cc:1311 operator()] Exception during initialization: basic_string::_M_construct null not valid
    Process Process-1:
    Traceback (most recent call last):
      File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
        self.run()
      File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
        self._target(*self._args, **self._kwargs)
      File "/home/root/examples/osrt_python/ort/onnxrt_ep.py", line 194, in run_model
        sess = rt.InferenceSession(config['model_path'] ,providers=EP_list, provider_options=[delegate_options, {}], sess_options=so)
      File "/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 283, in __init__
        self._create_inference_session(providers, provider_options)
      File "/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 315, in _create_inference_session
        sess.initialize_session(providers, provider_options)
    onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: basic_string::_M_construct null not valid
    caffemodel was successfully loaded
    add model input information
    add model output information and model intermediate output information
    *.onnx model conversion completed
    removing not constant initializers from model
    frozen graph has been created
    the model has been successfully saved to ../../../models/public/caffe_squeezenet_v1_1.onnx
    Converted model is valid!
    libtidl_onnxrt_EP loaded 0x557b6b744c90 
    2023-12-05 19:30:04.704848791 [E:onnxruntime:, inference_session.cc:1311 operator()] Exception during initialization: basic_string::_M_construct null not valid
    Process Process-2:
    Traceback (most recent call last):
      File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
        self.run()
      File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
        self._target(*self._args, **self._kwargs)
      File "/home/root/examples/osrt_python/ort/onnxrt_ep.py", line 194, in run_model
        sess = rt.InferenceSession(config['model_path'] ,providers=EP_list, provider_options=[delegate_options, {}], sess_options=so)
      File "/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 283, in __init__
        self._create_inference_session(providers, provider_options)
      File "/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 315, in _create_inference_session
        sess.initialize_session(providers, provider_options)
    onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: basic_string::_M_construct null not valid
    
    
    

    I also faced the following issue when running python3 tflrt_delegate.py with the base models of the base repo.

    root@bba75ef9e626:/home/root/examples/osrt_python/tfl# python3 tflrt_delegate.py 
    Running 4 Models - ['cl-tfl-mobilenet_v1_1.0_224', 'ss-tfl-deeplabv3_mnv2_ade20k_float', 'od-tfl-ssd_mobilenet_v2_300_float', 'od-tfl-ssdlite_mobiledet_dsp_320x320_coco']
    
    
    Running_Model :  cl-tfl-mobilenet_v1_1.0_224
    Downloading   ../../../models/public/mobilenet_v1_1.0_224.tflite
    
    Running_Model :  ss-tfl-deeplabv3_mnv2_ade20k_float
    Downloading   ../../../models/public/deeplabv3_mnv2_ade20k_float.tflite
    
    Running_Model :  od-tfl-ssd_mobilenet_v2_300_float
    Downloading   ../../../models/public/ssd_mobilenet_v2_300_float.tflite
    
    Running_Model :  od-tfl-ssdlite_mobiledet_dsp_320x320_coco
    Downloading   ../../../models/public/ssdlite_mobiledet_dsp_320x320_coco_20200519.tflite
    /home/root/models/public/ssdlite_mobiledet_dsp_320x320_coco_20200519.tflite
    Process Process-4:
    Traceback (most recent call last):
      File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
        self.run()
      File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
        self._target(*self._args, **self._kwargs)
      File "/home/root/examples/osrt_python/tfl/tflrt_delegate.py", line 133, in run_model
        download_model(models_configs, model)
      File "/home/root/examples/osrt_python/common_utils.py", line 206, in download_model
        tflOpt.tidlTfliteModelOptimize(abs_path,abs_path, scale, mean)
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model_opt.py", line 113, in tidlTfliteModelOptimize
        modelT = tflite_model.Model.ModelT.InitFromObj(model)
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model/Model.py", line 212, in InitFromObj
        x._UnPack(model)
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model/Model.py", line 219, in _UnPack
        self.version = model.Version()
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model/Model.py", line 28, in Version
        o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4))
      File "/usr/local/lib/python3.10/dist-packages/flatbuffers/table.py", line 37, in Offset
        vtable = self.Pos - self.Get(N.SOffsetTFlags, self.Pos)
      File "/usr/local/lib/python3.10/dist-packages/flatbuffers/table.py", line 93, in Get
        return flags.py_type(encode.Get(flags.packer_type, self.Bytes, off))
      File "/usr/local/lib/python3.10/dist-packages/flatbuffers/encode.py", line 26, in Get
        return packer_type.unpack_from(memoryview_type(buf), head)[0]
    struct.error: unpack_from requires a buffer of at least 544501586 bytes for unpacking 4 bytes at offset 544501582 (actual buffer size is 10)
    /home/root/models/public/deeplabv3_mnv2_ade20k_float.tflite
    Process Process-2:
    Traceback (most recent call last):
      File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
        self.run()
      File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
        self._target(*self._args, **self._kwargs)
      File "/home/root/examples/osrt_python/tfl/tflrt_delegate.py", line 133, in run_model
        download_model(models_configs, model)
      File "/home/root/examples/osrt_python/common_utils.py", line 206, in download_model
        tflOpt.tidlTfliteModelOptimize(abs_path,abs_path, scale, mean)
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model_opt.py", line 113, in tidlTfliteModelOptimize
        modelT = tflite_model.Model.ModelT.InitFromObj(model)
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model/Model.py", line 212, in InitFromObj
        x._UnPack(model)
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model/Model.py", line 219, in _UnPack
        self.version = model.Version()
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model/Model.py", line 28, in Version
        o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4))
      File "/usr/local/lib/python3.10/dist-packages/flatbuffers/table.py", line 37, in Offset
        vtable = self.Pos - self.Get(N.SOffsetTFlags, self.Pos)
      File "/usr/local/lib/python3.10/dist-packages/flatbuffers/table.py", line 93, in Get
        return flags.py_type(encode.Get(flags.packer_type, self.Bytes, off))
      File "/usr/local/lib/python3.10/dist-packages/flatbuffers/encode.py", line 26, in Get
        return packer_type.unpack_from(memoryview_type(buf), head)[0]
    struct.error: unpack_from requires a buffer of at least 544501586 bytes for unpacking 4 bytes at offset 544501582 (actual buffer size is 10)
    /home/root/models/public/ssd_mobilenet_v2_300_float.tflite
    Process Process-3:
    Traceback (most recent call last):
      File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
        self.run()
      File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
        self._target(*self._args, **self._kwargs)
      File "/home/root/examples/osrt_python/tfl/tflrt_delegate.py", line 133, in run_model
        download_model(models_configs, model)
      File "/home/root/examples/osrt_python/common_utils.py", line 206, in download_model
        tflOpt.tidlTfliteModelOptimize(abs_path,abs_path, scale, mean)
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model_opt.py", line 113, in tidlTfliteModelOptimize
        modelT = tflite_model.Model.ModelT.InitFromObj(model)
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model/Model.py", line 212, in InitFromObj
        x._UnPack(model)
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model/Model.py", line 219, in _UnPack
        self.version = model.Version()
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model/Model.py", line 28, in Version
        o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4))
      File "/usr/local/lib/python3.10/dist-packages/flatbuffers/table.py", line 37, in Offset
        vtable = self.Pos - self.Get(N.SOffsetTFlags, self.Pos)
      File "/usr/local/lib/python3.10/dist-packages/flatbuffers/table.py", line 93, in Get
        return flags.py_type(encode.Get(flags.packer_type, self.Bytes, off))
      File "/usr/local/lib/python3.10/dist-packages/flatbuffers/encode.py", line 26, in Get
        return packer_type.unpack_from(memoryview_type(buf), head)[0]
    struct.error: unpack_from requires a buffer of at least 544501586 bytes for unpacking 4 bytes at offset 544501582 (actual buffer size is 10)
    /home/root/models/public/mobilenet_v1_1.0_224.tflite
    Process Process-1:
    Traceback (most recent call last):
      File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
        self.run()
      File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
        self._target(*self._args, **self._kwargs)
      File "/home/root/examples/osrt_python/tfl/tflrt_delegate.py", line 133, in run_model
        download_model(models_configs, model)
      File "/home/root/examples/osrt_python/common_utils.py", line 206, in download_model
        tflOpt.tidlTfliteModelOptimize(abs_path,abs_path, scale, mean)
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model_opt.py", line 113, in tidlTfliteModelOptimize
        modelT = tflite_model.Model.ModelT.InitFromObj(model)
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model/Model.py", line 212, in InitFromObj
        x._UnPack(model)
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model/Model.py", line 219, in _UnPack
        self.version = model.Version()
      File "/home/root/scripts/osrt_model_tools/tflite_tools/tflite_model/Model.py", line 28, in Version
        o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4))
      File "/usr/local/lib/python3.10/dist-packages/flatbuffers/table.py", line 37, in Offset
        vtable = self.Pos - self.Get(N.SOffsetTFlags, self.Pos)
      File "/usr/local/lib/python3.10/dist-packages/flatbuffers/table.py", line 93, in Get
        return flags.py_type(encode.Get(flags.packer_type, self.Bytes, off))
      File "/usr/local/lib/python3.10/dist-packages/flatbuffers/encode.py", line 26, in Get
        return packer_type.unpack_from(memoryview_type(buf), head)[0]
    struct.error: unpack_from requires a buffer of at least 544501586 bytes for unpacking 4 bytes at offset 544501582 (actual buffer size is 10)
    ^[C^CTraceback (most recent call last):
      File "/home/root/examples/osrt_python/tfl/tflrt_delegate.py", line 278, in <module>
        nthreads = join_one(nthreads)
      File "/home/root/examples/osrt_python/tfl/tflrt_delegate.py", line 260, in join_one
        sem.acquire()
    KeyboardInterrupt
    



    Please help me resolve this.

  • Hi,

    As I see from the above logs, it appears that you are calling the OSRT inference directly.

    You should first compile the model to generate the model artifacts, and then call the inference session once those artifacts have been generated.

    You can read more about this here: https://github.com/TexasInstruments/edgeai-tidl-tools/blob/master/examples/osrt_python/README.md#model-compilation

    I would recommend trying the standard models first to build a solid understanding of the supported features, and then gradually migrating to the custom model.
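
    For reference, the usual two-step flow in examples/osrt_python/ort is sketched below. The -c compile flag is taken from the osrt_python README; please verify the exact option name against your edgeai-tidl-tools release.

```shell
# Step 1: compile - runs calibration and writes the TIDL model artifacts
# (by default under model-artifacts/ in the repository tree).
cd examples/osrt_python/ort
python3 onnxrt_ep.py -c

# Step 2: inference - consumes the artifacts generated in step 1.
python3 onnxrt_ep.py
```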