EDGE-AI-STUDIO: Need clarification on creating a model using edgeai-modelmaker

Part Number: EDGE-AI-STUDIO

Hi Experts,

I am trying to create a custom model using https://github.com/TexasInstruments/edgeai-modelmaker. I can't get run_convert_dataset.sh to run: the first error was that it could not locate one of the images, so I deleted that image and its annotation from the JSON file. The only line I have uncommented in that script is the one below.

python3 ./scripts/convert_dataset.py --source_format=labelstudio_classification --source_anno=./data/sample/sample.json --source_data=./data/sample/new --dest_anno=./data/sample/labels.json
 

But when I run it, I get the error shown below.

argv: ['./scripts/convert_dataset.py', '--source_format=labelstudio_classification', '--source_anno=./data/sample/sample.json', '--source_data=./data/sample/new', '--dest_anno=./data/sample/labels.json']
Traceback (most recent call last):
  File "/home/user/modelmaker/edgeai-modelmaker/./scripts/convert_dataset.py", line 414, in <module>
    main(args)
  File "/home/user/modelmaker/edgeai-modelmaker/./scripts/convert_dataset.py", line 378, in main
    convert_labelstudio_classification(args)
  File "/home/user/modelmaker/edgeai-modelmaker/./scripts/convert_dataset.py", line 248, in convert_labelstudio_classification
    dataset_json_min = json.load(afp)
  File "/home/user/.pyenv/versions/3.10.16/lib/python3.10/json/__init__.py", line 293, in load
    return loads(fp.read(),
  File "/home/user/.pyenv/versions/3.10.16/lib/python3.10/json/__init__.py", line 335, in loads
    raise JSONDecodeError("Unexpected UTF-8 BOM (decode using utf-8-sig)",
json.decoder.JSONDecodeError: Unexpected UTF-8 BOM (decode using utf-8-sig): line 1 column 1 (char 0)

How can I solve this issue?
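
One workaround I am considering (not sure if it is the intended fix) is to re-save the exported JSON without the BOM before running the script, roughly like this (the file path is taken from my command above):

# Sketch: re-save the Label Studio export without the UTF-8 BOM.
# The 'utf-8-sig' codec strips the BOM on read, as the error message suggests.
import json

src = "./data/sample/sample.json"
with open(src, encoding="utf-8-sig") as fp:
    data = json.load(fp)
with open(src, "w", encoding="utf-8") as fp:
    json.dump(data, fp, ensure_ascii=False)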

Steps I did:
Downloaded and installed everything required, following the GitHub page.
Annotated some images with Label Studio and exported the annotations in JSON-MIN format.
Copied the images and the JSON file into data/sample in the modelmaker directory.
Ran run_convert_dataset.sh.

Please help me to solve this issue.

Best Regards,
Sajan

  • Hi,
    Can I use the JSON files included in an export from Model Composer?
    Regards,
    Sajan

  • Hi Sajan,

    You can use a dataset exported from Model Composer as a dataset input for Model Maker. 

    I am going to direct this to the team that supports Model Maker to address the error that you came across.

    Martin

  • Hello Martin,

    Could you please connect me with the support team as soon as possible?

    Best Regards,
    Sajan

  • Hello Manu,
    Thanks for your reply. I had previously read that Label Studio is needed for annotation.
    But I cannot find that step in this repository.

    Best Regards,
    Sajan

  • One more thing: I have annotated a large image set in Model Composer. Can I export that trained model to Model Maker?

  • Yes - you can download the dataset that you annotated in Model Composer and use it in Model Maker.

  • I get errors when running ./setup_cpu.sh:

    ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
    openxlab 0.1.2 requires setuptools~=60.2.0, but you have setuptools 73.0.0 which is incompatible.
    
    ERROR: Cannot install -r requirements_freeze.txt (line 55), -r requirements_freeze.txt (line 80) and onnx==1.14.0 because these package versions have conflicting dependencies.
    
    ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
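
    To help diagnose the conflict, I was going to print which versions actually ended up installed with a small script like the one below (package names taken from the error messages above; just a sketch):

    # Sketch: print the installed versions of the packages named in the pip errors.
    from importlib.metadata import version, PackageNotFoundError

    for pkg in ("setuptools", "openxlab", "onnx"):
        try:
            print(pkg, version(pkg))
        except PackageNotFoundError:
            print(pkg, "not installed")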

    Thanks. Training is now at epoch 15 and continuing, even though setup_cpu.sh reported the version-conflict errors above. After training, can I copy the model created under ./data/projects to opt/model_zoo/<directory> on the AM62A, or is some other step needed?
    I never explicitly set the SDK version to 10.1. Is this model compatible with SDK 10.1? I did not see an SDK version in the config or anywhere else.

    Best Regards,
    Sajan

  • Hi,

    After successful training, compilation fails with the errors below.

    (py310) user@M122DTRV:~/modelmaker/edgeai-tensorlab/edgeai-modelmaker$ ./run_modelmaker.sh AM62A config_classification.yaml 
    Number of AVX cores detected in PC: 12
    AVX compilation speedup in PC     : 1
    Target device                     : AM62A
    PYTHONPATH                        : .:
    TIDL_TOOLS_PATH                   : ../edgeai-benchmark/tools/tidl_tools_package/AM62A/tidl_tools
    LD_LIBRARY_PATH                   : ../edgeai-benchmark/tools/tidl_tools_package/AM62A/tidl_tools:
    argv: ['./scripts/run_modelmaker.py', 'config_classification.yaml', '--target_device', 'AM62A']
    ---------------------------------------------------------------------
    INFO: ModelMaker - task_type:classification model_name:regnet_x_800mf dataset_name:DMS run_name:20250401-123741/regnet_x_800mf
    - Model: regnet_x_800mf
    - TargetDevices & Estimated Inference Times (ms): {'TDA4VM': 2.95, 'AM62A': 5.95, 'AM67A': '5.95 (with 1/2 device capability)', 'AM68A': 2.92, 'AM69A': '2.85 (with 1/4th device capability)'}
    - This model can be compiled for the above device(s).
    ---------------------------------------------------------------------
    downloading from https://download.pytorch.org/models/regnet_x_800mf-ad17e45c.pth to ./data/downloads/pretrained/torch/hub/checkpoints/regnet_x_800mf-ad17e45c.pth
    100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 29302593/29302593 [00:01<00:00, 15028495.50B/s]
    assuming the given download_url is a valid path: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/datasets/DMS
    INFO: ModelMaker - dataset split sizes {'train': 8799, 'val': 2253}
    INFO: ModelMaker - max_num_files is set to: 10000
    INFO: ModelMaker - dataset split sizes are limited to: {'train': 8000, 'val': 2000}
    INFO: ModelMaker - dataset loading OK
    INFO: ModelMaker - run params is at: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250401-123741/regnet_x_800mf/run.yaml
    INFO: ModelMaker - running training - for detailed info see the log file: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250401-123741/regnet_x_800mf/training/run.log
    TASKS TOTAL=1, NUM_RUNNING=1:   0%|                                                        | 0/1 [2:15:45<?, ?it/s, postfix={'RUNNING': ['20250401-123741/regnet_x_800mf:training'], 'COMPLETED': []}]TASKS TOTAL=1, NUM_RUNNING=1:   0%|                                                        | 0/1 [5:16:51<?, ?it/s, postfix={'RUNNING': ['20250401-123741/regnet_x_800mf:training'], 'COMPLETED': []}]
    TASKS TOTAL=1, NUM_RUNNING=0: 100%|██████████████████████████████████████████████████████████████████████| 1/1 [8:45:37<00:00, 31537.33s/it, postfix={'RUNNING': [], 'COMPLETED': ['regnet_x_800mf']}]
    Trained model is at: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250401-123741/regnet_x_800mf/training
    
    SUCCESS: ModelMaker - Training completed.
    INFO: ModelMaker - running compilation - for detailed info see the log file: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250401-123741/regnet_x_800mf/compilation/AM62A/work/cl-6170/run.log
    
    INFO:20250401-212324: number of configs - 1
    TASKS TOTAL=1, NUM_RUNNING=1:   0%|                                                                                   | 0/1 [00:02<?, ?it/s, postfix={'RUNNING': ['cl-6170:import'], 'COMPLETED': []}]
    ERROR:20250401-212327: model_id:cl-6170 run_import:True run_inference:False - [ONNXRuntimeError] : 1 : FAIL : Non-zero status code returned while running TIDL_0 node. Name:'TIDLExecutionProvider_TIDL_0_0' Status Message: TIDL Compute Import Failed.
    Traceback (most recent call last):
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/pipeline_runner.py", line 273, in _run_pipeline
        result = cls._run_pipeline_impl(settings, pipeline_config, description)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/pipeline_runner.py", line 308, in _run_pipeline_impl
        result = accuracy_pipeline(description)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 79, in __call__
        param_result = self._run(description=description)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 104, in _run
        self._import_model(description)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 168, in _import_model
        output, info_dict = session.run_import(input_data, info_dict)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/sessions/onnxrt_session.py", line 62, in run_import
        output = ONNXRuntimeWrapper.run_import(self, input_data)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/core/onnxrt_runtime.py", line 58, in run_import
        output = self._run(input_data, output_keys)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/core/onnxrt_runtime.py", line 91, in _run
        outputs = self.interpreter.run(output_keys, input_data)
      File "/home/user/.pyenv/versions/py310/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 217, in run
        return self._sess.run(output_names, input_feed, run_options)
    onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Non-zero status code returned while running TIDL_0 node. Name:'TIDLExecutionProvider_TIDL_0_0' Status Message: TIDL Compute Import Failed.
    WARNING: terminating the process - cl-6170:import - Status Message: TIDL Compute Import Failed
    TASKS TOTAL=1, NUM_RUNNING=1:   0%|                                                                                    | 0/1 [00:04<?, ?it/s, postfix={'RUNNING': ['cl-6170:infer'], 'COMPLETED': []}]
    ERROR:20250401-212331: Error occurred: cl-6170:infer - Error Code: -11 at /home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/utils/parallel_runner.py
    WARNING: terminating the process - cl-6170:infer - Status Message: TIDL Compute Import Failed
    TASKS TOTAL=1, NUM_RUNNING=0: 100%|██████████████████████████████████████████████████████████████████████████████████| 1/1 [00:07<00:00,  2.00s/it, postfix={'RUNNING': [], 'COMPLETED': ['cl-6170']}]
    WARNING: Benchmark - completed: 0/1
    TASKS TOTAL=1, NUM_RUNNING=0: 100%|██████████████████████████████████████████████████████████████████████████████████| 1/1 [00:07<00:00,  7.28s/it, postfix={'RUNNING': [], 'COMPLETED': ['cl-6170']}]
    INFO: packaging artifacts to /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250401-123741/regnet_x_800mf/compilation/AM62A/pkg please wait...
    WARNING:20250401-212331: could not package - /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250401-123741/regnet_x_800mf/compilation/AM62A/work/cl-6170
    Traceback (most recent call last):
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/./scripts/run_modelmaker.py", line 153, in <module>
        main(config)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/./scripts/run_modelmaker.py", line 88, in main
        model_runner.run()
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/edgeai_modelmaker/ai_modules/vision/runner.py", line 193, in run
        self.model_compilation.run()
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/edgeai_modelmaker/ai_modules/vision/compilation/edgeai_benchmark.py", line 163, in run
        edgeai_benchmark.interfaces.package_artifacts(self.settings, self.work_dir, out_dir=self.package_dir, custom_model=True)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/interfaces/run_package.py", line 271, in package_artifacts
        with open(os.path.join(out_dir,'artifacts.yaml'), 'w') as fp:
    FileNotFoundError: [Errno 2] No such file or directory: '/home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250401-123741/regnet_x_800mf/compilation/AM62A/pkg/artifacts.yaml'

    From run.log

    2025-04-01 21:23:27.065009786 [E:onnxruntime:, sequential_executor.cc:514 ExecuteKernel] Non-zero status code returned while running TIDL_0 node. Name:'TIDLExecutionProvider_TIDL_0_0' Status Message: TIDL Compute Import Failed.


    One more question: can I run only the compilation step, skipping the training that has already completed successfully, by modifying config_classification.yaml or some other file?

    Best Regards,
    Sajan

  • Hi,

    I upgraded onnxruntime to 1.21.0. Now the error is with TIDLCompilationProvider. From searching, it appears that two exports pointing to edgeai-tivox-modules are needed, but there is no such directory in edgeai-tensorlab. The detailed log is below.

    Terminal Print
    
    SUCCESS: ModelMaker - Training completed.
    INFO: ModelMaker - running compilation - for detailed info see the log file: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250402-172810/regnet_x_800mf/compilation/AM62A/work/cl-6170/run.log
    
    INFO:20250402-173443: number of configs - 1
    TASKS TOTAL=1, NUM_RUNNING=1:   0%|                                                                                    | 0/1 [00:21<?, ?it/s, postfix={'RUNNING': ['cl-6170:infer'], 'COMPLETED': []}]
    ERROR:20250402-173507: model_id:cl-6170 run_import:False run_inference:True - unsupported operand type(s) for *: 'NoneType' and 'float'
    Traceback (most recent call last):
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/pipeline_runner.py", line 273, in _run_pipeline
        result = cls._run_pipeline_impl(settings, pipeline_config, description)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/pipeline_runner.py", line 308, in _run_pipeline_impl
        result = accuracy_pipeline(description)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 79, in __call__
        param_result = self._run(description=description)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 128, in _run
        output_list = self._infer_frames(description)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 250, in _infer_frames
        self.infer_stats_dict.update({'perfsim_time_ms': stats_dict['perfsim_time'] * constants.MILLI_CONST})
    TypeError: unsupported operand type(s) for *: 'NoneType' and 'float'
    TASKS TOTAL=1, NUM_RUNNING=0: 100%|██████████████████████████████████████████████████████████████████████████████████| 1/1 [00:23<00:00,  2.04s/it, postfix={'RUNNING': [], 'COMPLETED': ['cl-6170']}]
    WARNING: Benchmark - completed: 0/1
    TASKS TOTAL=1, NUM_RUNNING=0: 100%|██████████████████████████████████████████████████████████████████████████████████| 1/1 [00:24<00:00, 24.14s/it, postfix={'RUNNING': [], 'COMPLETED': ['cl-6170']}]
    INFO: packaging artifacts to /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250402-172810/regnet_x_800mf/compilation/AM62A/pkg please wait...
    SUCCESS:20250402-173508: finished packaging - /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250402-172810/regnet_x_800mf/compilation/AM62A/work/cl-6170
    Compiled model is at: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250402-172810/regnet_x_800mf/compilation/AM62A/pkg/20250402-172810_regnet_x_800mf_onnxrt_AM62A.tar.gz
    
    WARNING: ModelMaker - Compilation completed with errors.
    
    -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
    
    The error from the generated log file is below:
    
    
    /home/user/.pyenv/versions/py310/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:118: UserWarning: Specified provider 'TIDLCompilationProvider' is not in available provider names.Available providers: 'AzureExecutionProvider, CPUExecutionProvider'
      warnings.warn(
    *************** EP Error ***************
    EP Error Unknown Provider Type: TIDLCompilationProvider when using ['TIDLCompilationProvider', 'CPUExecutionProvider']
    Falling back to ['CPUExecutionProvider'] and retrying.

    Please help me resolve this issue.

    Best Regards,
    Sajan

  • >>>EP Error Unknown Provider Type: TIDLCompilationProvider when using ['TIDLCompilationProvider', 'CPUExecutionProvider']
    >>>Falling back to ['CPUExecutionProvider'] and retrying.

    This happens when the TIDL_TOOLS_PATH environment variable is not set correctly or when the tidl_tools have not been downloaded correctly.
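
    As a quick check (a sketch, assuming the same py310 environment used for compilation), you can print which execution providers the installed onnxruntime actually exposes - in your log the warning lists only AzureExecutionProvider and CPUExecutionProvider, so the TIDL providers are missing:

    # Sketch: list the execution providers exposed by the installed onnxruntime.
    # The TIDL providers must appear here for import/compilation to use them.
    import onnxruntime as ort

    print("onnxruntime version :", ort.__version__)
    print("available providers :", ort.get_available_providers())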

  • Hello Manu,

    >>>TIDL_TOOLS_PATH environment variable is not set correctly

    Could you please share the path I should export?
    I believe the tidl_tools were downloaded correctly, but how can I check that?

    In the run_modelmaker.sh file:

    export TIDL_TOOLS_PATH="../edgeai-benchmark/tools/tidl_tools_package/${TARGET_SOC}/tidl_tools"
    export LD_LIBRARY_PATH="${TIDL_TOOLS_PATH}:${LD_LIBRARY_PATH}"
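
    Would a simple check like the one below be enough to confirm the tools are in place? (Just a sketch - the path is copied from the export above, with TARGET_SOC assumed to be AM62A and the script run from the edgeai-modelmaker directory.)

    # Sketch: does the TIDL_TOOLS_PATH directory exist and contain anything?
    import os

    tidl_tools_path = "../edgeai-benchmark/tools/tidl_tools_package/AM62A/tidl_tools"
    print("exists  :", os.path.isdir(tidl_tools_path))
    if os.path.isdir(tidl_tools_path):
        print("contents:", os.listdir(tidl_tools_path))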

    Here is the terminal print.

    Number of AVX cores detected in PC: 12
    AVX compilation speedup in PC     : 1
    Target device                     : AM62A
    PYTHONPATH                        : .:
    TIDL_TOOLS_PATH                   : ../edgeai-benchmark/tools/tidl_tools_package/AM62A/tidl_tools
    LD_LIBRARY_PATH                   : ../edgeai-benchmark/tools/tidl_tools_package/AM62A/tidl_tools:
    argv: ['./scripts/run_modelmaker.py', 'config_classification.yaml', '--target_device', 'AM62A']
    ---------------------------------------------------------------------
    INFO: ModelMaker - task_type:classification model_name:regnet_x_800mf dataset_name:DMS run_name:20250403-123626/regnet_x_800mf
    - Model: regnet_x_800mf
    - TargetDevices & Estimated Inference Times (ms): {'TDA4VM': 2.95, 'AM62A': 5.95, 'AM67A': '5.95 (with 1/2 device capability)', 'AM68A': 2.92, 'AM69A': '2.85 (with 1/4th device capability)'}
    - This model can be compiled for the above device(s).
    ---------------------------------------------------------------------
    assuming the given download_url is a valid path: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/datasets/DMS
    INFO: ModelMaker - dataset split sizes {'train': 8799, 'val': 2253}
    INFO: ModelMaker - max_num_files is set to: [1000, 250]
    INFO: ModelMaker - dataset split sizes are limited to: {'train': 1000, 'val': 250}
    INFO: ModelMaker - dataset loading OK
    INFO: ModelMaker - run params is at: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-123626/regnet_x_800mf/run.yaml
    INFO: ModelMaker - running training - for detailed info see the log file: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-123626/regnet_x_800mf/training/run.log
    TASKS TOTAL=1, NUM_RUNNING=0: 100%|███████████████████████████████████████████████████████████████████████████| 1/1 [00:53<00:00, 53.28s/it, postfix={'RUNNING': [], 'COMPLETED': ['regnet_x_800mf']}]
    Trained model is at: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-123626/regnet_x_800mf/training
    
    SUCCESS: ModelMaker - Training completed.
    INFO: ModelMaker - running compilation - for detailed info see the log file: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-123626/regnet_x_800mf/compilation/AM62A/work/cl-6170/run.log
    
    INFO:20250403-123722: number of configs - 1
    TASKS TOTAL=1, NUM_RUNNING=1:   0%|                                                                                    | 0/1 [00:07<?, ?it/s, postfix={'RUNNING': ['cl-6170:infer'], 'COMPLETED': []}]
    ERROR:20250403-123729: model_id:cl-6170 run_import:False run_inference:True - unsupported operand type(s) for *: 'NoneType' and 'float'
    Traceback (most recent call last):
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/pipeline_runner.py", line 273, in _run_pipeline
        result = cls._run_pipeline_impl(settings, pipeline_config, description)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/pipeline_runner.py", line 308, in _run_pipeline_impl
        result = accuracy_pipeline(description)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 79, in __call__
        param_result = self._run(description=description)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 128, in _run
        output_list = self._infer_frames(description)
      File "/home/user/modelmaker/edgeai-tensorlab/edgeai-benchmark/edgeai_benchmark/pipelines/accuracy_pipeline.py", line 250, in _infer_frames
        self.infer_stats_dict.update({'perfsim_time_ms': stats_dict['perfsim_time'] * constants.MILLI_CONST})
    TypeError: unsupported operand type(s) for *: 'NoneType' and 'float'
    TASKS TOTAL=1, NUM_RUNNING=0: 100%|██████████████████████████████████████████████████████████████████████████████████| 1/1 [00:09<00:00,  2.00s/it, postfix={'RUNNING': [], 'COMPLETED': ['cl-6170']}]
    WARNING: Benchmark - completed: 0/1
    TASKS TOTAL=1, NUM_RUNNING=0: 100%|██████████████████████████████████████████████████████████████████████████████████| 1/1 [00:09<00:00,  9.38s/it, postfix={'RUNNING': [], 'COMPLETED': ['cl-6170']}]
    INFO: packaging artifacts to /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-123626/regnet_x_800mf/compilation/AM62A/pkg please wait...
    SUCCESS:20250403-123732: finished packaging - /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-123626/regnet_x_800mf/compilation/AM62A/work/cl-6170
    Compiled model is at: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-123626/regnet_x_800mf/compilation/AM62A/pkg/20250403-123626_regnet_x_800mf_onnxrt_AM62A.tar.gz
    
    WARNING: ModelMaker - Compilation completed with errors.


    From run.log:

    INFO:20250403-103802: starting - cl-6170
    
    INFO:20250403-103802: running - cl-6170
    
    INFO:20250403-103802: pipeline_config - {'task_type': 'classification', 'dataset_category': 'imagenet', 'calibration_dataset': <edgeai_benchmark.datasets.modelmaker_datasets.ModelMakerClassificationDataset object at 0x7ceecba5ebc0>, 'input_dataset': <edgeai_benchmark.datasets.modelmaker_datasets.ModelMakerClassificationDataset object at 0x7ceecbe1ffd0>, 'postprocess': <edgeai_benchmark.postprocess.PostProcessTransforms object at 0x7cee60251c90>, 'preprocess': <edgeai_benchmark.preprocess.PreProcessTransforms object at 0x7cee5fb68bb0>, 'session': <edgeai_benchmark.sessions.onnxrt_session.ONNXRTSession object at 0x7cee5fb68c40>, 'model_info': {'metric_reference': {'accuracy_top1%': None}, 'model_shortlist': 20, 'compact_name': 'regNetX-800mf-tv', 'shortlisted': True}, 'metric': {'label_offset_pred': 1}}
    
    INFO:20250403-103802: import  - cl-6170 - this may take some time...
    INFO:20250403-103802: model_path - /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-103708/regnet_x_800mf/training/model.onnx
    INFO:20250403-103802: model_file - /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-103708/regnet_x_800mf/compilation/AM62A/work/cl-6170/model/model.onnx
    INFO:20250403-103802: quant_file - /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-103708/regnet_x_800mf/compilation/AM62A/work/cl-6170/model/model_qparams.prototxt
    Downloading 1/1: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-103708/regnet_x_800mf/training/model.onnx
    Download done for /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-103708/regnet_x_800mf/training/model.onnx
    Downloading 1/1: /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-103708/regnet_x_800mf/training/model.onnx
    Download done for /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-103708/regnet_x_800mf/training/model.onnx
    Converted model is valid!
    /home/user/.pyenv/versions/py310/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:118: UserWarning: Specified provider 'TIDLCompilationProvider' is not in available provider names.Available providers: 'AzureExecutionProvider, CPUExecutionProvider'
      warnings.warn(
    *************** EP Error ***************
    EP Error Unknown Provider Type: TIDLCompilationProvider when using ['TIDLCompilationProvider', 'CPUExecutionProvider']
    Falling back to ['CPUExecutionProvider'] and retrying.
    ****************************************
    
    INFO:20250403-103803: import completed  - cl-6170 - 0 sec
    
    
    SUCCESS:20250403-103803: benchmark results - {}
    
    
    INFO:20250403-103805: starting - cl-6170
    
    INFO:20250403-103805: running - cl-6170
    
    INFO:20250403-103805: pipeline_config - {'task_type': 'classification', 'dataset_category': 'imagenet', 'calibration_dataset': <edgeai_benchmark.datasets.modelmaker_datasets.ModelMakerClassificationDataset object at 0x7ceecba5ebc0>, 'input_dataset': <edgeai_benchmark.datasets.modelmaker_datasets.ModelMakerClassificationDataset object at 0x7ceecbe1ffd0>, 'postprocess': <edgeai_benchmark.postprocess.PostProcessTransforms object at 0x7cee60251c90>, 'preprocess': <edgeai_benchmark.preprocess.PreProcessTransforms object at 0x7cee5fb68bb0>, 'session': <edgeai_benchmark.sessions.onnxrt_session.ONNXRTSession object at 0x7cee5fb68c40>, 'model_info': {'metric_reference': {'accuracy_top1%': None}, 'model_shortlist': 20, 'compact_name': 'regNetX-800mf-tv', 'shortlisted': True}, 'metric': {'label_offset_pred': 1}}
    
    INFO:20250403-103805: infer  - cl-6170 - this may take some time...
    INFO:20250403-103805: model_path - /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-103708/regnet_x_800mf/training/model.onnx
    INFO:20250403-103805: model_file - /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-103708/regnet_x_800mf/compilation/AM62A/work/cl-6170/model/model.onnx
    INFO:20250403-103805: quant_file - /home/user/modelmaker/edgeai-tensorlab/edgeai-modelmaker/data/projects/DMS/run/20250403-103708/regnet_x_800mf/compilation/AM62A/work/cl-6170/model/model_qparams.prototxt
    /home/user/.pyenv/versions/py310/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:118: UserWarning: Specified provider 'TIDLExecutionProvider' is not in available provider names.Available providers: 'AzureExecutionProvider, CPUExecutionProvider'
      warnings.warn(
    *************** EP Error ***************
    EP Error Unknown Provider Type: TIDLExecutionProvider when using ['TIDLExecutionProvider', 'CPUExecutionProvider']
    Falling back to ['CPUExecutionProvider'] and retrying.
    ****************************************
    
    infer : cl-6170                                             |   0%|          || 0/250 [00:00<?, ?it/s]
    infer : cl-6170                                             |          |     0% 0/250| [< ]
    infer : cl-6170                                             | 100%|██████████|| 250/250 [00:04<00:00, 56.06it/s]
    
    WARNING: ModelMaker - Compilation completed with errors.

    Best Regards,
    Sajan