Tool/software:
Hello Engineer,
I am trying to compile a model I trained with PyTorch; the training produced a 6.1 MB binary .onnx file. I found the resources for compiling the model, but they did not give me enough clarity.
Steps I followed:
1. Split the dataset into train and val sets.
2. Wrote the training code that the dataset is fed into.
3. Wrote the code to export the .pth to .onnx (a simplified sketch of steps 1-3 is included after these steps).
4. Cloned edgeai-tidl-tools.
5. Checked out the branch 10_01_04_00 (using SDK 10.01).
6. export SOC=am62a
7. source ./setup.sh
8. Opened a new terminal tab and ran: source ./setup_env.sh $am62a
9. Created a directory named models and added the .onnx file there.
10. Added the following entry to the model config dictionary in examples/osrt_python/model_configs.py:
"cl-ort-my_model": create_model_config( task_type="classification", source=dict( model_path="../../models/my_model.onnx", ), preprocess=dict( resize=256, crop=224, data_layout="NCHW", resize_with_pad=False, reverse_channels=False, ), session=dict( session_name="onnxrt", model_path=os.path.join(models_base_path, "my_model.onnx"), input_mean=[0.0, 0.0, 0.0], input_scale=[1.0, 1.0, 1.0], input_optimization=True, input_details={"input": [1, 3, 224, 224]}, output_details={"output": [1, 6]}, ), extra_info=dict( num_images=numImages, num_classes=6, ) ),
and models_base_path = '../../../models/'
11. Ran: python3 onnxrt_ep.py -c -m cl-ort-my_model (from examples/osrt_python/ort/).
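For reference, here is a simplified sketch of steps 1-3 (the dataset path, backbone, and hyper-parameters are placeholders, not my exact script). The export uses a fixed 1x3x224x224 input named "input" and an output named "output", which is where the input_details/output_details in the config above come from.

# Sketch of steps 1-3; dataset path, backbone, and hyper-parameters are placeholders.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, random_split
from torchvision import datasets, models, transforms

# 1. Split the dataset into train and val
tfm = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])
full_ds = datasets.ImageFolder("data/", transform=tfm)   # placeholder dataset path
train_len = int(0.8 * len(full_ds))
train_ds, val_ds = random_split(full_ds, [train_len, len(full_ds) - train_len])
train_loader = DataLoader(train_ds, batch_size=32, shuffle=True)

# 2. Train a 6-class classifier (abridged)
model = models.mobilenet_v2(num_classes=6)                # placeholder backbone
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
model.train()
for epoch in range(10):
    for images, labels in train_loader:
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()
torch.save(model.state_dict(), "my_model.pth")

# 3. Export the .pth to .onnx with a fixed 1x3x224x224 input
model.eval()
dummy = torch.randn(1, 3, 224, 224)
torch.onnx.export(
    model,
    dummy,
    "my_model.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=11,
)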
Error is below:
Available execution providers : ['AzureExecutionProvider', 'CPUExecutionProvider']

Running 1 Models - ['cl-ort-my_model']

Running_Model : cl-ort-my_model

Running shape inference on model ../../../models/my_model.onnx

/home/user/.local/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:121: UserWarning: Specified provider 'TIDLCompilationProvider' is not in available provider names.Available providers: 'AzureExecutionProvider, CPUExecutionProvider'
  warnings.warn(

*************** EP Error ***************
EP Error Unknown Provider Type: TIDLCompilationProvider when using ['TIDLCompilationProvider', 'CPUExecutionProvider']
Falling back to ['CPUExecutionProvider'] and retrying.
****************************************

Process Process-1:
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/user/edgeai-tidl-tools/examples/osrt_python/ort/onnxrt_ep.py", line 388, in run_model
    for j in range(batch):
TypeError: 'str' object cannot be interpreted as an integer
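For reference, a minimal check run in the same shell (standard onnxruntime Python API) shows which onnxruntime build and providers are actually being picked up; as I understand it, TIDLCompilationProvider should only appear when the TIDL-enabled onnxruntime wheel installed by setup.sh is the one being imported:

import os
import onnxruntime as ort

# Print the onnxruntime build, its execution providers, and the TIDL tools path (if set).
print("onnxruntime version:", ort.__version__)
print("available providers:", ort.get_available_providers())
print("TIDL_TOOLS_PATH    :", os.environ.get("TIDL_TOOLS_PATH"))

In my case this shows only the Azure and CPU providers, matching the warning above.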
Can you please check and verify the steps I followed and help me fix the error? Is there anything else I need to modify to compile the model?
Warm Regards,
Sajan