
TDA4VM: Edge-AI: Instance Segmentation Custom

Part Number: TDA4VM

***** WARNING : tensor_bits = 32 -- Compiling for floating point - target execution is not supported for 32 bit compilation !! ***** 
tidl_tools_path                                 = /home/root/tidl_tools 
artifacts_folder                                = ../../../model-artifacts//yolact/ 
tidl_tensor_bits                                = 32 
debug_level                                     = 4 
num_tidl_subgraphs                              = 16 
tidl_denylist                                   = 
tidl_denylist_layer_name                        = 
tidl_denylist_layer_type                         = 
tidl_allowlist_layer_name                        = 
model_type                                      =  
tidl_calibration_accuracy_level                 = 64 
tidl_calibration_options:num_frames_calibration = 2 
tidl_calibration_options:bias_calibration_iterations = 3 
mixed_precision_factor = -1.000000 
model_group_id = 0 
power_of_2_quantization                         = 2 
ONNX QDQ Enabled                                = 0 
enable_high_resolution_optimization             = 0 
pre_batchnorm_fold                              = 1 
add_data_convert_ops                            = 0 
output_feature_16bit_names_list                 =  
m_params_16bit_names_list                       =  
m_single_core_layers_names_list                    =  
reserved_compile_constraints_flag               = 1601 
ti_internal_reserved_1                          = 


 ****** WARNING : Network not identified as Object Detection network : (1) Ignore if network is not Object Detection network (2) If network is Object Detection network, please specify "model_type":"OD" as part of OSRT compilation options******

Supported TIDL layer type ---            Conv -- Conv_0 
Supported TIDL layer type ---            Relu -- Relu_1 
Supported TIDL layer type ---         MaxPool -- MaxPool_2 
Supported TIDL layer type ---            Conv -- Conv_8 
Supported TIDL layer type ---            Conv -- Conv_3 
Supported TIDL layer type ---            Relu -- Relu_4 
Supported TIDL layer type ---            Conv -- Conv_5 
Supported TIDL layer type ---            Relu -- Relu_6 
Supported TIDL layer type ---            Conv -- Conv_7 
Supported TIDL layer type ---             Add -- Add_9 
Supported TIDL layer type ---            Relu -- Relu_10 
Supported TIDL layer type ---            Conv -- Conv_11 
Supported TIDL layer type ---            Relu -- Relu_12 
Supported TIDL layer type ---            Conv -- Conv_13 
Supported TIDL layer type ---            Relu -- Relu_14 
Supported TIDL layer type ---            Conv -- Conv_15 
Supported TIDL layer type ---             Add -- Add_16 
Supported TIDL layer type ---            Relu -- Relu_17 
Supported TIDL layer type ---            Conv -- Conv_18 
Supported TIDL layer type ---            Relu -- Relu_19 
Supported TIDL layer type ---            Conv -- Conv_20 
Supported TIDL layer type ---            Relu -- Relu_21 
Supported TIDL layer type ---            Conv -- Conv_22 
Supported TIDL layer type ---             Add -- Add_23 
Supported TIDL layer type ---            Relu -- Relu_24 
Supported TIDL layer type ---            Conv -- Conv_30 
Supported TIDL layer type ---            Conv -- Conv_25 
Supported TIDL layer type ---            Relu -- Relu_26 
Supported TIDL layer type ---            Conv -- Conv_27 
Supported TIDL layer type ---            Relu -- Relu_28 
Supported TIDL layer type ---            Conv -- Conv_29 
Supported TIDL layer type ---             Add -- Add_31 
Supported TIDL layer type ---            Relu -- Relu_32 
Supported TIDL layer type ---            Conv -- Conv_33 
Supported TIDL layer type ---            Relu -- Relu_34 
Supported TIDL layer type ---            Conv -- Conv_35 
Supported TIDL layer type ---            Relu -- Relu_36 
Supported TIDL layer type ---            Conv -- Conv_37 
Supported TIDL layer type ---             Add -- Add_38 
Supported TIDL layer type ---            Relu -- Relu_39 
Supported TIDL layer type ---            Conv -- Conv_40 
Supported TIDL layer type ---            Relu -- Relu_41 
Supported TIDL layer type ---            Conv -- Conv_42 
Supported TIDL layer type ---            Relu -- Relu_43 
Supported TIDL layer type ---            Conv -- Conv_44 
Supported TIDL layer type ---             Add -- Add_45 
Supported TIDL layer type ---            Relu -- Relu_46 
Supported TIDL layer type ---            Conv -- Conv_47 
Supported TIDL layer type ---            Relu -- Relu_48 
Supported TIDL layer type ---            Conv -- Conv_49 
Supported TIDL layer type ---            Relu -- Relu_50 
Supported TIDL layer type ---            Conv -- Conv_51 
Supported TIDL layer type ---             Add -- Add_52 
Supported TIDL layer type ---            Relu -- Relu_53 
Supported TIDL layer type ---            Conv -- Conv_59 
Supported TIDL layer type ---            Conv -- Conv_54 
Supported TIDL layer type ---            Relu -- Relu_55 
Supported TIDL layer type ---            Conv -- Conv_56 
Supported TIDL layer type ---            Relu -- Relu_57 
Supported TIDL layer type ---            Conv -- Conv_58 
Supported TIDL layer type ---             Add -- Add_60 
Supported TIDL layer type ---            Relu -- Relu_61 
Supported TIDL layer type ---            Conv -- Conv_62 
Supported TIDL layer type ---            Relu -- Relu_63 
Supported TIDL layer type ---            Conv -- Conv_64 
Supported TIDL layer type ---            Relu -- Relu_65 
Supported TIDL layer type ---            Conv -- Conv_66 
Supported TIDL layer type ---             Add -- Add_67 
Supported TIDL layer type ---            Relu -- Relu_68 
Supported TIDL layer type ---            Conv -- Conv_69 
Supported TIDL layer type ---            Relu -- Relu_70 
Supported TIDL layer type ---            Conv -- Conv_71 
Supported TIDL layer type ---            Relu -- Relu_72 
Supported TIDL layer type ---            Conv -- Conv_73 
Supported TIDL layer type ---             Add -- Add_74 
Supported TIDL layer type ---            Relu -- Relu_75 
Supported TIDL layer type ---            Conv -- Conv_76 
Supported TIDL layer type ---            Relu -- Relu_77 
Supported TIDL layer type ---            Conv -- Conv_78 
Supported TIDL layer type ---            Relu -- Relu_79 
Supported TIDL layer type ---            Conv -- Conv_80 
Supported TIDL layer type ---             Add -- Add_81 
Supported TIDL layer type ---            Relu -- Relu_82 
Supported TIDL layer type ---            Conv -- Conv_83 
Supported TIDL layer type ---            Relu -- Relu_84 
Supported TIDL layer type ---            Conv -- Conv_85 
Supported TIDL layer type ---            Relu -- Relu_86 
Supported TIDL layer type ---            Conv -- Conv_87 
Supported TIDL layer type ---             Add -- Add_88 
Supported TIDL layer type ---            Relu -- Relu_89 
Supported TIDL layer type ---            Conv -- Conv_90 
Supported TIDL layer type ---            Relu -- Relu_91 
Supported TIDL layer type ---            Conv -- Conv_92 
Supported TIDL layer type ---            Relu -- Relu_93 
Supported TIDL layer type ---            Conv -- Conv_94 
Supported TIDL layer type ---             Add -- Add_95 
Supported TIDL layer type ---            Relu -- Relu_96 
Supported TIDL layer type ---            Conv -- Conv_102 
Supported TIDL layer type ---            Conv -- Conv_97 
Supported TIDL layer type ---            Relu -- Relu_98 
Supported TIDL layer type ---            Conv -- Conv_99 
Supported TIDL layer type ---            Relu -- Relu_100 
Supported TIDL layer type ---            Conv -- Conv_101 
Supported TIDL layer type ---             Add -- Add_103 
Supported TIDL layer type ---            Relu -- Relu_104 
Supported TIDL layer type ---            Conv -- Conv_105 
Supported TIDL layer type ---            Relu -- Relu_106 
Supported TIDL layer type ---            Conv -- Conv_107 
Supported TIDL layer type ---            Relu -- Relu_108 
Supported TIDL layer type ---            Conv -- Conv_109 
Supported TIDL layer type ---             Add -- Add_110 
Supported TIDL layer type ---            Relu -- Relu_111 
Supported TIDL layer type ---            Conv -- Conv_112 
Supported TIDL layer type ---            Relu -- Relu_113 
Supported TIDL layer type ---            Conv -- Conv_114 
Supported TIDL layer type ---            Relu -- Relu_115 
Supported TIDL layer type ---            Conv -- Conv_116 
Supported TIDL layer type ---             Add -- Add_117 
Supported TIDL layer type ---            Relu -- Relu_118 
Supported TIDL layer type ---            Conv -- Conv_119 
Supported TIDL layer type ---            Conv -- Conv_144 
Supported TIDL layer type ---            Relu -- Relu_145 
Supported TIDL layer type ---            Conv -- Conv_150 
Supported TIDL layer type ---            Conv -- Conv_151 
Supported TIDL layer type ---            Conv -- Conv_274 
Supported TIDL layer type ---            Relu -- Relu_275 
Supported TIDL layer type ---            Conv -- Conv_284 
Supported TIDL layer type ---       Transpose -- Transpose_285 
Supported TIDL layer type ---         Reshape -- Reshape_291 
Supported TIDL layer type ---            Conv -- Conv_247 
Supported TIDL layer type ---            Relu -- Relu_248 
Supported TIDL layer type ---            Conv -- Conv_257 
Supported TIDL layer type ---       Transpose -- Transpose_258 
Supported TIDL layer type ---         Reshape -- Reshape_264 
Supported TIDL layer type ---            Conv -- Conv_220 
Supported TIDL layer type ---            Relu -- Relu_221 
Supported TIDL layer type ---            Conv -- Conv_230 
Supported TIDL layer type ---       Transpose -- Transpose_231 
Supported TIDL layer type ---         Reshape -- Reshape_237 
Supported TIDL layer type ---            Conv -- Conv_131 
Supported TIDL layer type ---          Resize -- Resize_130 
Supported TIDL layer type ---             Add -- Add_132 
Supported TIDL layer type ---            Conv -- Conv_146 
Supported TIDL layer type ---            Relu -- Relu_147 
Supported TIDL layer type ---            Conv -- Conv_193 
Supported TIDL layer type ---            Relu -- Relu_194 
Supported TIDL layer type ---            Conv -- Conv_203 
Supported TIDL layer type ---       Transpose -- Transpose_204 
Supported TIDL layer type ---         Reshape -- Reshape_210 
Supported TIDL layer type ---            Conv -- Conv_142 
Supported TIDL layer type ---          Resize -- Resize_141 
Supported TIDL layer type ---             Add -- Add_143 
Supported TIDL layer type ---            Conv -- Conv_148 
Supported TIDL layer type ---            Relu -- Relu_149 
Supported TIDL layer type ---            Conv -- Conv_166 
Supported TIDL layer type ---            Relu -- Relu_167 
Supported TIDL layer type ---            Conv -- Conv_176 
Supported TIDL layer type ---       Transpose -- Transpose_177 
Supported TIDL layer type ---         Reshape -- Reshape_183 
Supported TIDL layer type ---          Concat -- Concat_302 
Supported TIDL layer type ---         Softmax -- Softmax_305 
Supported TIDL layer type ---            Conv -- Conv_292 
Supported TIDL layer type ---       Transpose -- Transpose_293 
Supported TIDL layer type ---         Reshape -- Reshape_299 
Supported TIDL layer type ---            Tanh -- Tanh_300 
Supported TIDL layer type ---            Conv -- Conv_265 
Supported TIDL layer type ---       Transpose -- Transpose_266 
Supported TIDL layer type ---         Reshape -- Reshape_272 
Supported TIDL layer type ---            Tanh -- Tanh_273 
Supported TIDL layer type ---            Conv -- Conv_238 
Supported TIDL layer type ---       Transpose -- Transpose_239 
Supported TIDL layer type ---         Reshape -- Reshape_245 
Supported TIDL layer type ---            Tanh -- Tanh_246 
Supported TIDL layer type ---            Conv -- Conv_211 
Supported TIDL layer type ---       Transpose -- Transpose_212 
Supported TIDL layer type ---         Reshape -- Reshape_218 
Supported TIDL layer type ---            Tanh -- Tanh_219 
Supported TIDL layer type ---            Conv -- Conv_184 
Supported TIDL layer type ---       Transpose -- Transpose_185 
Supported TIDL layer type ---         Reshape -- Reshape_191 
Supported TIDL layer type ---            Tanh -- Tanh_192 
Supported TIDL layer type ---          Concat -- Concat_303 
Supported TIDL layer type ---            Conv -- Conv_276 
Supported TIDL layer type ---       Transpose -- Transpose_277 
Supported TIDL layer type ---         Reshape -- Reshape_283 
Supported TIDL layer type ---            Conv -- Conv_249 
Supported TIDL layer type ---       Transpose -- Transpose_250 
Supported TIDL layer type ---         Reshape -- Reshape_256 
Supported TIDL layer type ---            Conv -- Conv_222 
Supported TIDL layer type ---       Transpose -- Transpose_223 
Supported TIDL layer type ---         Reshape -- Reshape_229 
Supported TIDL layer type ---            Conv -- Conv_195 
Supported TIDL layer type ---       Transpose -- Transpose_196 
Supported TIDL layer type ---         Reshape -- Reshape_202 
Supported TIDL layer type ---            Conv -- Conv_168 
Supported TIDL layer type ---       Transpose -- Transpose_169 
Supported TIDL layer type ---         Reshape -- Reshape_175 
Supported TIDL layer type ---          Concat -- Concat_301 
Supported TIDL layer type ---            Conv -- Conv_152 
Supported TIDL layer type ---            Relu -- Relu_153 
Supported TIDL layer type ---            Conv -- Conv_154 
Supported TIDL layer type ---            Relu -- Relu_155 
Supported TIDL layer type ---            Conv -- Conv_156 
Supported TIDL layer type ---            Relu -- Relu_157 
Supported TIDL layer type ---          Resize -- Resize_159 
Supported TIDL layer type ---            Relu -- Relu_160 
Supported TIDL layer type ---            Conv -- Conv_161 
Supported TIDL layer type ---            Relu -- Relu_162 
Supported TIDL layer type ---            Conv -- Conv_163 
Supported TIDL layer type ---            Relu -- Relu_164 
Supported TIDL layer type ---       Transpose -- Transpose_165 

Preliminary subgraphs created = 1 
Final number of subgraphs created are : 1, - Offloaded Nodes - 211, Total Nodes - 211 
SUGGESTION -- [TIDL_ResizeLayer]  Resize Layer with non-symmetric resize ratio across width and height is not optimal.  
SUGGESTION -- [TIDL_ResizeLayer]  Resize Layer with non-symmetric resize ratio across width and height is not optimal.  
INFORMATION -- [TIDL_ResizeLayer]  Any resize ratio which is power of 2 and greater than 4 will be placed by combination of 4x4 resize layer and 2x2 resize layer. For example a 8x8 resize will be replaced by 4x4 resize followed by 2x2 resize.  
Running runtimes graphviz - /home/root/tidl_tools/tidl_graphVisualiser_runtimes.out ../../../model-artifacts//yolact//allowedNode.txt ../../../model-artifacts//yolact//tempDir/graphvizInfo.txt ../../../model-artifacts//yolact//tempDir/runtimes_visualization.svg 
*** In TIDL_createStateImportFunc *** 
Compute on node : TIDLExecutionProvider_TIDL_0_0
  0,            Conv, 3, 1, input.1, 781
  1,            Relu, 1, 1, 781, 357
  2,         MaxPool, 1, 1, 357, 358
  3,            Conv, 3, 1, 358, 784
  4,            Relu, 1, 1, 784, 361
  5,            Conv, 3, 1, 361, 787
  6,            Relu, 1, 1, 787, 364
  7,            Conv, 3, 1, 364, 790
  8,            Conv, 3, 1, 358, 793
  9,             Add, 2, 1, 790, 369
 10,            Relu, 1, 1, 369, 370
 11,            Conv, 3, 1, 370, 796
 12,            Relu, 1, 1, 796, 373
 13,            Conv, 3, 1, 373, 799
 14,            Relu, 1, 1, 799, 376
 15,            Conv, 3, 1, 376, 802
 16,             Add, 2, 1, 802, 379
 17,            Relu, 1, 1, 379, 380
 18,            Conv, 3, 1, 380, 805
 19,            Relu, 1, 1, 805, 383
 20,            Conv, 3, 1, 383, 808
 21,            Relu, 1, 1, 808, 386
 22,            Conv, 3, 1, 386, 811
 23,             Add, 2, 1, 811, 389
 24,            Relu, 1, 1, 389, 390
 25,            Conv, 3, 1, 390, 814
 26,            Relu, 1, 1, 814, 393
 27,            Conv, 3, 1, 393, 817
 28,            Relu, 1, 1, 817, 396
 29,            Conv, 3, 1, 396, 820
 30,            Conv, 3, 1, 390, 823
 31,             Add, 2, 1, 820, 401
 32,            Relu, 1, 1, 401, 402
 33,            Conv, 3, 1, 402, 826
 34,            Relu, 1, 1, 826, 405
 35,            Conv, 3, 1, 405, 829
 36,            Relu, 1, 1, 829, 408
 37,            Conv, 3, 1, 408, 832
 38,             Add, 2, 1, 832, 411
 39,            Relu, 1, 1, 411, 412
 40,            Conv, 3, 1, 412, 835
 41,            Relu, 1, 1, 835, 415
 42,            Conv, 3, 1, 415, 838
 43,            Relu, 1, 1, 838, 418
 44,            Conv, 3, 1, 418, 841
 45,             Add, 2, 1, 841, 421
 46,            Relu, 1, 1, 421, 422
 47,            Conv, 3, 1, 422, 844
 48,            Relu, 1, 1, 844, 425
 49,            Conv, 3, 1, 425, 847
 50,            Relu, 1, 1, 847, 428
 51,            Conv, 3, 1, 428, 850
 52,             Add, 2, 1, 850, 431
 53,            Relu, 1, 1, 431, 432
 54,            Conv, 3, 1, 432, 853
 55,            Relu, 1, 1, 853, 435
 56,            Conv, 3, 1, 435, 856
 57,            Relu, 1, 1, 856, 438
 58,            Conv, 3, 1, 438, 859
 59,            Conv, 3, 1, 432, 862
 60,             Add, 2, 1, 859, 443
 61,            Relu, 1, 1, 443, 444
 62,            Conv, 3, 1, 444, 865
 63,            Relu, 1, 1, 865, 447
 64,            Conv, 3, 1, 447, 868
 65,            Relu, 1, 1, 868, 450
 66,            Conv, 3, 1, 450, 871
 67,             Add, 2, 1, 871, 453
 68,            Relu, 1, 1, 453, 454
 69,            Conv, 3, 1, 454, 874
 70,            Relu, 1, 1, 874, 457
 71,            Conv, 3, 1, 457, 877
 72,            Relu, 1, 1, 877, 460
 73,            Conv, 3, 1, 460, 880
 74,             Add, 2, 1, 880, 463
 75,            Relu, 1, 1, 463, 464
 76,            Conv, 3, 1, 464, 883
 77,            Relu, 1, 1, 883, 467
 78,            Conv, 3, 1, 467, 886
 79,            Relu, 1, 1, 886, 470
 80,            Conv, 3, 1, 470, 889
 81,             Add, 2, 1, 889, 473
 82,            Relu, 1, 1, 473, 474
 83,            Conv, 3, 1, 474, 892
 84,            Relu, 1, 1, 892, 477
 85,            Conv, 3, 1, 477, 895
 86,            Relu, 1, 1, 895, 480
 87,            Conv, 3, 1, 480, 898
 88,             Add, 2, 1, 898, 483
 89,            Relu, 1, 1, 483, 484
 90,            Conv, 3, 1, 484, 901
 91,            Relu, 1, 1, 901, 487
 92,            Conv, 3, 1, 487, 904
 93,            Relu, 1, 1, 904, 490
 94,            Conv, 3, 1, 490, 907
 95,             Add, 2, 1, 907, 493
 96,            Relu, 1, 1, 493, 494
 97,            Conv, 3, 1, 494, 910
 98,            Relu, 1, 1, 910, 497
 99,            Conv, 3, 1, 497, 913
100,            Relu, 1, 1, 913, 500
101,            Conv, 3, 1, 500, 916
102,            Conv, 3, 1, 494, 919
103,             Add, 2, 1, 916, 505
104,            Relu, 1, 1, 505, 506
105,            Conv, 3, 1, 506, 922
106,            Relu, 1, 1, 922, 509
107,            Conv, 3, 1, 509, 925
108,            Relu, 1, 1, 925, 512
109,            Conv, 3, 1, 512, 928
110,             Add, 2, 1, 928, 515
111,            Relu, 1, 1, 515, 516
112,            Conv, 3, 1, 516, 931
113,            Relu, 1, 1, 931, 519
114,            Conv, 3, 1, 519, 934
115,            Relu, 1, 1, 934, 522
116,            Conv, 3, 1, 522, 937
117,             Add, 2, 1, 937, 525
118,            Relu, 1, 1, 525, 526
119,            Conv, 3, 1, 526, 527
120,          Resize, 4, 1, 527, 540
121,            Conv, 3, 1, 494, 541
122,             Add, 2, 1, 540, 542
123,          Resize, 4, 1, 542, 553
124,            Conv, 3, 1, 432, 554
125,             Add, 2, 1, 553, 555
126,            Conv, 3, 1, 555, 560
127,            Relu, 1, 1, 560, 561
128,            Conv, 3, 1, 561, 564
129,            Relu, 1, 1, 564, 565
130,            Conv, 3, 1, 565, 566
131,            Relu, 1, 1, 566, 567
132,            Conv, 3, 1, 567, 568
133,            Relu, 1, 1, 568, 569
134,          Resize, 3, 1, 569, 574
135,            Relu, 1, 1, 574, 575
136,            Conv, 3, 1, 575, 576
137,            Relu, 1, 1, 576, 577
138,            Conv, 3, 1, 577, 578
139,            Relu, 1, 1, 578, 579
140,       Transpose, 1, 1, 579, 580
141,            Conv, 3, 1, 561, 581
142,            Relu, 1, 1, 581, 582
143,            Conv, 3, 1, 582, 583
144,       Transpose, 1, 1, 583, 584
145,         Reshape, 2, 1, 584, 594
146,            Conv, 3, 1, 542, 558
147,            Relu, 1, 1, 558, 559
148,            Conv, 3, 1, 559, 620
149,            Relu, 1, 1, 620, 621
150,            Conv, 3, 1, 621, 622
151,       Transpose, 1, 1, 622, 623
152,         Reshape, 2, 1, 623, 633
153,            Conv, 3, 1, 527, 556
154,            Relu, 1, 1, 556, 557
155,            Conv, 3, 1, 557, 659
156,            Relu, 1, 1, 659, 660
157,            Conv, 3, 1, 660, 661
158,       Transpose, 1, 1, 661, 662
159,         Reshape, 2, 1, 662, 672
160,            Conv, 3, 1, 557, 562
161,            Conv, 3, 1, 562, 698
162,            Relu, 1, 1, 698, 699
163,            Conv, 3, 1, 699, 700
164,       Transpose, 1, 1, 700, 701
165,         Reshape, 2, 1, 701, 711
166,            Conv, 3, 1, 562, 563
167,            Conv, 3, 1, 563, 737
168,            Relu, 1, 1, 737, 738
169,            Conv, 3, 1, 738, 739
170,       Transpose, 1, 1, 739, 740
171,         Reshape, 2, 1, 740, 750
172,          Concat, 5, 1, 594, 776
173,            Conv, 3, 1, 582, 607
174,       Transpose, 1, 1, 607, 608
175,         Reshape, 2, 1, 608, 618
176,            Tanh, 1, 1, 618, 619
177,            Conv, 3, 1, 621, 646
178,       Transpose, 1, 1, 646, 647
179,         Reshape, 2, 1, 647, 657
180,            Tanh, 1, 1, 657, 658
181,            Conv, 3, 1, 660, 685
182,       Transpose, 1, 1, 685, 686
183,         Reshape, 2, 1, 686, 696
184,            Tanh, 1, 1, 696, 697
185,            Conv, 3, 1, 699, 724
186,       Transpose, 1, 1, 724, 725
187,         Reshape, 2, 1, 725, 735
188,            Tanh, 1, 1, 735, 736
189,            Conv, 3, 1, 738, 763
190,       Transpose, 1, 1, 763, 764
191,         Reshape, 2, 1, 764, 774
192,            Tanh, 1, 1, 774, 775
193,          Concat, 5, 1, 619, 778
194,            Conv, 3, 1, 582, 595
195,       Transpose, 1, 1, 595, 596
196,         Reshape, 2, 1, 596, 606
197,            Conv, 3, 1, 621, 634
198,       Transpose, 1, 1, 634, 635
199,         Reshape, 2, 1, 635, 645
200,            Conv, 3, 1, 660, 673
201,       Transpose, 1, 1, 673, 674
202,         Reshape, 2, 1, 674, 684
203,            Conv, 3, 1, 699, 712
204,       Transpose, 1, 1, 712, 713
205,         Reshape, 2, 1, 713, 723
206,            Conv, 3, 1, 738, 751
207,       Transpose, 1, 1, 751, 752
208,         Reshape, 2, 1, 752, 762
209,          Concat, 5, 1, 606, 777
210,         Softmax, 1, 1, 777, 780

Input tensor name -  input.1 
Output tensor name - 780 
Output tensor name - 778 
Output tensor name - 776 
Output tensor name - 580 
 Graph Domain TO version : 11In TIDL_onnxRtImportInit subgraph_name=subgraph_0
Layer 0, subgraph id subgraph_0, name=780
Layer 1, subgraph id subgraph_0, name=778
Layer 2, subgraph id subgraph_0, name=776
Layer 3, subgraph id subgraph_0, name=580
Layer 4, subgraph id subgraph_0, name=input.1
In TIDL_runtimesOptimizeNet: LayerIndex = 216, dataIndex = 212 

 ************** Frame index 1 : Running float import ************* 
In TIDL_runtimesPostProcessNet 
In TIDL_runtimesPostProcessNet 1
In TIDL_runtimesPostProcessNet 2
In TIDL_runtimesPostProcessNet 3
****************************************************
**   All the Tensor Dimensions has to be greater then Zero 
**   DIM Error - For Tensor 72, Dim 4 is 0
****************************************************
SUGGESTION: [TIDL_ResizeLayer] Resize_130 Resize Layer with non-symmetric resize ratio across width and height is not optimal.
SUGGESTION: [TIDL_ResizeLayer] Resize_141 Resize Layer with non-symmetric resize ratio across width and height is not optimal.
INFORMATION: [TIDL_ResizeLayer] Resize_159 Any resize ratio which is power of 2 and greater than 4 will be placed by combination of 4x4 resize layer and 2x2 resize layer. For example a 8x8 resize will be replaced by 4x4 resize followed by 2x2 resize.
****************************************************
**          3 WARNINGS          0 ERRORS          **
****************************************************
In TIDL_runtimesPostProcessNet 4
************ in TIDL_subgraphRtMultiple GPUs detected! Turning off JIT.
Available execution providers :  ['TIDLExecutionProvider', 'TIDLCompilationProvider', 'CPUExecutionProvider']

Running 1 Models - ['yolact']


Running_Model :  yolact  


Running shape inference on model /home/root/zzz/model_onnx/yolact.onnx 


***************Running_Inference Section **********

This is Lucid Model for image ../../../test_data/airshow.jpg
(1, 1, 1, 1, 19248, 4)
(1, 1, 1, 1, 19248, 81)
(1, 1, 1, 1, 19248, 32)
(19248, 4)
()
<zzz.layers.functions.detection_tidl.Detect object at 0x7054515375b0>
Create ************ 
 The soft limit is 2048
The hard limit is 2048
MEM: Init ... !!!
MEM: Init ... Done !!!
 0.0s:  VX_ZONE_INIT:Enabled
 0.9s:  VX_ZONE_ERROR:Enabled
 0.11s:  VX_ZONE_WARNING:Enabled
 0.4466s:  VX_ZONE_INIT:[tivxInit:185] Initialization Done !!!
************ TIDL_subgraphRtCreate done ************ 
 Warning : Couldn't find corresponding ioBuf tensor for onnx tensor with matching name 

**********  Frame Index 1 : Running float inference **********
************ in TIDL_subgraphRtDelete ************ 
 MEM: Deinit ... !!!
MEM: Alloc's: 29 alloc's of 820636057 bytes 
MEM: Free's : 29 free's  of 820636057 bytes 
MEM: Open's : 0 allocs  of 0 bytes 
MEM: Deinit ... Done !!!

Tool/software:

yolact_edgeai_info.zip

Setting:

EdgeAI SDK : 09_02_07_00

AM68PA/J721E (TDA4VM)

Hello Champs,

I am trying to run a custom model on the TI platform, but the output results are not as expected. The model executes in delegate mode and produces results, but with the compiled model the output dimensions are completely wrong. I have shared the logs for both runs, along with the model and the output.

Request your support to resolve this issue.
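For reference, the compiled run returns the detection tensors in a padded 6-D layout (e.g. `(1, 1, 1, 1, 19248, 4)`, as in the log above), while the post-processing expects the 2-D shapes such as `(19248, 4)`. A minimal NumPy sketch of the reshape I would expect to recover the original layout (the array here is dummy data, not the real model output):

```python
import numpy as np

# Dummy tensor with the padded 6-D shape seen in the compiled-model log
boxes = np.zeros((1, 1, 1, 1, 19248, 4), dtype=np.float32)

# Collapse the leading singleton dims to recover the (19248, 4) layout
# that the YOLACT post-processing expects
boxes_2d = boxes.reshape(boxes.shape[-2], boxes.shape[-1])

print(boxes_2d.shape)  # (19248, 4)
```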

  • Hi

    with compilation model the output dimension is completely wrong

    Please try the model compilation with the latest SDK 9.2.9.0 tidl_tools.

    Additionally, set `export TIDL_RT_ONNX_VARDIM=1` and then do model inference.
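    For example, a sketch of the suggested sequence (script and paths taken from this thread's logs):

```shell
# Enable TIDL's variable-dimension handling for ONNX tensors
export TIDL_RT_ONNX_VARDIM=1

# Then re-run compilation and inference (paths as used in this thread):
#   cd /home/root/examples/osrt_python/ort
#   python3 onnxrt_yolact.py -c    # compile the model artifacts
#   python3 onnxrt_yolact.py       # run inference with the compiled artifacts
```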

  • Hello Pratik

    Thanks for your response

    The issue still persists with the solution provided above.

    Please find the attached logs:

    root@ff2ad4988eca:/home/root/examples/osrt_python/ort# python3 onnxrt_yolact.py -c
    Multiple GPUs detected! Turning off JIT.
    Available execution providers :  ['TIDLExecutionProvider', 'TIDLCompilationProvider', 'CPUExecutionProvider']
    
    Running 1 Models - ['yolact']
    
    
    Running_Model :  yolact  
    
    
    Running shape inference on model /home/root/zzz/model_onnx/yolact_org.onnx 
    
    
    ***** WARNING : tensor_bits = 32 -- Compiling for floating point - target execution is not supported for 32 bit compilation !! ***** 
    
    Preliminary subgraphs created = 1 
    Final number of subgraphs created are : 1, - Offloaded Nodes - 211, Total Nodes - 211 
    
    ***************Running_Inference Section **********
    
    This is Lucid Model for image ../../../test_data/airshow.jpg
     Graph Domain TO version : 11
     ************** Frame index 1 : Running float import ************* 
    ****************************************************
    **   All the Tensor Dimensions has to be greater then Zero 
    **   DIM Error - For Tensor 72, Dim 4 is 0
    ****************************************************
    SUGGESTION: [TIDL_ResizeLayer] Resize_130 Resize Layer with non-symmetric resize ratio across width and height is not optimal.
    SUGGESTION: [TIDL_ResizeLayer] Resize_141 Resize Layer with non-symmetric resize ratio across width and height is not optimal.
    INFORMATION: [TIDL_ResizeLayer] Resize_159 Any resize ratio which is power of 2 and greater than 4 will be placed by combination of 4x4 resize layer and 2x2 resize layer. For example a 8x8 resize will be replaced by 4x4 resize followed by 2x2 resize.
    ****************************************************
    **          3 WARNINGS          0 ERRORS          **
    ****************************************************
    The soft limit is 2048
    The hard limit is 2048
    MEM: Init ... !!!
    MEM: Init ... Done !!!
     0.0s:  VX_ZONE_INIT:Enabled
     0.8s:  VX_ZONE_ERROR:Enabled
     0.9s:  VX_ZONE_WARNING:Enabled
     0.2233s:  VX_ZONE_INIT:[tivxInit:190] Initialization Done !!!
    Warning : Couldn't find corresponding ioBuf tensor for onnx tensor with matching name 
    
    **********  Frame Index 1 : Running float inference **********
    5
    (1, 1, 1, 1, 19248, 4)
    (1, 1, 1, 1, 19248, 81)
    (1, 1, 1, 1, 19248, 32)
    (19248, 4)
    ()
    
    ***************Running_Benchmark_Section **********
    
    
    ***************Running_Inference Section **********
    
    This is Lucid Model for image ../../../test_data/ADE_val_00001801.jpg
     Graph Domain TO version : 11Warning : Couldn't find corresponding ioBuf tensor for onnx tensor with matching name 
    
     ************ Frame index 2 : Running float inference **************** 
    5
    (1, 1, 1, 1, 19248, 4)
    (1, 1, 1, 1, 19248, 81)
    (1, 1, 1, 1, 19248, 32)
    (19248, 4)
    ()
    
    ***************Running_Benchmark_Section **********
    
    
     
    Completed_Model :     1, Name : yolact                                            , Total time : 7585417308788.64, Offload Time : 67270367807.26 , DDR RW MBs : 0, Output File : py_out_yolact_ADE_val_00001801.jpg 
     
     
    MEM: Deinit ... !!!
    MEM: Alloc's: 28 alloc's of 718949141 bytes 
    MEM: Free's : 28 free's  of 718949141 bytes 
    MEM: Open's : 0 allocs  of 0 bytes 
    MEM: Deinit ... Done !!!
    
    

    yolact_logs_v9_2_9_0.zip

  • I plan to validate the same at my end and will get back to you on this by EOW.

  • Hi Venu,

    These seem to be correct observations.

    As I see it, the output node from the ONNX graph (580) has the dimension specifications listed below

  • Hello Pratik,

    Any update on the issue?
    Could you also reshare the JIRA link? I'm not able to access it.

  • Hi,

    I have updated the JIRA thread to check for an update; once I get a response, I will get back to you.

    Thank You

  • Hello Pratik,

    Thanks for your update.

    Is it possible to get an ETA for this issue?

    Reason: we need to update our clients on the same.

    Thanks in advance

  • Venu,

    I understand the concerns. I have raised an internal query with the dev team to understand the timeline for this issue, but am yet to get a response from them.

    Will keep you posted.

    Thanks