input_zero_point: INT8
input_scale: FLOAT
conv1.weight_zero_point: INT8
conv1.weight_scale: FLOAT
conv1.weight_quantized: INT8
/relu/Relu_output_0_zero_point: INT8
/relu/Relu_output_0_scale: FLOAT
conv2.weight_zero_point: INT8
conv2.weight_scale: FLOAT
conv2.weight_quantized: INT8
/relu_1/Relu_output_0_zero_point: INT8
/relu_1/Relu_output_0_scale: FLOAT
/gap/GlobalAveragePool_output_0_zero_point: INT8
/gap/GlobalAveragePool_output_0_scale: FLOAT
fc1.weight_zero_point: INT8
fc1.weight_scale: FLOAT
fc1.weight_quantized: INT8
/relu_2/Relu_output_0_zero_point: INT8
/relu_2/Relu_output_0_scale: FLOAT
/fc2/Gemm_output_0_zero_point: INT8
/fc2/Gemm_output_0_scale: FLOAT
fc2.weight_zero_point: IN8
fc2.weight_scale: FLOAT
fc2.weight_quantized: INT8
output_zero_point: INT8
output_scale: FLOAT
conv1.bias_quantized: INT32
conv1.bias_quantized_scale: FLOAT
conv1.bias_quantized_zero_point: INT32
conv2.bias_quantized: INT32
conv2.bias_quantized_scale: FLOAT
conv2.bias_quantized_zero_point: INT32
fc1.bias_quantized: INT32
fc1.bias_quantized_scale: FLOAT
fc1.bias_quantized_zero_point: INT32
fc2.bias_quantized: INT32
fc2.bias_quantized_scale: FLOAT
fc2.bias_quantized_zero_point: INT32
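A listing like the one above can be reproduced by walking the quantized model's initializers with the onnx Python package and printing each tensor's name alongside its data type. This is a minimal sketch; the file name model_quantized.onnx is an assumption and should be replaced with the actual path of the quantized model.

```python
# Minimal sketch: list every initializer in a quantized ONNX model
# together with its ONNX data type (e.g. INT8, FLOAT, INT32).
import onnx

# "model_quantized.onnx" is a hypothetical path, not taken from the original text.
model = onnx.load("model_quantized.onnx")

for tensor in model.graph.initializer:
    # data_type is an integer enum; DataType.Name maps it to a readable string.
    dtype_name = onnx.TensorProto.DataType.Name(tensor.data_type)
    print(f"{tensor.name}: {dtype_name}")
```

Each activation and weight contributes a scale (FLOAT) and zero point (INT8) pair plus, for weights, the quantized values themselves (INT8), while biases are stored as INT32 with their own scale and a zero point of the same width.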