Logging output to training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/train-log_20191106_16-37.txt
training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/initial
{'type':'SGD','base_lr':1e-3,'max_iter':120000,'lr_policy':'poly','power':4.0,'stepvalue':[30000,45000,300000],'weight_decay':0.0005}
{'config_name':'training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/initial','model_name':'ssdJacintoNetV2','dataset':'ti-custom-cfg1','gpus':'0','train_data':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb','test_data':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb','name_size_file':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt','label_map_file':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt','num_test_image':24,'num_classes':2,'min_ratio':10,'max_ratio':90,'log_space_steps':2,'use_difficult_gt':1,'ignore_difficult_gt':0,'evaluate_difficult_gt':0,'pretrain_model':'training/imagenet_jacintonet11v2_iter_320000.caffemodel','use_image_list':0,'shuffle':0,'num_output':8,'resize_width':768,'resize_height':320,'crop_width':768,'crop_height':320,'batch_size':4,'aspect_ratios_type':1,'ssd_size':'512x512','small_objs':1,'min_dim':368,'concat_reg_head':0,'fully_conv_at_end':0,'first_hd_same_op_ch':1,'ker_mbox_loc_conf':1,'base_nw_3_head':0,'reg_head_at_ds8':0,'ds_fac':32,'ds_type':'PSP','rhead_name_non_linear':0,'force_color':0,'num_intermediate':512,'use_batchnorm_mbox':0,'chop_num_heads':1}
caffe_root = /home/liuyuyuan/caffe-jacinto/build/tools/caffe.bin
config_param.ds_fac : 32
config_param.stride_list : [2, 2, 2, 2, 2]
num_gpus: 1 gpulist: ['0']
min_dim = 368
ratio_step_size: 26
minsizes = [14.72, 36.8, 132.48, 228.16, 323.84]
maxsizes = [36.8, 132.48, 228.16, 323.84, 419.52]
ARs: [[2], [2, 3], [2, 3], [2], [2]]
Chopping heads
minsizes = [14.72, 36.8, 132.48, 228.16]
maxsizes = [36.8, 132.48, 228.16, 323.84]
aspect_ratios = [[2], [2, 3], [2, 3], [2]]
['ctx_output1/relu', 'ctx_output2/relu', 'ctx_output3/relu', 'ctx_output4/relu']
training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/initial/train.prototxt
training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/l1reg
{'type':'SGD','base_lr':1e-3,'max_iter':120000,'lr_policy':'poly','power':4.0,'stepvalue':[60000,9000,300000],'regularization_type':'L1','weight_decay':1e-5}
{'config_name':'training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/l1reg','model_name':'ssdJacintoNetV2','dataset':'ti-custom-cfg1','gpus':'0','train_data':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb','test_data':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb','name_size_file':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt','label_map_file':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt','num_test_image':24,'num_classes':2,'min_ratio':10,'max_ratio':90,'log_space_steps':2,'use_difficult_gt':1,'ignore_difficult_gt':0,'evaluate_difficult_gt':0,'pretrain_model':'training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/initial/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel','use_image_list':0,'shuffle':0,'num_output':8,'resize_width':768,'resize_height':320,'crop_width':768,'crop_height':320,'batch_size':4,'aspect_ratios_type':1,'ssd_size':'512x512','small_objs':1,'min_dim':368,'concat_reg_head':0,'fully_conv_at_end':0,'first_hd_same_op_ch':1,'ker_mbox_loc_conf':1,'base_nw_3_head':0,'reg_head_at_ds8':0,'ds_fac':32,'ds_type':'PSP','rhead_name_non_linear':0,'force_color':0,'num_intermediate':512,'use_batchnorm_mbox':0,'chop_num_heads':1}
caffe_root = /home/liuyuyuan/caffe-jacinto/build/tools/caffe.bin
config_param.ds_fac : 32
config_param.stride_list : [2, 2, 2, 2, 2]
num_gpus: 1 gpulist: ['0']
min_dim = 368
ratio_step_size: 26
minsizes = [14.72, 36.8, 132.48, 228.16, 323.84]
maxsizes = [36.8, 132.48, 228.16, 323.84, 419.52]
ARs: [[2], [2, 3], [2, 3], [2], [2]]
Chopping heads
minsizes = [14.72, 36.8, 132.48, 228.16]
maxsizes = [36.8, 132.48, 228.16, 323.84]
aspect_ratios = [[2], [2, 3], [2, 3], [2]]
['ctx_output1/relu', 'ctx_output2/relu', 'ctx_output3/relu', 'ctx_output4/relu']
training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/l1reg/train.prototxt
training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse
{'type':'SGD','base_lr':1e-3,'max_iter':120000,'lr_policy':'poly','power':4.0,'stepvalue':[30000,45000,300000],'regularization_type':'L1','weight_decay':1e-5,'sparse_mode':1,'display_sparsity':2000,'sparsity_target':0.70,'sparsity_start_iter':0,'sparsity_start_factor':0.5,'sparsity_step_iter':2000,'sparsity_step_factor':0.05,'sparsity_itr_increment_bfr_applying':1,'sparsity_threshold_maxratio':0.2,'sparsity_threshold_value_max':0.2}
{'config_name':'training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse','model_name':'ssdJacintoNetV2','dataset':'ti-custom-cfg1','gpus':'0','train_data':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb','test_data':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb','name_size_file':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt','label_map_file':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt','num_test_image':24,'num_classes':2,'min_ratio':10,'max_ratio':90,'log_space_steps':2,'use_difficult_gt':1,'ignore_difficult_gt':0,'evaluate_difficult_gt':0,'pretrain_model':'training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/l1reg/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel','use_image_list':0,'shuffle':0,'num_output':8,'resize_width':768,'resize_height':320,'crop_width':768,'crop_height':320,'batch_size':1,'aspect_ratios_type':1,'ssd_size':'512x512','small_objs':1,'min_dim':368,'concat_reg_head':0,'fully_conv_at_end':0,'first_hd_same_op_ch':1,'ker_mbox_loc_conf':1,'base_nw_3_head':0,'reg_head_at_ds8':0,'ds_fac':32,'ds_type':'PSP','rhead_name_non_linear':0,'force_color':0,'num_intermediate':512,'use_batchnorm_mbox':0,'chop_num_heads':1}
caffe_root = /home/liuyuyuan/caffe-jacinto/build/tools/caffe.bin
config_param.ds_fac : 32
config_param.stride_list : [2, 2, 2, 2, 2]
num_gpus: 1 gpulist: ['0']
min_dim = 368
ratio_step_size: 26
minsizes = [14.72, 36.8, 132.48, 228.16, 323.84]
maxsizes = [36.8, 132.48, 228.16, 323.84, 419.52]
ARs: [[2], [2, 3], [2, 3], [2], [2]]
Chopping heads
minsizes = [14.72, 36.8, 132.48, 228.16]
maxsizes = [36.8, 132.48, 228.16, 323.84]
aspect_ratios = [[2], [2, 3], [2, 3], [2]]
['ctx_output1/relu', 'ctx_output2/relu', 'ctx_output3/relu', 'ctx_output4/relu']
training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse/train.prototxt
training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/test
{'type':'SGD','base_lr':1e-3,'max_iter':120000,'lr_policy':'poly','power':4.0,'stepvalue':[30000,45000,300000],'regularization_type':'L1','weight_decay':1e-5,'sparse_mode':1,'display_sparsity':1000}
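The minsizes/maxsizes lists logged above follow the standard SSD scale schedule: box sizes are percentages of min_dim, stepped from min_ratio to max_ratio, with an extra small 4% scale prepended for the finest head. The sketch below is a reconstruction inferred from the logged values (the helper name `ssd_prior_sizes` is hypothetical, not from the script), and it reproduces ratio_step_size = 26 and the lists printed for min_dim = 368:

```python
def ssd_prior_sizes(min_dim, min_ratio, max_ratio, num_heads):
    # Assumed reconstruction of the generator's schedule: step between
    # scale ratios, then one (min, max) pair per detection head.
    step = int((max_ratio - min_ratio) / (num_heads - 2))  # -> 26 here
    min_sizes = [min_dim * 4 / 100.0]            # extra small scale at 4%
    max_sizes = [min_dim * min_ratio / 100.0]
    for ratio in range(min_ratio, max_ratio + 1, step):
        min_sizes.append(min_dim * ratio / 100.0)
        max_sizes.append(min_dim * (ratio + step) / 100.0)
    return step, min_sizes, max_sizes

step, min_sizes, max_sizes = ssd_prior_sizes(368, 10, 90, num_heads=5)
# chop_num_heads=1 ("Chopping heads" in the log) drops the coarsest head:
min_sizes, max_sizes = min_sizes[:-1], max_sizes[:-1]
```

With chop_num_heads = 1 only four heads (ctx_output1..4) survive, which matches the chopped lists in the log.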
{'config_name':'training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/test','model_name':'ssdJacintoNetV2','dataset':'ti-custom-cfg1','gpus':'0','train_data':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb','test_data':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb','name_size_file':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt','label_map_file':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt','num_test_image':24,'num_classes':2,'min_ratio':10,'max_ratio':90,'log_space_steps':2,'use_difficult_gt':1,'ignore_difficult_gt':0,'evaluate_difficult_gt':0,'pretrain_model':'training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel','use_image_list':0,'shuffle':0,'num_output':8,'resize_width':768,'resize_height':320,'crop_width':768,'crop_height':320,'batch_size':1,'test_batch_size':10,'caffe_cmd':'test_detection','display_sparsity':1,'aspect_ratios_type':1,'ssd_size':'512x512','small_objs':1,'min_dim':368,'concat_reg_head':0,'fully_conv_at_end':0,'first_hd_same_op_ch':1,'ker_mbox_loc_conf':1,'base_nw_3_head':0,'reg_head_at_ds8':0,'ds_fac':32,'ds_type':'PSP','rhead_name_non_linear':0,'force_color':0,'num_intermediate':512,'use_batchnorm_mbox':0,'chop_num_heads':1}
caffe_root = /home/liuyuyuan/caffe-jacinto/build/tools/caffe.bin
config_param.ds_fac : 32
config_param.stride_list : [2, 2, 2, 2, 2]
num_gpus: 1 gpulist: ['0']
min_dim = 368
ratio_step_size: 26
minsizes = [14.72, 36.8, 132.48, 228.16, 323.84]
maxsizes = [36.8, 132.48, 228.16, 323.84, 419.52]
ARs: [[2], [2, 3], [2, 3], [2], [2]]
Chopping heads
minsizes = [14.72, 36.8, 132.48, 228.16]
maxsizes = [36.8, 132.48, 228.16, 323.84]
aspect_ratios = [[2], [2, 3], [2, 3], [2]]
['ctx_output1/relu', 'ctx_output2/relu', 'ctx_output3/relu', 'ctx_output4/relu']
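Every solver dict in this log uses lr_policy 'poly' with power 4.0. In Caffe's poly policy the learning rate decays smoothly as base_lr * (1 - iter/max_iter)^power, reaching zero at max_iter; a minimal sketch of that schedule (the function name `poly_lr` is illustrative, not from the codebase):

```python
def poly_lr(base_lr, it, max_iter, power):
    # Caffe "poly" policy: smooth decay from base_lr down to 0 at max_iter.
    return base_lr * (1.0 - float(it) / max_iter) ** power

# With base_lr=1e-3, power=4.0, max_iter=120000 as in the solver dicts above,
# the rate at the halfway point is 1e-3 * 0.5**4, i.e. about 6.25e-5.
halfway = poly_lr(1e-3, 60000, 120000, 4.0)
```

Note that the 'stepvalue' entries in these dicts are consulted only by the step/multistep policies, so under 'poly' they have no effect on the schedule.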
training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/test/train.prototxt
training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/test_quantize
{'type':'SGD','base_lr':1e-3,'max_iter':120000,'lr_policy':'poly','power':4.0,'stepvalue':[30000,45000,300000],'regularization_type':'L1','weight_decay':1e-5,'sparse_mode':1,'display_sparsity':1000}
{'config_name':'training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/test_quantize','model_name':'ssdJacintoNetV2','dataset':'ti-custom-cfg1','gpus':'0','train_data':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb','test_data':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb','name_size_file':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt','label_map_file':'/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt','num_test_image':24,'num_classes':2,'min_ratio':10,'max_ratio':90,'log_space_steps':2,'use_difficult_gt':1,'ignore_difficult_gt':0,'evaluate_difficult_gt':0,'pretrain_model':'training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel','use_image_list':0,'shuffle':0,'num_output':8,'resize_width':768,'resize_height':320,'crop_width':768,'crop_height':320,'batch_size':1,'test_batch_size':10,'caffe_cmd':'test_detection','aspect_ratios_type':1,'ssd_size':'512x512','small_objs':1,'min_dim':368,'concat_reg_head':0,'fully_conv_at_end':0,'first_hd_same_op_ch':1,'ker_mbox_loc_conf':1,'base_nw_3_head':0,'reg_head_at_ds8':0,'ds_fac':32,'ds_type':'PSP','rhead_name_non_linear':0,'force_color':0,'num_intermediate':512,'use_batchnorm_mbox':0,'chop_num_heads':1}
caffe_root = /home/liuyuyuan/caffe-jacinto/build/tools/caffe.bin
config_param.ds_fac : 32
config_param.stride_list : [2, 2, 2, 2, 2]
num_gpus: 1 gpulist: ['0']
min_dim = 368
ratio_step_size: 26
minsizes = [14.72, 36.8, 132.48, 228.16, 323.84]
maxsizes = [36.8, 132.48, 228.16, 323.84, 419.52]
ARs: [[2], [2, 3], [2, 3], [2], [2]]
Chopping heads
minsizes = [14.72, 36.8, 132.48, 228.16]
maxsizes = [36.8, 132.48, 228.16, 323.84]
aspect_ratios = [[2], [2, 3], [2, 3], [2]]
['ctx_output1/relu', 'ctx_output2/relu', 'ctx_output3/relu', 'ctx_output4/relu']
training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/test_quantize/train.prototxt
I1106 16:38:01.751933 13504 caffe.cpp:902] This is NVCaffe 0.17.0 started at Wed Nov 6 16:38:01 2019
I1106 16:38:01.752152 13504 caffe.cpp:904] CuDNN version: 7601
I1106 16:38:01.752156 13504 caffe.cpp:905] CuBLAS version: 10201
I1106 16:38:01.752157 13504 caffe.cpp:906] CUDA version: 10010
I1106 16:38:01.752158 13504 caffe.cpp:907] CUDA driver version: 10010
I1106 16:38:01.752162 13504 caffe.cpp:908] Arguments: [0]: /home/liuyuyuan/caffe-jacinto/build/tools/caffe.bin [1]: train [2]: --solver=training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/initial/solver.prototxt [3]: --weights=training/imagenet_jacintonet11v2_iter_320000.caffemodel [4]: --gpu [5]: 0
I1106 16:38:01.805001 13504 gpu_memory.cpp:105] GPUMemory::Manager initialized
I1106 16:38:01.805371 13504 gpu_memory.cpp:107] Total memory: 6193479680, Free: 3154903040, dev_info[0]: total=6193479680 free=3154903040
I1106 16:38:01.805377 13504 caffe.cpp:226] Using GPUs 0
I1106 16:38:01.805630 13504 caffe.cpp:230] GPU 0: GeForce GTX 1660 Ti
I1106 16:38:01.805688 13504 solver.cpp:41] Solver data type: FLOAT
I1106 16:38:01.814049 13504 solver.cpp:44] Initializing solver from parameters: train_net: "training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/initial/train.prototxt" test_net: "training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/initial/test.prototxt" test_iter: 3 test_interval: 2000 base_lr: 0.001 display: 100 max_iter: 120000 lr_policy: "poly" gamma: 0.1 power: 4 momentum: 0.9 weight_decay: 0.0005 snapshot: 2000 snapshot_prefix:
"training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/initial/ti-custom-cfg1_ssdJacintoNetV2" solver_mode: GPU device_id: 0 random_seed: 33 debug_info: false train_state { level: 0 stage: "" } snapshot_after_train: true test_initialization: true average_loss: 10 stepvalue: 30000 stepvalue: 45000 stepvalue: 300000 iter_size: 8 type: "SGD" eval_type: "detection" ap_version: "11point" show_per_class_result: true I1106 16:38:01.814220 13504 solver.cpp:76] Creating training net from train_net file: training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/initial/train.prototxt I1106 16:38:01.815630 13504 net.cpp:80] Initializing net from parameters: name: "ssdJacintoNetV2" state { phase: TRAIN level: 0 stage: "" } layer { name: "data" type: "AnnotatedData" top: "data" top: "label" include { phase: TRAIN } transform_param { mirror: true mean_value: 0 mean_value: 0 mean_value: 0 force_color: false resize_param { prob: 1 resize_mode: WARP height: 320 width: 768 interp_mode: LINEAR interp_mode: AREA interp_mode: NEAREST interp_mode: CUBIC interp_mode: LANCZOS4 } emit_constraint { emit_type: CENTER } crop_h: 320 crop_w: 768 distort_param { brightness_prob: 0.5 brightness_delta: 32 contrast_prob: 0.5 contrast_lower: 0.5 contrast_upper: 1.5 hue_prob: 0.5 hue_delta: 18 saturation_prob: 0.5 saturation_lower: 0.5 saturation_upper: 1.5 random_order_prob: 0 } expand_param { prob: 0.5 max_expand_ratio: 4 } } data_param { source: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb" batch_size: 4 backend: LMDB threads: 4 parser_threads: 4 } annotated_data_param { batch_sampler { max_sample: 1 max_trials: 1 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 0.1 } max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 
0.3 } max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 0.5 } max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 0.7 } max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 0.9 } max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { max_jaccard_overlap: 1 } max_sample: 1 max_trials: 50 } label_map_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt" } } layer { name: "data/bias" type: "Bias" bottom: "data" top: "data/bias" param { lr_mult: 0 decay_mult: 0 } bias_param { filler { type: "constant" value: -128 } } } layer { name: "conv1a" type: "Convolution" bottom: "data/bias" top: "conv1a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 2 kernel_size: 5 group: 1 stride: 2 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "conv1a/bn" type: "BatchNorm" bottom: "conv1a" top: "conv1a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "conv1a/relu" type: "ReLU" bottom: "conv1a" top: "conv1a" } layer { name: "conv1b" type: "Convolution" bottom: "conv1a" top: "conv1b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "conv1b/bn" type: "BatchNorm" bottom: "conv1b" top: "conv1b" batch_norm_param { moving_average_fraction: 
0.99 eps: 0.0001 scale_bias: true } } layer { name: "conv1b/relu" type: "ReLU" bottom: "conv1b" top: "conv1b" } layer { name: "pool1" type: "Pooling" bottom: "conv1b" top: "pool1" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res2a_branch2a" type: "Convolution" bottom: "pool1" top: "res2a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res2a_branch2a/bn" type: "BatchNorm" bottom: "res2a_branch2a" top: "res2a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res2a_branch2a/relu" type: "ReLU" bottom: "res2a_branch2a" top: "res2a_branch2a" } layer { name: "res2a_branch2b" type: "Convolution" bottom: "res2a_branch2a" top: "res2a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res2a_branch2b/bn" type: "BatchNorm" bottom: "res2a_branch2b" top: "res2a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res2a_branch2b/relu" type: "ReLU" bottom: "res2a_branch2b" top: "res2a_branch2b" } layer { name: "pool2" type: "Pooling" bottom: "res2a_branch2b" top: "pool2" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res3a_branch2a" type: "Convolution" bottom: "pool2" top: "res3a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res3a_branch2a/bn" type: 
"BatchNorm" bottom: "res3a_branch2a" top: "res3a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res3a_branch2a/relu" type: "ReLU" bottom: "res3a_branch2a" top: "res3a_branch2a" } layer { name: "res3a_branch2b" type: "Convolution" bottom: "res3a_branch2a" top: "res3a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res3a_branch2b/bn" type: "BatchNorm" bottom: "res3a_branch2b" top: "res3a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res3a_branch2b/relu" type: "ReLU" bottom: "res3a_branch2b" top: "res3a_branch2b" } layer { name: "pool3" type: "Pooling" bottom: "res3a_branch2b" top: "pool3" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res4a_branch2a" type: "Convolution" bottom: "pool3" top: "res4a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res4a_branch2a/bn" type: "BatchNorm" bottom: "res4a_branch2a" top: "res4a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res4a_branch2a/relu" type: "ReLU" bottom: "res4a_branch2a" top: "res4a_branch2a" } layer { name: "res4a_branch2b" type: "Convolution" bottom: "res4a_branch2a" top: "res4a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: 
"res4a_branch2b/bn" type: "BatchNorm" bottom: "res4a_branch2b" top: "res4a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res4a_branch2b/relu" type: "ReLU" bottom: "res4a_branch2b" top: "res4a_branch2b" } layer { name: "pool4" type: "Pooling" bottom: "res4a_branch2b" top: "pool4" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res5a_branch2a" type: "Convolution" bottom: "pool4" top: "res5a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res5a_branch2a/bn" type: "BatchNorm" bottom: "res5a_branch2a" top: "res5a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res5a_branch2a/relu" type: "ReLU" bottom: "res5a_branch2a" top: "res5a_branch2a" } layer { name: "res5a_branch2b" type: "Convolution" bottom: "res5a_branch2a" top: "res5a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res5a_branch2b/bn" type: "BatchNorm" bottom: "res5a_branch2b" top: "res5a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res5a_branch2b/relu" type: "ReLU" bottom: "res5a_branch2b" top: "res5a_branch2b" } layer { name: "pool6" type: "Pooling" bottom: "res5a_branch2b" top: "pool6" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "pool7" type: "Pooling" bottom: "pool6" top: "pool7" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "pool8" type: "Pooling" bottom: "pool7" top: "pool8" pooling_param { 
pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "ctx_output1" type: "Convolution" bottom: "res4a_branch2b" top: "ctx_output1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu" type: "ReLU" bottom: "ctx_output1" top: "ctx_output1" } layer { name: "ctx_output2" type: "Convolution" bottom: "res5a_branch2b" top: "ctx_output2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu" type: "ReLU" bottom: "ctx_output2" top: "ctx_output2" } layer { name: "ctx_output3" type: "Convolution" bottom: "pool6" top: "ctx_output3" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu" type: "ReLU" bottom: "ctx_output3" top: "ctx_output3" } layer { name: "ctx_output4" type: "Convolution" bottom: "pool7" top: "ctx_output4" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu" type: "ReLU" bottom: "ctx_output4" top: "ctx_output4" } layer { name: "ctx_output5" type: "Convolution" bottom: "pool8" top: "ctx_output5" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 
stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output5/relu" type: "ReLU" bottom: "ctx_output5" top: "ctx_output5" } layer { name: "ctx_output1/relu_mbox_loc" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_loc" top: "ctx_output1/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output1/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_loc_perm" top: "ctx_output1/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output1/relu_mbox_conf" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_conf" top: "ctx_output1/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output1/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_conf_perm" top: "ctx_output1/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output1/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output1" bottom: "data" top: "ctx_output1/relu_mbox_priorbox" prior_box_param { min_size: 14.72 max_size: 36.8 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output2/relu_mbox_loc" 
type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_loc" top: "ctx_output2/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output2/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_loc_perm" top: "ctx_output2/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output2/relu_mbox_conf" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_conf" top: "ctx_output2/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output2/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_conf_perm" top: "ctx_output2/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output2/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output2" bottom: "data" top: "ctx_output2/relu_mbox_priorbox" prior_box_param { min_size: 36.8 max_size: 132.48 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output3/relu_mbox_loc" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 bias_term: true pad: 0 
kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_loc" top: "ctx_output3/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output3/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_loc_perm" top: "ctx_output3/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output3/relu_mbox_conf" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_conf" top: "ctx_output3/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output3/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_conf_perm" top: "ctx_output3/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output3/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output3" bottom: "data" top: "ctx_output3/relu_mbox_priorbox" prior_box_param { min_size: 132.48 max_size: 228.16 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output4/relu_mbox_loc" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu_mbox_loc_perm" type: "Permute" bottom: 
"ctx_output4/relu_mbox_loc" top: "ctx_output4/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output4/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_loc_perm" top: "ctx_output4/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output4/relu_mbox_conf" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output4/relu_mbox_conf" top: "ctx_output4/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output4/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_conf_perm" top: "ctx_output4/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output4/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output4" bottom: "data" top: "ctx_output4/relu_mbox_priorbox" prior_box_param { min_size: 228.16 max_size: 323.84 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "mbox_loc" type: "Concat" bottom: "ctx_output1/relu_mbox_loc_flat" bottom: "ctx_output2/relu_mbox_loc_flat" bottom: "ctx_output3/relu_mbox_loc_flat" bottom: "ctx_output4/relu_mbox_loc_flat" top: "mbox_loc" concat_param { axis: 1 } } layer { name: "mbox_conf" type: "Concat" bottom: "ctx_output1/relu_mbox_conf_flat" bottom: "ctx_output2/relu_mbox_conf_flat" bottom: "ctx_output3/relu_mbox_conf_flat" bottom: "ctx_output4/relu_mbox_conf_flat" top: "mbox_conf" concat_param { axis: 1 } } layer { name: "mbox_priorbox" type: "Concat" bottom: "ctx_output1/relu_mbox_priorbox" bottom: "ctx_output2/relu_mbox_priorbox" bottom: "ctx_output3/relu_mbox_priorbox" 
bottom: "ctx_output4/relu_mbox_priorbox" top: "mbox_priorbox" concat_param { axis: 2 } } layer { name: "mbox_loss" type: "MultiBoxLoss" bottom: "mbox_loc" bottom: "mbox_conf" bottom: "mbox_priorbox" bottom: "label" top: "mbox_loss" include { phase: TRAIN } propagate_down: true propagate_down: true propagate_down: false propagate_down: false loss_param { normalization: VALID } multibox_loss_param { loc_loss_type: SMOOTH_L1 conf_loss_type: SOFTMAX loc_weight: 1 num_classes: 2 share_location: true match_type: PER_PREDICTION overlap_threshold: 0.5 use_prior_for_matching: true background_label_id: 0 use_difficult_gt: true neg_pos_ratio: 3 neg_overlap: 0.5 code_type: CENTER_SIZE ignore_cross_boundary_bbox: false mining_type: MAX_NEGATIVE ignore_difficult_gt: false } } I1106 16:38:01.816126 13504 net.cpp:110] Using FLOAT as default forward math type I1106 16:38:01.816154 13504 net.cpp:116] Using FLOAT as default backward math type I1106 16:38:01.816165 13504 layer_factory.hpp:172] Creating layer 'data' of type 'AnnotatedData' I1106 16:38:01.816174 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:01.816341 13504 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0 I1106 16:38:01.816540 13504 net.cpp:200] Created Layer data (0) I1106 16:38:01.816560 13504 net.cpp:542] data -> data I1106 16:38:01.816589 13504 net.cpp:542] data -> label I1106 16:38:01.816637 13504 data_reader.cpp:58] Data Reader threads: 4, out queues: 16, depth: 4 I1106 16:38:01.816686 13504 internal_thread.cpp:19] Starting 4 internal thread(s) on device 0 I1106 16:38:01.817430 13520 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb I1106 16:38:01.817705 13521 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb I1106 16:38:01.818208 13518 db_lmdb.cpp:36] Opened lmdb 
/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb
I1106 16:38:01.818517 13517 blocking_queue.cpp:40] Data layer prefetch queue empty
I1106 16:38:01.818857 13519 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb
I1106 16:38:01.819653 13504 annotated_data_layer.cpp:105] output data size: 4,3,320,768
I1106 16:38:01.819926 13504 annotated_data_layer.cpp:150] [0] Output data size: 4, 3, 320, 768
I1106 16:38:01.820006 13504 internal_thread.cpp:19] Starting 4 internal thread(s) on device 0
I1106 16:38:01.820399 13522 data_layer.cpp:105] [0] Parser threads: 4
I1106 16:38:01.820405 13522 data_layer.cpp:107] [0] Transformer threads: 4
I1106 16:38:01.820652 13504 net.cpp:260] Setting up data
I1106 16:38:01.820667 13504 net.cpp:267] TRAIN Top shape for layer 0 'data' 4 3 320 768 (2949120)
I1106 16:38:01.820673 13504 net.cpp:267] TRAIN Top shape for layer 0 'data' 1 1 5 8 (40)
I1106 16:38:01.821362 13504 layer_factory.hpp:172] Creating layer 'data_data_0_split' of type 'Split'
I1106 16:38:01.821373 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:01.821388 13504 net.cpp:200] Created Layer data_data_0_split (1)
I1106 16:38:01.821398 13504 net.cpp:572] data_data_0_split <- data
I1106 16:38:01.821413 13504 net.cpp:542] data_data_0_split -> data_data_0_split_0
I1106 16:38:01.821419 13504 net.cpp:542] data_data_0_split -> data_data_0_split_1
I1106 16:38:01.821422 13504 net.cpp:542] data_data_0_split -> data_data_0_split_2
I1106 16:38:01.821426 13504 net.cpp:542] data_data_0_split -> data_data_0_split_3
I1106 16:38:01.821429 13504 net.cpp:542] data_data_0_split -> data_data_0_split_4
I1106 16:38:01.826004 13504 net.cpp:260] Setting up data_data_0_split
I1106 16:38:01.826067 13504 net.cpp:267] TRAIN Top shape for layer 1 'data_data_0_split' 4 3 320 768 (2949120)
I1106 16:38:01.826072 13504 net.cpp:267] TRAIN Top shape for layer 1 'data_data_0_split' 4 3 320 768 (2949120)
I1106 16:38:01.826076 13504 net.cpp:267] TRAIN Top shape for layer 1 'data_data_0_split' 4 3 320 768 (2949120)
I1106 16:38:01.826077 13504 net.cpp:267] TRAIN Top shape for layer 1 'data_data_0_split' 4 3 320 768 (2949120)
I1106 16:38:01.826079 13504 net.cpp:267] TRAIN Top shape for layer 1 'data_data_0_split' 4 3 320 768 (2949120)
I1106 16:38:01.826087 13504 layer_factory.hpp:172] Creating layer 'data/bias' of type 'Bias'
I1106 16:38:01.826095 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:01.826125 13504 net.cpp:200] Created Layer data/bias (2)
I1106 16:38:01.826131 13504 net.cpp:572] data/bias <- data_data_0_split_0
I1106 16:38:01.826143 13504 net.cpp:542] data/bias -> data/bias
I1106 16:38:01.832197 13504 net.cpp:260] Setting up data/bias
I1106 16:38:01.832259 13504 net.cpp:267] TRAIN Top shape for layer 2 'data/bias' 4 3 320 768 (2949120)
I1106 16:38:01.832286 13504 layer_factory.hpp:172] Creating layer 'conv1a' of type 'Convolution'
I1106 16:38:01.832294 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:01.832346 13504 net.cpp:200] Created Layer conv1a (3)
I1106 16:38:01.832363 13504 net.cpp:572] conv1a <- data/bias
I1106 16:38:01.832376 13504 net.cpp:542] conv1a -> conv1a
I1106 16:38:01.937129 13518 data_reader.cpp:320] Restarting data pre-fetching
I1106 16:38:03.152206 13504 net.cpp:260] Setting up conv1a
I1106 16:38:03.152228 13504 net.cpp:267] TRAIN Top shape for layer 3 'conv1a' 4 32 160 384 (7864320)
I1106 16:38:03.152240 13504 layer_factory.hpp:172] Creating layer 'conv1a/bn' of type 'BatchNorm'
I1106 16:38:03.152245 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.152277 13504 net.cpp:200] Created Layer conv1a/bn (4)
I1106 16:38:03.152281 13504 net.cpp:572] conv1a/bn <- conv1a
I1106 16:38:03.152288 13504 net.cpp:527] conv1a/bn -> conv1a (in-place)
I1106 16:38:03.152693 13504 net.cpp:260] Setting up conv1a/bn
I1106 16:38:03.152700 13504 net.cpp:267] TRAIN Top shape for layer 4 'conv1a/bn' 4 32 160 384 (7864320)
I1106 16:38:03.152709 13504 layer_factory.hpp:172] Creating layer 'conv1a/relu' of type 'ReLU'
I1106 16:38:03.152710 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.152716 13504 net.cpp:200] Created Layer conv1a/relu (5)
I1106 16:38:03.152719 13504 net.cpp:572] conv1a/relu <- conv1a
I1106 16:38:03.152721 13504 net.cpp:527] conv1a/relu -> conv1a (in-place)
I1106 16:38:03.152730 13504 net.cpp:260] Setting up conv1a/relu
I1106 16:38:03.152734 13504 net.cpp:267] TRAIN Top shape for layer 5 'conv1a/relu' 4 32 160 384 (7864320)
I1106 16:38:03.152737 13504 layer_factory.hpp:172] Creating layer 'conv1b' of type 'Convolution'
I1106 16:38:03.152741 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.152752 13504 net.cpp:200] Created Layer conv1b (6)
I1106 16:38:03.152755 13504 net.cpp:572] conv1b <- conv1a
I1106 16:38:03.152787 13504 net.cpp:542] conv1b -> conv1b
I1106 16:38:03.153528 13504 net.cpp:260] Setting up conv1b
I1106 16:38:03.153535 13504 net.cpp:267] TRAIN Top shape for layer 6 'conv1b' 4 32 160 384 (7864320)
I1106 16:38:03.153543 13504 layer_factory.hpp:172] Creating layer 'conv1b/bn' of type 'BatchNorm'
I1106 16:38:03.153546 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.153553 13504 net.cpp:200] Created Layer conv1b/bn (7)
I1106 16:38:03.153555 13504 net.cpp:572] conv1b/bn <- conv1b
I1106 16:38:03.153558 13504 net.cpp:527] conv1b/bn -> conv1b (in-place)
I1106 16:38:03.153825 13504 net.cpp:260] Setting up conv1b/bn
I1106 16:38:03.153831 13504 net.cpp:267] TRAIN Top shape for layer 7 'conv1b/bn' 4 32 160 384 (7864320)
I1106 16:38:03.153836 13504 layer_factory.hpp:172] Creating layer 'conv1b/relu' of type
'ReLU'
I1106 16:38:03.153839 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.153846 13504 net.cpp:200] Created Layer conv1b/relu (8)
I1106 16:38:03.153848 13504 net.cpp:572] conv1b/relu <- conv1b
I1106 16:38:03.153851 13504 net.cpp:527] conv1b/relu -> conv1b (in-place)
I1106 16:38:03.153854 13504 net.cpp:260] Setting up conv1b/relu
I1106 16:38:03.153857 13504 net.cpp:267] TRAIN Top shape for layer 8 'conv1b/relu' 4 32 160 384 (7864320)
I1106 16:38:03.153861 13504 layer_factory.hpp:172] Creating layer 'pool1' of type 'Pooling'
I1106 16:38:03.153863 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.153887 13504 net.cpp:200] Created Layer pool1 (9)
I1106 16:38:03.153890 13504 net.cpp:572] pool1 <- conv1b
I1106 16:38:03.153893 13504 net.cpp:542] pool1 -> pool1
I1106 16:38:03.153934 13504 net.cpp:260] Setting up pool1
I1106 16:38:03.153939 13504 net.cpp:267] TRAIN Top shape for layer 9 'pool1' 4 32 80 192 (1966080)
I1106 16:38:03.153941 13504 layer_factory.hpp:172] Creating layer 'res2a_branch2a' of type 'Convolution'
I1106 16:38:03.153945 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.153952 13504 net.cpp:200] Created Layer res2a_branch2a (10)
I1106 16:38:03.153955 13504 net.cpp:572] res2a_branch2a <- pool1
I1106 16:38:03.153959 13504 net.cpp:542] res2a_branch2a -> res2a_branch2a
I1106 16:38:03.154814 13504 net.cpp:260] Setting up res2a_branch2a
I1106 16:38:03.154824 13504 net.cpp:267] TRAIN Top shape for layer 10 'res2a_branch2a' 4 64 80 192 (3932160)
I1106 16:38:03.154830 13504 layer_factory.hpp:172] Creating layer 'res2a_branch2a/bn' of type 'BatchNorm'
I1106 16:38:03.154834 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.154841 13504 net.cpp:200] Created Layer res2a_branch2a/bn (11)
I1106 16:38:03.154844 13504 net.cpp:572] res2a_branch2a/bn <- res2a_branch2a
I1106 16:38:03.154847 13504 net.cpp:527] res2a_branch2a/bn -> res2a_branch2a (in-place)
I1106 16:38:03.155083 13504 net.cpp:260] Setting up res2a_branch2a/bn
I1106 16:38:03.155088 13504 net.cpp:267] TRAIN Top shape for layer 11 'res2a_branch2a/bn' 4 64 80 192 (3932160)
I1106 16:38:03.155094 13504 layer_factory.hpp:172] Creating layer 'res2a_branch2a/relu' of type 'ReLU'
I1106 16:38:03.155097 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.155103 13504 net.cpp:200] Created Layer res2a_branch2a/relu (12)
I1106 16:38:03.155107 13504 net.cpp:572] res2a_branch2a/relu <- res2a_branch2a
I1106 16:38:03.155108 13504 net.cpp:527] res2a_branch2a/relu -> res2a_branch2a (in-place)
I1106 16:38:03.155112 13504 net.cpp:260] Setting up res2a_branch2a/relu
I1106 16:38:03.155115 13504 net.cpp:267] TRAIN Top shape for layer 12 'res2a_branch2a/relu' 4 64 80 192 (3932160)
I1106 16:38:03.155119 13504 layer_factory.hpp:172] Creating layer 'res2a_branch2b' of type 'Convolution'
I1106 16:38:03.155122 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.155155 13504 net.cpp:200] Created Layer res2a_branch2b (13)
I1106 16:38:03.155158 13504 net.cpp:572] res2a_branch2b <- res2a_branch2a
I1106 16:38:03.155161 13504 net.cpp:542] res2a_branch2b -> res2a_branch2b
I1106 16:38:03.155388 13504 net.cpp:260] Setting up res2a_branch2b
I1106 16:38:03.155395 13504 net.cpp:267] TRAIN Top shape for layer 13 'res2a_branch2b' 4 64 80 192 (3932160)
I1106 16:38:03.155400 13504 layer_factory.hpp:172] Creating layer 'res2a_branch2b/bn' of type 'BatchNorm'
I1106 16:38:03.155402 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.155407 13504 net.cpp:200] Created Layer res2a_branch2b/bn (14)
I1106 16:38:03.155411 13504 net.cpp:572] res2a_branch2b/bn <- res2a_branch2b
I1106 16:38:03.155413 13504 net.cpp:527] res2a_branch2b/bn -> res2a_branch2b (in-place)
I1106 16:38:03.155644 13504 net.cpp:260] Setting up res2a_branch2b/bn
I1106 16:38:03.155649 13504 net.cpp:267] TRAIN Top shape for layer 14 'res2a_branch2b/bn' 4 64 80 192 (3932160)
I1106 16:38:03.155671 13504 layer_factory.hpp:172] Creating layer 'res2a_branch2b/relu' of type 'ReLU'
I1106 16:38:03.155673 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.155684 13504 net.cpp:200] Created Layer res2a_branch2b/relu (15)
I1106 16:38:03.155688 13504 net.cpp:572] res2a_branch2b/relu <- res2a_branch2b
I1106 16:38:03.155689 13504 net.cpp:527] res2a_branch2b/relu -> res2a_branch2b (in-place)
I1106 16:38:03.155694 13504 net.cpp:260] Setting up res2a_branch2b/relu
I1106 16:38:03.155696 13504 net.cpp:267] TRAIN Top shape for layer 15 'res2a_branch2b/relu' 4 64 80 192 (3932160)
I1106 16:38:03.155699 13504 layer_factory.hpp:172] Creating layer 'pool2' of type 'Pooling'
I1106 16:38:03.155701 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.155709 13504 net.cpp:200] Created Layer pool2 (16)
I1106 16:38:03.155711 13504 net.cpp:572] pool2 <- res2a_branch2b
I1106 16:38:03.155714 13504 net.cpp:542] pool2 -> pool2
I1106 16:38:03.155742 13504 net.cpp:260] Setting up pool2
I1106 16:38:03.155746 13504 net.cpp:267] TRAIN Top shape for layer 16 'pool2' 4 64 40 96 (983040)
I1106 16:38:03.155750 13504 layer_factory.hpp:172] Creating layer 'res3a_branch2a' of type 'Convolution'
I1106 16:38:03.155752 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.155761 13504 net.cpp:200] Created Layer res3a_branch2a (17)
I1106 16:38:03.155764 13504 net.cpp:572] res3a_branch2a <- pool2
I1106 16:38:03.155766 13504 net.cpp:542] res3a_branch2a -> res3a_branch2a
I1106 16:38:03.156401 13504 net.cpp:260] Setting up res3a_branch2a
I1106 16:38:03.156407 13504 net.cpp:267] TRAIN
Top shape for layer 17 'res3a_branch2a' 4 128 40 96 (1966080)
I1106 16:38:03.156412 13504 layer_factory.hpp:172] Creating layer 'res3a_branch2a/bn' of type 'BatchNorm'
I1106 16:38:03.156430 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.156435 13504 net.cpp:200] Created Layer res3a_branch2a/bn (18)
I1106 16:38:03.156438 13504 net.cpp:572] res3a_branch2a/bn <- res3a_branch2a
I1106 16:38:03.156440 13504 net.cpp:527] res3a_branch2a/bn -> res3a_branch2a (in-place)
I1106 16:38:03.156636 13504 net.cpp:260] Setting up res3a_branch2a/bn
I1106 16:38:03.156642 13504 net.cpp:267] TRAIN Top shape for layer 18 'res3a_branch2a/bn' 4 128 40 96 (1966080)
I1106 16:38:03.156666 13504 layer_factory.hpp:172] Creating layer 'res3a_branch2a/relu' of type 'ReLU'
I1106 16:38:03.156668 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.156672 13504 net.cpp:200] Created Layer res3a_branch2a/relu (19)
I1106 16:38:03.156675 13504 net.cpp:572] res3a_branch2a/relu <- res3a_branch2a
I1106 16:38:03.156677 13504 net.cpp:527] res3a_branch2a/relu -> res3a_branch2a (in-place)
I1106 16:38:03.156682 13504 net.cpp:260] Setting up res3a_branch2a/relu
I1106 16:38:03.156684 13504 net.cpp:267] TRAIN Top shape for layer 19 'res3a_branch2a/relu' 4 128 40 96 (1966080)
I1106 16:38:03.156715 13504 layer_factory.hpp:172] Creating layer 'res3a_branch2b' of type 'Convolution'
I1106 16:38:03.156718 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.156744 13504 net.cpp:200] Created Layer res3a_branch2b (20)
I1106 16:38:03.156747 13504 net.cpp:572] res3a_branch2b <- res3a_branch2a
I1106 16:38:03.156749 13504 net.cpp:542] res3a_branch2b -> res3a_branch2b
I1106 16:38:03.157143 13504 net.cpp:260] Setting up res3a_branch2b
I1106 16:38:03.157150 13504 net.cpp:267] TRAIN Top shape for layer 20 'res3a_branch2b' 4 128 40 96 (1966080)
I1106 16:38:03.157153 13504 layer_factory.hpp:172] Creating layer 'res3a_branch2b/bn' of type 'BatchNorm'
I1106 16:38:03.157172 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.157177 13504 net.cpp:200] Created Layer res3a_branch2b/bn (21)
I1106 16:38:03.157181 13504 net.cpp:572] res3a_branch2b/bn <- res3a_branch2b
I1106 16:38:03.157183 13504 net.cpp:527] res3a_branch2b/bn -> res3a_branch2b (in-place)
I1106 16:38:03.157361 13504 net.cpp:260] Setting up res3a_branch2b/bn
I1106 16:38:03.157366 13504 net.cpp:267] TRAIN Top shape for layer 21 'res3a_branch2b/bn' 4 128 40 96 (1966080)
I1106 16:38:03.157371 13504 layer_factory.hpp:172] Creating layer 'res3a_branch2b/relu' of type 'ReLU'
I1106 16:38:03.157374 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.157378 13504 net.cpp:200] Created Layer res3a_branch2b/relu (22)
I1106 16:38:03.157380 13504 net.cpp:572] res3a_branch2b/relu <- res3a_branch2b
I1106 16:38:03.157383 13504 net.cpp:527] res3a_branch2b/relu -> res3a_branch2b (in-place)
I1106 16:38:03.157387 13504 net.cpp:260] Setting up res3a_branch2b/relu
I1106 16:38:03.157390 13504 net.cpp:267] TRAIN Top shape for layer 22 'res3a_branch2b/relu' 4 128 40 96 (1966080)
I1106 16:38:03.157393 13504 layer_factory.hpp:172] Creating layer 'pool3' of type 'Pooling'
I1106 16:38:03.157395 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.157402 13504 net.cpp:200] Created Layer pool3 (23)
I1106 16:38:03.157403 13504 net.cpp:572] pool3 <- res3a_branch2b
I1106 16:38:03.157405 13504 net.cpp:542] pool3 -> pool3
I1106 16:38:03.157470 13504 net.cpp:260] Setting up pool3
I1106 16:38:03.157490 13504 net.cpp:267] TRAIN Top shape for layer 23 'pool3' 4 128 20 48 (491520)
I1106 16:38:03.157493 13504 layer_factory.hpp:172] Creating layer 'res4a_branch2a' of type 'Convolution'
I1106 16:38:03.157496 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.157519 13504 net.cpp:200] Created Layer res4a_branch2a (24)
I1106 16:38:03.157523 13504 net.cpp:572] res4a_branch2a <- pool3
I1106 16:38:03.157526 13504 net.cpp:542] res4a_branch2a -> res4a_branch2a
I1106 16:38:03.160281 13504 net.cpp:260] Setting up res4a_branch2a
I1106 16:38:03.160290 13504 net.cpp:267] TRAIN Top shape for layer 24 'res4a_branch2a' 4 256 20 48 (983040)
I1106 16:38:03.160295 13504 layer_factory.hpp:172] Creating layer 'res4a_branch2a/bn' of type 'BatchNorm'
I1106 16:38:03.160313 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.160320 13504 net.cpp:200] Created Layer res4a_branch2a/bn (25)
I1106 16:38:03.160323 13504 net.cpp:572] res4a_branch2a/bn <- res4a_branch2a
I1106 16:38:03.160326 13504 net.cpp:527] res4a_branch2a/bn -> res4a_branch2a (in-place)
I1106 16:38:03.160531 13504 net.cpp:260] Setting up res4a_branch2a/bn
I1106 16:38:03.160535 13504 net.cpp:267] TRAIN Top shape for layer 25 'res4a_branch2a/bn' 4 256 20 48 (983040)
I1106 16:38:03.160540 13504 layer_factory.hpp:172] Creating layer 'res4a_branch2a/relu' of type 'ReLU'
I1106 16:38:03.160542 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.160545 13504 net.cpp:200] Created Layer res4a_branch2a/relu (26)
I1106 16:38:03.160555 13504 net.cpp:572] res4a_branch2a/relu <- res4a_branch2a
I1106 16:38:03.160559 13504 net.cpp:527] res4a_branch2a/relu -> res4a_branch2a (in-place)
I1106 16:38:03.160563 13504 net.cpp:260] Setting up res4a_branch2a/relu
I1106 16:38:03.160567 13504 net.cpp:267] TRAIN Top shape for layer 26 'res4a_branch2a/relu' 4 256 20 48 (983040)
I1106 16:38:03.160569 13504 layer_factory.hpp:172] Creating layer 'res4a_branch2b' of type 'Convolution'
I1106 16:38:03.160573 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT
Bmath:FLOAT
I1106 16:38:03.160581 13504 net.cpp:200] Created Layer res4a_branch2b (27)
I1106 16:38:03.160584 13504 net.cpp:572] res4a_branch2b <- res4a_branch2a
I1106 16:38:03.160586 13504 net.cpp:542] res4a_branch2b -> res4a_branch2b
I1106 16:38:03.161784 13504 net.cpp:260] Setting up res4a_branch2b
I1106 16:38:03.161790 13504 net.cpp:267] TRAIN Top shape for layer 27 'res4a_branch2b' 4 256 20 48 (983040)
I1106 16:38:03.161795 13504 layer_factory.hpp:172] Creating layer 'res4a_branch2b/bn' of type 'BatchNorm'
I1106 16:38:03.161798 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.161803 13504 net.cpp:200] Created Layer res4a_branch2b/bn (28)
I1106 16:38:03.161806 13504 net.cpp:572] res4a_branch2b/bn <- res4a_branch2b
I1106 16:38:03.161809 13504 net.cpp:527] res4a_branch2b/bn -> res4a_branch2b (in-place)
I1106 16:38:03.162030 13504 net.cpp:260] Setting up res4a_branch2b/bn
I1106 16:38:03.162036 13504 net.cpp:267] TRAIN Top shape for layer 28 'res4a_branch2b/bn' 4 256 20 48 (983040)
I1106 16:38:03.162042 13504 layer_factory.hpp:172] Creating layer 'res4a_branch2b/relu' of type 'ReLU'
I1106 16:38:03.162045 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.162050 13504 net.cpp:200] Created Layer res4a_branch2b/relu (29)
I1106 16:38:03.162051 13504 net.cpp:572] res4a_branch2b/relu <- res4a_branch2b
I1106 16:38:03.162055 13504 net.cpp:527] res4a_branch2b/relu -> res4a_branch2b (in-place)
I1106 16:38:03.162060 13504 net.cpp:260] Setting up res4a_branch2b/relu
I1106 16:38:03.162063 13504 net.cpp:267] TRAIN Top shape for layer 29 'res4a_branch2b/relu' 4 256 20 48 (983040)
I1106 16:38:03.162065 13504 layer_factory.hpp:172] Creating layer 'res4a_branch2b_res4a_branch2b/relu_0_split' of type 'Split'
I1106 16:38:03.162068 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.162071 13504 net.cpp:200] Created Layer res4a_branch2b_res4a_branch2b/relu_0_split (30)
I1106 16:38:03.162089 13504 net.cpp:572] res4a_branch2b_res4a_branch2b/relu_0_split <- res4a_branch2b
I1106 16:38:03.162092 13504 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_0
I1106 16:38:03.162096 13504 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_1
I1106 16:38:03.162119 13504 net.cpp:260] Setting up res4a_branch2b_res4a_branch2b/relu_0_split
I1106 16:38:03.162123 13504 net.cpp:267] TRAIN Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 4 256 20 48 (983040)
I1106 16:38:03.162127 13504 net.cpp:267] TRAIN Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 4 256 20 48 (983040)
I1106 16:38:03.162129 13504 layer_factory.hpp:172] Creating layer 'pool4' of type 'Pooling'
I1106 16:38:03.162132 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.162137 13504 net.cpp:200] Created Layer pool4 (31)
I1106 16:38:03.162139 13504 net.cpp:572] pool4 <- res4a_branch2b_res4a_branch2b/relu_0_split_0
I1106 16:38:03.162142 13504 net.cpp:542] pool4 -> pool4
I1106 16:38:03.162170 13504 net.cpp:260] Setting up pool4
I1106 16:38:03.162174 13504 net.cpp:267] TRAIN Top shape for layer 31 'pool4' 4 256 10 24 (245760)
I1106 16:38:03.162178 13504 layer_factory.hpp:172] Creating layer 'res5a_branch2a' of type 'Convolution'
I1106 16:38:03.162201 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.162240 13504 net.cpp:200] Created Layer res5a_branch2a (32)
I1106 16:38:03.162243 13504 net.cpp:572] res5a_branch2a <- pool4
I1106 16:38:03.162246 13504 net.cpp:542] res5a_branch2a -> res5a_branch2a
I1106 16:38:03.172025 13504 net.cpp:260] Setting up res5a_branch2a
I1106 16:38:03.172044 13504 net.cpp:267] TRAIN Top shape for layer 32 'res5a_branch2a' 4 512 10 24 (491520)
I1106
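The number in parentheses at the end of each "Top shape" line is simply the blob's element count, the product of its four dimensions. A quick sanity check against the shapes logged here (plain Python, nothing caffe-specific):

```python
from functools import reduce
from operator import mul

def blob_count(shape):
    # element count printed in parentheses after each 'Top shape' line
    return reduce(mul, shape, 1)

data_elems = blob_count([4, 3, 320, 768])    # 'data' blob: 2949120, as logged
res5a_elems = blob_count([4, 512, 10, 24])   # 'res5a_branch2a': 491520, as logged
```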
16:38:03.172055 13504 layer_factory.hpp:172] Creating layer 'res5a_branch2a/bn' of type 'BatchNorm'
I1106 16:38:03.172060 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.172070 13504 net.cpp:200] Created Layer res5a_branch2a/bn (33)
I1106 16:38:03.172075 13504 net.cpp:572] res5a_branch2a/bn <- res5a_branch2a
I1106 16:38:03.172078 13504 net.cpp:527] res5a_branch2a/bn -> res5a_branch2a (in-place)
I1106 16:38:03.172286 13504 net.cpp:260] Setting up res5a_branch2a/bn
I1106 16:38:03.172291 13504 net.cpp:267] TRAIN Top shape for layer 33 'res5a_branch2a/bn' 4 512 10 24 (491520)
I1106 16:38:03.172297 13504 layer_factory.hpp:172] Creating layer 'res5a_branch2a/relu' of type 'ReLU'
I1106 16:38:03.172299 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.172304 13504 net.cpp:200] Created Layer res5a_branch2a/relu (34)
I1106 16:38:03.172307 13504 net.cpp:572] res5a_branch2a/relu <- res5a_branch2a
I1106 16:38:03.172309 13504 net.cpp:527] res5a_branch2a/relu -> res5a_branch2a (in-place)
I1106 16:38:03.172312 13504 net.cpp:260] Setting up res5a_branch2a/relu
I1106 16:38:03.172315 13504 net.cpp:267] TRAIN Top shape for layer 34 'res5a_branch2a/relu' 4 512 10 24 (491520)
I1106 16:38:03.172317 13504 layer_factory.hpp:172] Creating layer 'res5a_branch2b' of type 'Convolution'
I1106 16:38:03.172319 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.172329 13504 net.cpp:200] Created Layer res5a_branch2b (35)
I1106 16:38:03.172333 13504 net.cpp:572] res5a_branch2b <- res5a_branch2a
I1106 16:38:03.172335 13504 net.cpp:542] res5a_branch2b -> res5a_branch2b
I1106 16:38:03.177065 13504 net.cpp:260] Setting up res5a_branch2b
I1106 16:38:03.177074 13504 net.cpp:267] TRAIN Top shape for layer 35 'res5a_branch2b' 4 512 10 24 (491520)
I1106 16:38:03.177083 13504 layer_factory.hpp:172] Creating layer 'res5a_branch2b/bn' of type 'BatchNorm'
I1106 16:38:03.177085 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.177093 13504 net.cpp:200] Created Layer res5a_branch2b/bn (36)
I1106 16:38:03.177094 13504 net.cpp:572] res5a_branch2b/bn <- res5a_branch2b
I1106 16:38:03.177098 13504 net.cpp:527] res5a_branch2b/bn -> res5a_branch2b (in-place)
I1106 16:38:03.177301 13504 net.cpp:260] Setting up res5a_branch2b/bn
I1106 16:38:03.177306 13504 net.cpp:267] TRAIN Top shape for layer 36 'res5a_branch2b/bn' 4 512 10 24 (491520)
I1106 16:38:03.177327 13504 layer_factory.hpp:172] Creating layer 'res5a_branch2b/relu' of type 'ReLU'
I1106 16:38:03.177330 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.177333 13504 net.cpp:200] Created Layer res5a_branch2b/relu (37)
I1106 16:38:03.177336 13504 net.cpp:572] res5a_branch2b/relu <- res5a_branch2b
I1106 16:38:03.177338 13504 net.cpp:527] res5a_branch2b/relu -> res5a_branch2b (in-place)
I1106 16:38:03.177343 13504 net.cpp:260] Setting up res5a_branch2b/relu
I1106 16:38:03.177347 13504 net.cpp:267] TRAIN Top shape for layer 37 'res5a_branch2b/relu' 4 512 10 24 (491520)
I1106 16:38:03.177350 13504 layer_factory.hpp:172] Creating layer 'res5a_branch2b_res5a_branch2b/relu_0_split' of type 'Split'
I1106 16:38:03.177352 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.177356 13504 net.cpp:200] Created Layer res5a_branch2b_res5a_branch2b/relu_0_split (38)
I1106 16:38:03.177358 13504 net.cpp:572] res5a_branch2b_res5a_branch2b/relu_0_split <- res5a_branch2b
I1106 16:38:03.177371 13504 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> res5a_branch2b_res5a_branch2b/relu_0_split_0
I1106 16:38:03.177376 13504 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> res5a_branch2b_res5a_branch2b/relu_0_split_1
I1106 16:38:03.177397 13504 net.cpp:260] Setting up res5a_branch2b_res5a_branch2b/relu_0_split
I1106 16:38:03.177402 13504 net.cpp:267] TRAIN Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 4 512 10 24 (491520)
I1106 16:38:03.177404 13504 net.cpp:267] TRAIN Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 4 512 10 24 (491520)
I1106 16:38:03.177407 13504 layer_factory.hpp:172] Creating layer 'pool6' of type 'Pooling'
I1106 16:38:03.177409 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.177415 13504 net.cpp:200] Created Layer pool6 (39)
I1106 16:38:03.177417 13504 net.cpp:572] pool6 <- res5a_branch2b_res5a_branch2b/relu_0_split_0
I1106 16:38:03.177420 13504 net.cpp:542] pool6 -> pool6
I1106 16:38:03.177450 13504 net.cpp:260] Setting up pool6
I1106 16:38:03.177454 13504 net.cpp:267] TRAIN Top shape for layer 39 'pool6' 4 512 5 12 (122880)
I1106 16:38:03.177458 13504 layer_factory.hpp:172] Creating layer 'pool6_pool6_0_split' of type 'Split'
I1106 16:38:03.177460 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.177464 13504 net.cpp:200] Created Layer pool6_pool6_0_split (40)
I1106 16:38:03.177466 13504 net.cpp:572] pool6_pool6_0_split <- pool6
I1106 16:38:03.177470 13504 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_0
I1106 16:38:03.177474 13504 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_1
I1106 16:38:03.177493 13504 net.cpp:260] Setting up pool6_pool6_0_split
I1106 16:38:03.177497 13504 net.cpp:267] TRAIN Top shape for layer 40 'pool6_pool6_0_split' 4 512 5 12 (122880)
I1106 16:38:03.177501 13504 net.cpp:267] TRAIN Top shape for layer 40 'pool6_pool6_0_split' 4 512 5 12 (122880)
I1106 16:38:03.177503 13504 layer_factory.hpp:172] Creating layer 'pool7' of type 'Pooling'
I1106 16:38:03.177505 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.177511 13504 net.cpp:200] Created Layer pool7 (41)
I1106 16:38:03.177515 13504 net.cpp:572] pool7 <- pool6_pool6_0_split_0
I1106 16:38:03.177516 13504 net.cpp:542] pool7 -> pool7
I1106 16:38:03.177541 13504 net.cpp:260] Setting up pool7
I1106 16:38:03.177546 13504 net.cpp:267] TRAIN Top shape for layer 41 'pool7' 4 512 3 6 (36864)
I1106 16:38:03.177548 13504 layer_factory.hpp:172] Creating layer 'pool7_pool7_0_split' of type 'Split'
I1106 16:38:03.177551 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.177554 13504 net.cpp:200] Created Layer pool7_pool7_0_split (42)
I1106 16:38:03.177556 13504 net.cpp:572] pool7_pool7_0_split <- pool7
I1106 16:38:03.177561 13504 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_0
I1106 16:38:03.177563 13504 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_1
I1106 16:38:03.177582 13504 net.cpp:260] Setting up pool7_pool7_0_split
I1106 16:38:03.177587 13504 net.cpp:267] TRAIN Top shape for layer 42 'pool7_pool7_0_split' 4 512 3 6 (36864)
I1106 16:38:03.177589 13504 net.cpp:267] TRAIN Top shape for layer 42 'pool7_pool7_0_split' 4 512 3 6 (36864)
I1106 16:38:03.177592 13504 layer_factory.hpp:172] Creating layer 'pool8' of type 'Pooling'
I1106 16:38:03.177593 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.177598 13504 net.cpp:200] Created Layer pool8 (43)
I1106 16:38:03.177603 13504 net.cpp:572] pool8 <- pool7_pool7_0_split_0
I1106 16:38:03.177605 13504 net.cpp:542] pool8 -> pool8
I1106 16:38:03.177634 13504 net.cpp:260] Setting up pool8
I1106 16:38:03.177637 13504 net.cpp:267] TRAIN Top shape for layer 43 'pool8' 4 512 2 3 (12288)
I1106 16:38:03.177640 13504 layer_factory.hpp:172] Creating layer 'ctx_output1' of type 'Convolution'
I1106 16:38:03.177649 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.177659 13504 net.cpp:200] Created Layer ctx_output1 (44)
I1106 16:38:03.177662
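The pooling pyramid logged here (pool6 5x12, pool7 3x6, pool8 2x3, for a 2x2 kernel with stride 2) follows from Caffe's ceil-rounded pooling output formula. A sketch, assuming that standard formula:

```python
import math

def pool_out(size, kernel=2, stride=2, pad=0):
    # Caffe pooling rounds the output size up:
    # ceil((size + 2*pad - kernel) / stride) + 1
    return int(math.ceil((size + 2 * pad - kernel) / stride)) + 1

# 5x12 -> 3x6 -> 2x3, matching the pool6/pool7/pool8 shapes above;
# odd inputs do not halve exactly because of the ceil rounding.
pool7_hw = (pool_out(5), pool_out(12))   # (3, 6)
pool8_hw = (pool_out(3), pool_out(6))    # (2, 3)
```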
13504 net.cpp:572] ctx_output1 <- res4a_branch2b_res4a_branch2b/relu_0_split_1
I1106 16:38:03.177666 13504 net.cpp:542] ctx_output1 -> ctx_output1
I1106 16:38:03.178278 13504 net.cpp:260] Setting up ctx_output1
I1106 16:38:03.178284 13504 net.cpp:267] TRAIN Top shape for layer 44 'ctx_output1' 4 256 20 48 (983040)
I1106 16:38:03.178289 13504 layer_factory.hpp:172] Creating layer 'ctx_output1/relu' of type 'ReLU'
I1106 16:38:03.178292 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.178297 13504 net.cpp:200] Created Layer ctx_output1/relu (45)
I1106 16:38:03.178299 13504 net.cpp:572] ctx_output1/relu <- ctx_output1
I1106 16:38:03.178303 13504 net.cpp:527] ctx_output1/relu -> ctx_output1 (in-place)
I1106 16:38:03.178306 13504 net.cpp:260] Setting up ctx_output1/relu
I1106 16:38:03.178309 13504 net.cpp:267] TRAIN Top shape for layer 45 'ctx_output1/relu' 4 256 20 48 (983040)
I1106 16:38:03.178313 13504 layer_factory.hpp:172] Creating layer 'ctx_output1_ctx_output1/relu_0_split' of type 'Split'
I1106 16:38:03.178314 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.178319 13504 net.cpp:200] Created Layer ctx_output1_ctx_output1/relu_0_split (46)
I1106 16:38:03.178321 13504 net.cpp:572] ctx_output1_ctx_output1/relu_0_split <- ctx_output1
I1106 16:38:03.178324 13504 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_0
I1106 16:38:03.178328 13504 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_1
I1106 16:38:03.178331 13504 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_2
I1106 16:38:03.178364 13504 net.cpp:260] Setting up ctx_output1_ctx_output1/relu_0_split
I1106 16:38:03.178369 13504 net.cpp:267] TRAIN Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 4 256 20 48 (983040)
I1106 16:38:03.178371 13504 net.cpp:267] TRAIN Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 4 256 20 48 (983040)
I1106 16:38:03.178375 13504 net.cpp:267] TRAIN Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 4 256 20 48 (983040)
I1106 16:38:03.178376 13504 layer_factory.hpp:172] Creating layer 'ctx_output2' of type 'Convolution'
I1106 16:38:03.178378 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.178387 13504 net.cpp:200] Created Layer ctx_output2 (47)
I1106 16:38:03.178391 13504 net.cpp:572] ctx_output2 <- res5a_branch2b_res5a_branch2b/relu_0_split_1
I1106 16:38:03.178393 13504 net.cpp:542] ctx_output2 -> ctx_output2
I1106 16:38:03.179564 13504 net.cpp:260] Setting up ctx_output2
I1106 16:38:03.179572 13504 net.cpp:267] TRAIN Top shape for layer 47 'ctx_output2' 4 256 10 24 (245760)
I1106 16:38:03.179579 13504 layer_factory.hpp:172] Creating layer 'ctx_output2/relu' of type 'ReLU'
I1106 16:38:03.179581 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.179585 13504 net.cpp:200] Created Layer ctx_output2/relu (48)
I1106 16:38:03.179589 13504 net.cpp:572] ctx_output2/relu <- ctx_output2
I1106 16:38:03.179591 13504 net.cpp:527] ctx_output2/relu -> ctx_output2 (in-place)
I1106 16:38:03.179596 13504 net.cpp:260] Setting up ctx_output2/relu
I1106 16:38:03.179600 13504 net.cpp:267] TRAIN Top shape for layer 48 'ctx_output2/relu' 4 256 10 24 (245760)
I1106 16:38:03.179601 13504 layer_factory.hpp:172] Creating layer 'ctx_output2_ctx_output2/relu_0_split' of type 'Split'
I1106 16:38:03.179605 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.179608 13504 net.cpp:200] Created Layer ctx_output2_ctx_output2/relu_0_split (49)
I1106 16:38:03.179610 13504 net.cpp:572] ctx_output2_ctx_output2/relu_0_split <- ctx_output2
I1106 16:38:03.179621 13504 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_0
I1106 16:38:03.179626 13504 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_1
I1106 16:38:03.179631 13504 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_2
I1106 16:38:03.179661 13504 net.cpp:260] Setting up ctx_output2_ctx_output2/relu_0_split
I1106 16:38:03.179666 13504 net.cpp:267] TRAIN Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 4 256 10 24 (245760)
I1106 16:38:03.179670 13504 net.cpp:267] TRAIN Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 4 256 10 24 (245760)
I1106 16:38:03.179673 13504 net.cpp:267] TRAIN Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 4 256 10 24 (245760)
I1106 16:38:03.179677 13504 layer_factory.hpp:172] Creating layer 'ctx_output3' of type 'Convolution'
I1106 16:38:03.179687 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.179699 13504 net.cpp:200] Created Layer ctx_output3 (50)
I1106 16:38:03.179702 13504 net.cpp:572] ctx_output3 <- pool6_pool6_0_split_1
I1106 16:38:03.179705 13504 net.cpp:542] ctx_output3 -> ctx_output3
I1106 16:38:03.181244 13504 net.cpp:260] Setting up ctx_output3
I1106 16:38:03.181252 13504 net.cpp:267] TRAIN Top shape for layer 50 'ctx_output3' 4 256 5 12 (61440)
I1106 16:38:03.181258 13504 layer_factory.hpp:172] Creating layer 'ctx_output3/relu' of type 'ReLU'
I1106 16:38:03.181262 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.181265 13504 net.cpp:200] Created Layer ctx_output3/relu (51)
I1106 16:38:03.181268 13504 net.cpp:572] ctx_output3/relu <- ctx_output3
I1106 16:38:03.181272 13504 net.cpp:527] ctx_output3/relu -> ctx_output3 (in-place)
I1106 16:38:03.181277 13504 net.cpp:260] Setting up ctx_output3/relu
I1106 16:38:03.181280 13504 net.cpp:267] TRAIN Top shape for layer 51 'ctx_output3/relu' 4 256 5 12 (61440)
I1106
16:38:03.181283 13504 layer_factory.hpp:172] Creating layer 'ctx_output3_ctx_output3/relu_0_split' of type 'Split' I1106 16:38:03.181285 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.181289 13504 net.cpp:200] Created Layer ctx_output3_ctx_output3/relu_0_split (52) I1106 16:38:03.181293 13504 net.cpp:572] ctx_output3_ctx_output3/relu_0_split <- ctx_output3 I1106 16:38:03.181295 13504 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_0 I1106 16:38:03.181300 13504 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_1 I1106 16:38:03.181304 13504 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_2 I1106 16:38:03.181336 13504 net.cpp:260] Setting up ctx_output3_ctx_output3/relu_0_split I1106 16:38:03.181341 13504 net.cpp:267] TRAIN Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 4 256 5 12 (61440) I1106 16:38:03.181344 13504 net.cpp:267] TRAIN Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 4 256 5 12 (61440) I1106 16:38:03.181347 13504 net.cpp:267] TRAIN Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 4 256 5 12 (61440) I1106 16:38:03.181350 13504 layer_factory.hpp:172] Creating layer 'ctx_output4' of type 'Convolution' I1106 16:38:03.181354 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.181362 13504 net.cpp:200] Created Layer ctx_output4 (53) I1106 16:38:03.181365 13504 net.cpp:572] ctx_output4 <- pool7_pool7_0_split_1 I1106 16:38:03.181367 13504 net.cpp:542] ctx_output4 -> ctx_output4 I1106 16:38:03.182395 13504 net.cpp:260] Setting up ctx_output4 I1106 16:38:03.182401 13504 net.cpp:267] TRAIN Top shape for layer 53 'ctx_output4' 4 256 3 6 (18432) I1106 16:38:03.182406 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu' of type 'ReLU' I1106 16:38:03.182417 13504 
layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.182422 13504 net.cpp:200] Created Layer ctx_output4/relu (54) I1106 16:38:03.182425 13504 net.cpp:572] ctx_output4/relu <- ctx_output4 I1106 16:38:03.182428 13504 net.cpp:527] ctx_output4/relu -> ctx_output4 (in-place) I1106 16:38:03.182433 13504 net.cpp:260] Setting up ctx_output4/relu I1106 16:38:03.182437 13504 net.cpp:267] TRAIN Top shape for layer 54 'ctx_output4/relu' 4 256 3 6 (18432) I1106 16:38:03.182440 13504 layer_factory.hpp:172] Creating layer 'ctx_output4_ctx_output4/relu_0_split' of type 'Split' I1106 16:38:03.182442 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.182446 13504 net.cpp:200] Created Layer ctx_output4_ctx_output4/relu_0_split (55) I1106 16:38:03.182448 13504 net.cpp:572] ctx_output4_ctx_output4/relu_0_split <- ctx_output4 I1106 16:38:03.182451 13504 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_0 I1106 16:38:03.182456 13504 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_1 I1106 16:38:03.182459 13504 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_2 I1106 16:38:03.182493 13504 net.cpp:260] Setting up ctx_output4_ctx_output4/relu_0_split I1106 16:38:03.182497 13504 net.cpp:267] TRAIN Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 4 256 3 6 (18432) I1106 16:38:03.182500 13504 net.cpp:267] TRAIN Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 4 256 3 6 (18432) I1106 16:38:03.182503 13504 net.cpp:267] TRAIN Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 4 256 3 6 (18432) I1106 16:38:03.182507 13504 layer_factory.hpp:172] Creating layer 'ctx_output5' of type 'Convolution' I1106 16:38:03.182509 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.182518 
13504 net.cpp:200] Created Layer ctx_output5 (56) I1106 16:38:03.182521 13504 net.cpp:572] ctx_output5 <- pool8 I1106 16:38:03.182525 13504 net.cpp:542] ctx_output5 -> ctx_output5 I1106 16:38:03.183558 13504 net.cpp:260] Setting up ctx_output5 I1106 16:38:03.183565 13504 net.cpp:267] TRAIN Top shape for layer 56 'ctx_output5' 4 256 2 3 (6144) I1106 16:38:03.183569 13504 layer_factory.hpp:172] Creating layer 'ctx_output5/relu' of type 'ReLU' I1106 16:38:03.183573 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.183575 13504 net.cpp:200] Created Layer ctx_output5/relu (57) I1106 16:38:03.183578 13504 net.cpp:572] ctx_output5/relu <- ctx_output5 I1106 16:38:03.183580 13504 net.cpp:527] ctx_output5/relu -> ctx_output5 (in-place) I1106 16:38:03.183585 13504 net.cpp:260] Setting up ctx_output5/relu I1106 16:38:03.183588 13504 net.cpp:267] TRAIN Top shape for layer 57 'ctx_output5/relu' 4 256 2 3 (6144) I1106 16:38:03.183590 13504 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc' of type 'Convolution' I1106 16:38:03.183593 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.183601 13504 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc (58) I1106 16:38:03.183605 13504 net.cpp:572] ctx_output1/relu_mbox_loc <- ctx_output1_ctx_output1/relu_0_split_0 I1106 16:38:03.183609 13504 net.cpp:542] ctx_output1/relu_mbox_loc -> ctx_output1/relu_mbox_loc I1106 16:38:03.183801 13504 net.cpp:260] Setting up ctx_output1/relu_mbox_loc I1106 16:38:03.183807 13504 net.cpp:267] TRAIN Top shape for layer 58 'ctx_output1/relu_mbox_loc' 4 16 20 48 (61440) I1106 16:38:03.183811 13504 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:03.183815 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.183823 13504 net.cpp:200] Created Layer 
ctx_output1/relu_mbox_loc_perm (59) I1106 16:38:03.183826 13504 net.cpp:572] ctx_output1/relu_mbox_loc_perm <- ctx_output1/relu_mbox_loc I1106 16:38:03.183851 13504 net.cpp:542] ctx_output1/relu_mbox_loc_perm -> ctx_output1/relu_mbox_loc_perm I1106 16:38:03.183929 13504 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_perm I1106 16:38:03.183933 13504 net.cpp:267] TRAIN Top shape for layer 59 'ctx_output1/relu_mbox_loc_perm' 4 20 48 16 (61440) I1106 16:38:03.183936 13504 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:03.183938 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.183944 13504 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_flat (60) I1106 16:38:03.183948 13504 net.cpp:572] ctx_output1/relu_mbox_loc_flat <- ctx_output1/relu_mbox_loc_perm I1106 16:38:03.183950 13504 net.cpp:542] ctx_output1/relu_mbox_loc_flat -> ctx_output1/relu_mbox_loc_flat I1106 16:38:03.184056 13504 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_flat I1106 16:38:03.184060 13504 net.cpp:267] TRAIN Top shape for layer 60 'ctx_output1/relu_mbox_loc_flat' 4 15360 (61440) I1106 16:38:03.184075 13504 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf' of type 'Convolution' I1106 16:38:03.184078 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.184088 13504 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf (61) I1106 16:38:03.184092 13504 net.cpp:572] ctx_output1/relu_mbox_conf <- ctx_output1_ctx_output1/relu_0_split_1 I1106 16:38:03.184095 13504 net.cpp:542] ctx_output1/relu_mbox_conf -> ctx_output1/relu_mbox_conf I1106 16:38:03.184249 13504 net.cpp:260] Setting up ctx_output1/relu_mbox_conf I1106 16:38:03.184255 13504 net.cpp:267] TRAIN Top shape for layer 61 'ctx_output1/relu_mbox_conf' 4 8 20 48 (30720) I1106 16:38:03.184259 13504 layer_factory.hpp:172] Creating layer 
'ctx_output1/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:03.184262 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.184269 13504 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_perm (62) I1106 16:38:03.184273 13504 net.cpp:572] ctx_output1/relu_mbox_conf_perm <- ctx_output1/relu_mbox_conf I1106 16:38:03.184275 13504 net.cpp:542] ctx_output1/relu_mbox_conf_perm -> ctx_output1/relu_mbox_conf_perm I1106 16:38:03.184330 13504 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_perm I1106 16:38:03.184335 13504 net.cpp:267] TRAIN Top shape for layer 62 'ctx_output1/relu_mbox_conf_perm' 4 20 48 8 (30720) I1106 16:38:03.184340 13504 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:03.184341 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.184345 13504 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_flat (63) I1106 16:38:03.184348 13504 net.cpp:572] ctx_output1/relu_mbox_conf_flat <- ctx_output1/relu_mbox_conf_perm I1106 16:38:03.184350 13504 net.cpp:542] ctx_output1/relu_mbox_conf_flat -> ctx_output1/relu_mbox_conf_flat I1106 16:38:03.184391 13504 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_flat I1106 16:38:03.184394 13504 net.cpp:267] TRAIN Top shape for layer 63 'ctx_output1/relu_mbox_conf_flat' 4 7680 (30720) I1106 16:38:03.184398 13504 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:03.184401 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.184412 13504 net.cpp:200] Created Layer ctx_output1/relu_mbox_priorbox (64) I1106 16:38:03.184415 13504 net.cpp:572] ctx_output1/relu_mbox_priorbox <- ctx_output1_ctx_output1/relu_0_split_2 I1106 16:38:03.184419 13504 net.cpp:572] ctx_output1/relu_mbox_priorbox <- data_data_0_split_1 I1106 16:38:03.184423 13504 net.cpp:542] 
ctx_output1/relu_mbox_priorbox -> ctx_output1/relu_mbox_priorbox I1106 16:38:03.184437 13504 net.cpp:260] Setting up ctx_output1/relu_mbox_priorbox I1106 16:38:03.184440 13504 net.cpp:267] TRAIN Top shape for layer 64 'ctx_output1/relu_mbox_priorbox' 1 2 15360 (30720) I1106 16:38:03.184450 13504 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc' of type 'Convolution' I1106 16:38:03.184453 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.184461 13504 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc (65) I1106 16:38:03.184464 13504 net.cpp:572] ctx_output2/relu_mbox_loc <- ctx_output2_ctx_output2/relu_0_split_0 I1106 16:38:03.184468 13504 net.cpp:542] ctx_output2/relu_mbox_loc -> ctx_output2/relu_mbox_loc I1106 16:38:03.184670 13504 net.cpp:260] Setting up ctx_output2/relu_mbox_loc I1106 16:38:03.184676 13504 net.cpp:267] TRAIN Top shape for layer 65 'ctx_output2/relu_mbox_loc' 4 24 10 24 (23040) I1106 16:38:03.184681 13504 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:03.184685 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.184692 13504 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_perm (66) I1106 16:38:03.184695 13504 net.cpp:572] ctx_output2/relu_mbox_loc_perm <- ctx_output2/relu_mbox_loc I1106 16:38:03.184698 13504 net.cpp:542] ctx_output2/relu_mbox_loc_perm -> ctx_output2/relu_mbox_loc_perm I1106 16:38:03.184759 13504 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_perm I1106 16:38:03.184764 13504 net.cpp:267] TRAIN Top shape for layer 66 'ctx_output2/relu_mbox_loc_perm' 4 10 24 24 (23040) I1106 16:38:03.184767 13504 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:03.184769 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.184774 13504 net.cpp:200] 
Created Layer ctx_output2/relu_mbox_loc_flat (67) I1106 16:38:03.184777 13504 net.cpp:572] ctx_output2/relu_mbox_loc_flat <- ctx_output2/relu_mbox_loc_perm I1106 16:38:03.184780 13504 net.cpp:542] ctx_output2/relu_mbox_loc_flat -> ctx_output2/relu_mbox_loc_flat I1106 16:38:03.184818 13504 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_flat I1106 16:38:03.184823 13504 net.cpp:267] TRAIN Top shape for layer 67 'ctx_output2/relu_mbox_loc_flat' 4 5760 (23040) I1106 16:38:03.184825 13504 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf' of type 'Convolution' I1106 16:38:03.184828 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.184839 13504 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf (68) I1106 16:38:03.184841 13504 net.cpp:572] ctx_output2/relu_mbox_conf <- ctx_output2_ctx_output2/relu_0_split_1 I1106 16:38:03.184845 13504 net.cpp:542] ctx_output2/relu_mbox_conf -> ctx_output2/relu_mbox_conf I1106 16:38:03.185014 13504 net.cpp:260] Setting up ctx_output2/relu_mbox_conf I1106 16:38:03.185020 13504 net.cpp:267] TRAIN Top shape for layer 68 'ctx_output2/relu_mbox_conf' 4 12 10 24 (11520) I1106 16:38:03.185024 13504 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:03.185029 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.185034 13504 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_perm (69) I1106 16:38:03.185037 13504 net.cpp:572] ctx_output2/relu_mbox_conf_perm <- ctx_output2/relu_mbox_conf I1106 16:38:03.185041 13504 net.cpp:542] ctx_output2/relu_mbox_conf_perm -> ctx_output2/relu_mbox_conf_perm I1106 16:38:03.185097 13504 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_perm I1106 16:38:03.185101 13504 net.cpp:267] TRAIN Top shape for layer 69 'ctx_output2/relu_mbox_conf_perm' 4 10 24 12 (11520) I1106 16:38:03.185104 13504 layer_factory.hpp:172] Creating layer 
'ctx_output2/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:03.185107 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.185111 13504 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_flat (70) I1106 16:38:03.185114 13504 net.cpp:572] ctx_output2/relu_mbox_conf_flat <- ctx_output2/relu_mbox_conf_perm I1106 16:38:03.185124 13504 net.cpp:542] ctx_output2/relu_mbox_conf_flat -> ctx_output2/relu_mbox_conf_flat I1106 16:38:03.185160 13504 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_flat I1106 16:38:03.185165 13504 net.cpp:267] TRAIN Top shape for layer 70 'ctx_output2/relu_mbox_conf_flat' 4 2880 (11520) I1106 16:38:03.185168 13504 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:03.185171 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.185176 13504 net.cpp:200] Created Layer ctx_output2/relu_mbox_priorbox (71) I1106 16:38:03.185179 13504 net.cpp:572] ctx_output2/relu_mbox_priorbox <- ctx_output2_ctx_output2/relu_0_split_2 I1106 16:38:03.185184 13504 net.cpp:572] ctx_output2/relu_mbox_priorbox <- data_data_0_split_2 I1106 16:38:03.185186 13504 net.cpp:542] ctx_output2/relu_mbox_priorbox -> ctx_output2/relu_mbox_priorbox I1106 16:38:03.185199 13504 net.cpp:260] Setting up ctx_output2/relu_mbox_priorbox I1106 16:38:03.185204 13504 net.cpp:267] TRAIN Top shape for layer 71 'ctx_output2/relu_mbox_priorbox' 1 2 5760 (11520) I1106 16:38:03.185206 13504 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc' of type 'Convolution' I1106 16:38:03.185209 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.185217 13504 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc (72) I1106 16:38:03.185221 13504 net.cpp:572] ctx_output3/relu_mbox_loc <- ctx_output3_ctx_output3/relu_0_split_0 I1106 16:38:03.185225 13504 net.cpp:542] 
ctx_output3/relu_mbox_loc -> ctx_output3/relu_mbox_loc I1106 16:38:03.185416 13504 net.cpp:260] Setting up ctx_output3/relu_mbox_loc I1106 16:38:03.185422 13504 net.cpp:267] TRAIN Top shape for layer 72 'ctx_output3/relu_mbox_loc' 4 24 5 12 (5760) I1106 16:38:03.185427 13504 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:03.185431 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.185437 13504 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_perm (73) I1106 16:38:03.185441 13504 net.cpp:572] ctx_output3/relu_mbox_loc_perm <- ctx_output3/relu_mbox_loc I1106 16:38:03.185444 13504 net.cpp:542] ctx_output3/relu_mbox_loc_perm -> ctx_output3/relu_mbox_loc_perm I1106 16:38:03.185501 13504 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_perm I1106 16:38:03.185505 13504 net.cpp:267] TRAIN Top shape for layer 73 'ctx_output3/relu_mbox_loc_perm' 4 5 12 24 (5760) I1106 16:38:03.185508 13504 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:03.185513 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.185518 13504 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_flat (74) I1106 16:38:03.185520 13504 net.cpp:572] ctx_output3/relu_mbox_loc_flat <- ctx_output3/relu_mbox_loc_perm I1106 16:38:03.185523 13504 net.cpp:542] ctx_output3/relu_mbox_loc_flat -> ctx_output3/relu_mbox_loc_flat I1106 16:38:03.185556 13504 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_flat I1106 16:38:03.185560 13504 net.cpp:267] TRAIN Top shape for layer 74 'ctx_output3/relu_mbox_loc_flat' 4 1440 (5760) I1106 16:38:03.185564 13504 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf' of type 'Convolution' I1106 16:38:03.185566 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.185576 13504 net.cpp:200] 
Created Layer ctx_output3/relu_mbox_conf (75) I1106 16:38:03.185580 13504 net.cpp:572] ctx_output3/relu_mbox_conf <- ctx_output3_ctx_output3/relu_0_split_1 I1106 16:38:03.185583 13504 net.cpp:542] ctx_output3/relu_mbox_conf -> ctx_output3/relu_mbox_conf I1106 16:38:03.185741 13504 net.cpp:260] Setting up ctx_output3/relu_mbox_conf I1106 16:38:03.185747 13504 net.cpp:267] TRAIN Top shape for layer 75 'ctx_output3/relu_mbox_conf' 4 12 5 12 (2880) I1106 16:38:03.185758 13504 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:03.185761 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.185767 13504 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_perm (76) I1106 16:38:03.185771 13504 net.cpp:572] ctx_output3/relu_mbox_conf_perm <- ctx_output3/relu_mbox_conf I1106 16:38:03.185775 13504 net.cpp:542] ctx_output3/relu_mbox_conf_perm -> ctx_output3/relu_mbox_conf_perm I1106 16:38:03.185829 13504 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_perm I1106 16:38:03.185834 13504 net.cpp:267] TRAIN Top shape for layer 76 'ctx_output3/relu_mbox_conf_perm' 4 5 12 12 (2880) I1106 16:38:03.185837 13504 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:03.185840 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.185844 13504 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_flat (77) I1106 16:38:03.185847 13504 net.cpp:572] ctx_output3/relu_mbox_conf_flat <- ctx_output3/relu_mbox_conf_perm I1106 16:38:03.185851 13504 net.cpp:542] ctx_output3/relu_mbox_conf_flat -> ctx_output3/relu_mbox_conf_flat I1106 16:38:03.185889 13504 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_flat I1106 16:38:03.185894 13504 net.cpp:267] TRAIN Top shape for layer 77 'ctx_output3/relu_mbox_conf_flat' 4 720 (2880) I1106 16:38:03.185896 13504 layer_factory.hpp:172] Creating 
layer 'ctx_output3/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:03.185899 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.185904 13504 net.cpp:200] Created Layer ctx_output3/relu_mbox_priorbox (78) I1106 16:38:03.185906 13504 net.cpp:572] ctx_output3/relu_mbox_priorbox <- ctx_output3_ctx_output3/relu_0_split_2 I1106 16:38:03.185910 13504 net.cpp:572] ctx_output3/relu_mbox_priorbox <- data_data_0_split_3 I1106 16:38:03.185914 13504 net.cpp:542] ctx_output3/relu_mbox_priorbox -> ctx_output3/relu_mbox_priorbox I1106 16:38:03.185928 13504 net.cpp:260] Setting up ctx_output3/relu_mbox_priorbox I1106 16:38:03.185932 13504 net.cpp:267] TRAIN Top shape for layer 78 'ctx_output3/relu_mbox_priorbox' 1 2 1440 (2880) I1106 16:38:03.185935 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc' of type 'Convolution' I1106 16:38:03.185938 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.185947 13504 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc (79) I1106 16:38:03.185950 13504 net.cpp:572] ctx_output4/relu_mbox_loc <- ctx_output4_ctx_output4/relu_0_split_0 I1106 16:38:03.185955 13504 net.cpp:542] ctx_output4/relu_mbox_loc -> ctx_output4/relu_mbox_loc I1106 16:38:03.186121 13504 net.cpp:260] Setting up ctx_output4/relu_mbox_loc I1106 16:38:03.186127 13504 net.cpp:267] TRAIN Top shape for layer 79 'ctx_output4/relu_mbox_loc' 4 16 3 6 (1152) I1106 16:38:03.186132 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:03.186136 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.186141 13504 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_perm (80) I1106 16:38:03.186144 13504 net.cpp:572] ctx_output4/relu_mbox_loc_perm <- ctx_output4/relu_mbox_loc I1106 16:38:03.186148 13504 net.cpp:542] ctx_output4/relu_mbox_loc_perm -> 
ctx_output4/relu_mbox_loc_perm I1106 16:38:03.186204 13504 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_perm I1106 16:38:03.186209 13504 net.cpp:267] TRAIN Top shape for layer 80 'ctx_output4/relu_mbox_loc_perm' 4 3 6 16 (1152) I1106 16:38:03.186213 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:03.186215 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.186219 13504 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_flat (81) I1106 16:38:03.186228 13504 net.cpp:572] ctx_output4/relu_mbox_loc_flat <- ctx_output4/relu_mbox_loc_perm I1106 16:38:03.186233 13504 net.cpp:542] ctx_output4/relu_mbox_loc_flat -> ctx_output4/relu_mbox_loc_flat I1106 16:38:03.186266 13504 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_flat I1106 16:38:03.186271 13504 net.cpp:267] TRAIN Top shape for layer 81 'ctx_output4/relu_mbox_loc_flat' 4 288 (1152) I1106 16:38:03.186275 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf' of type 'Convolution' I1106 16:38:03.186277 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.186286 13504 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf (82) I1106 16:38:03.186290 13504 net.cpp:572] ctx_output4/relu_mbox_conf <- ctx_output4_ctx_output4/relu_0_split_1 I1106 16:38:03.186293 13504 net.cpp:542] ctx_output4/relu_mbox_conf -> ctx_output4/relu_mbox_conf I1106 16:38:03.186442 13504 net.cpp:260] Setting up ctx_output4/relu_mbox_conf I1106 16:38:03.186447 13504 net.cpp:267] TRAIN Top shape for layer 82 'ctx_output4/relu_mbox_conf' 4 8 3 6 (576) I1106 16:38:03.186452 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:03.186455 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.186462 13504 net.cpp:200] Created Layer 
ctx_output4/relu_mbox_conf_perm (83) I1106 16:38:03.186465 13504 net.cpp:572] ctx_output4/relu_mbox_conf_perm <- ctx_output4/relu_mbox_conf I1106 16:38:03.186470 13504 net.cpp:542] ctx_output4/relu_mbox_conf_perm -> ctx_output4/relu_mbox_conf_perm I1106 16:38:03.186524 13504 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_perm I1106 16:38:03.186529 13504 net.cpp:267] TRAIN Top shape for layer 83 'ctx_output4/relu_mbox_conf_perm' 4 3 6 8 (576) I1106 16:38:03.186532 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:03.186534 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.186539 13504 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf_flat (84) I1106 16:38:03.186542 13504 net.cpp:572] ctx_output4/relu_mbox_conf_flat <- ctx_output4/relu_mbox_conf_perm I1106 16:38:03.186547 13504 net.cpp:542] ctx_output4/relu_mbox_conf_flat -> ctx_output4/relu_mbox_conf_flat I1106 16:38:03.186583 13504 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_flat I1106 16:38:03.186587 13504 net.cpp:267] TRAIN Top shape for layer 84 'ctx_output4/relu_mbox_conf_flat' 4 144 (576) I1106 16:38:03.186590 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:03.186594 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.186597 13504 net.cpp:200] Created Layer ctx_output4/relu_mbox_priorbox (85) I1106 16:38:03.186601 13504 net.cpp:572] ctx_output4/relu_mbox_priorbox <- ctx_output4_ctx_output4/relu_0_split_2 I1106 16:38:03.186604 13504 net.cpp:572] ctx_output4/relu_mbox_priorbox <- data_data_0_split_4 I1106 16:38:03.186607 13504 net.cpp:542] ctx_output4/relu_mbox_priorbox -> ctx_output4/relu_mbox_priorbox I1106 16:38:03.186621 13504 net.cpp:260] Setting up ctx_output4/relu_mbox_priorbox I1106 16:38:03.186626 13504 net.cpp:267] TRAIN Top shape for layer 85 
'ctx_output4/relu_mbox_priorbox' 1 2 288 (576) I1106 16:38:03.186630 13504 layer_factory.hpp:172] Creating layer 'mbox_loc' of type 'Concat' I1106 16:38:03.186631 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.186637 13504 net.cpp:200] Created Layer mbox_loc (86) I1106 16:38:03.186641 13504 net.cpp:572] mbox_loc <- ctx_output1/relu_mbox_loc_flat I1106 16:38:03.186645 13504 net.cpp:572] mbox_loc <- ctx_output2/relu_mbox_loc_flat I1106 16:38:03.186647 13504 net.cpp:572] mbox_loc <- ctx_output3/relu_mbox_loc_flat I1106 16:38:03.186650 13504 net.cpp:572] mbox_loc <- ctx_output4/relu_mbox_loc_flat I1106 16:38:03.186661 13504 net.cpp:542] mbox_loc -> mbox_loc I1106 16:38:03.186677 13504 net.cpp:260] Setting up mbox_loc I1106 16:38:03.186681 13504 net.cpp:267] TRAIN Top shape for layer 86 'mbox_loc' 4 22848 (91392) I1106 16:38:03.186684 13504 layer_factory.hpp:172] Creating layer 'mbox_conf' of type 'Concat' I1106 16:38:03.186687 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.186691 13504 net.cpp:200] Created Layer mbox_conf (87) I1106 16:38:03.186693 13504 net.cpp:572] mbox_conf <- ctx_output1/relu_mbox_conf_flat I1106 16:38:03.186697 13504 net.cpp:572] mbox_conf <- ctx_output2/relu_mbox_conf_flat I1106 16:38:03.186700 13504 net.cpp:572] mbox_conf <- ctx_output3/relu_mbox_conf_flat I1106 16:38:03.186703 13504 net.cpp:572] mbox_conf <- ctx_output4/relu_mbox_conf_flat I1106 16:38:03.186707 13504 net.cpp:542] mbox_conf -> mbox_conf I1106 16:38:03.186720 13504 net.cpp:260] Setting up mbox_conf I1106 16:38:03.186724 13504 net.cpp:267] TRAIN Top shape for layer 87 'mbox_conf' 4 11424 (45696) I1106 16:38:03.186728 13504 layer_factory.hpp:172] Creating layer 'mbox_priorbox' of type 'Concat' I1106 16:38:03.186730 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.186733 13504 net.cpp:200] Created 
Layer mbox_priorbox (88) I1106 16:38:03.186738 13504 net.cpp:572] mbox_priorbox <- ctx_output1/relu_mbox_priorbox I1106 16:38:03.186740 13504 net.cpp:572] mbox_priorbox <- ctx_output2/relu_mbox_priorbox I1106 16:38:03.186743 13504 net.cpp:572] mbox_priorbox <- ctx_output3/relu_mbox_priorbox I1106 16:38:03.186746 13504 net.cpp:572] mbox_priorbox <- ctx_output4/relu_mbox_priorbox I1106 16:38:03.186749 13504 net.cpp:542] mbox_priorbox -> mbox_priorbox I1106 16:38:03.186764 13504 net.cpp:260] Setting up mbox_priorbox I1106 16:38:03.186769 13504 net.cpp:267] TRAIN Top shape for layer 88 'mbox_priorbox' 1 2 22848 (45696) I1106 16:38:03.186771 13504 layer_factory.hpp:172] Creating layer 'mbox_loss' of type 'MultiBoxLoss' I1106 16:38:03.186774 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.186782 13504 net.cpp:200] Created Layer mbox_loss (89) I1106 16:38:03.186785 13504 net.cpp:572] mbox_loss <- mbox_loc I1106 16:38:03.186789 13504 net.cpp:572] mbox_loss <- mbox_conf I1106 16:38:03.186792 13504 net.cpp:572] mbox_loss <- mbox_priorbox I1106 16:38:03.186795 13504 net.cpp:572] mbox_loss <- label I1106 16:38:03.186797 13504 net.cpp:542] mbox_loss -> mbox_loss I1106 16:38:03.186834 13504 layer_factory.hpp:172] Creating layer 'mbox_loss_smooth_L1_loc' of type 'SmoothL1Loss' I1106 16:38:03.186838 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.186895 13504 layer_factory.hpp:172] Creating layer 'mbox_loss_softmax_conf' of type 'SoftmaxWithLoss' I1106 16:38:03.186899 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.186962 13504 net.cpp:260] Setting up mbox_loss I1106 16:38:03.186967 13504 net.cpp:267] TRAIN Top shape for layer 89 'mbox_loss' (1) I1106 16:38:03.186969 13504 net.cpp:271] with loss weight 1 I1106 16:38:03.186982 13504 net.cpp:336] mbox_loss needs backward computation. 
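The mbox_loc (4 22848) and mbox_conf (4 11424) concat shapes logged above can be reproduced from the per-head feature-map sizes and prior counts. A minimal sketch, assuming the standard SSD prior counting (one box at min_size, one at sqrt(min_size*max_size), plus two boxes per extra aspect ratio) with the 2 classes and 4 box coordinates used in this run:

```python
# Sketch: verify the head output shapes reported in the log above.
# Assumption: standard SSD prior counting; 2 classes, 4 coords per box.
NUM_CLASSES = 2
COORDS = 4

def num_priors(extra_ars):
    # min box + sqrt(min*max) box, then ar and 1/ar for each extra ratio
    return 2 + 2 * len(extra_ars)

# (feature-map H, W, extra aspect ratios) for the four heads kept after chopping
heads = [(20, 48, [2]), (10, 24, [2, 3]), (5, 12, [2, 3]), (3, 6, [2])]

loc_total = conf_total = 0
for h, w, ars in heads:
    p = num_priors(ars)
    loc_total += h * w * p * COORDS        # e.g. 20*48*4*4 = 15360 for ctx_output1
    conf_total += h * w * p * NUM_CLASSES  # e.g. 20*48*4*2 = 7680

print(loc_total, conf_total)  # 22848 11424, matching mbox_loc / mbox_conf
```

The per-head loc/conf convolution channel counts in the log (16/8 for ctx_output1 and ctx_output4, 24/12 for ctx_output2 and ctx_output3) are `priors * 4` and `priors * 2` under the same counting.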
I1106 16:38:03.186986 13504 net.cpp:338] mbox_priorbox does not need backward computation.
I1106 16:38:03.186990 13504 net.cpp:336] mbox_conf needs backward computation.
I1106 16:38:03.186995 13504 net.cpp:336] mbox_loc needs backward computation.
I1106 16:38:03.186997 13504 net.cpp:338] ctx_output4/relu_mbox_priorbox does not need backward computation.
I1106 16:38:03.187000 13504 net.cpp:336] ctx_output4/relu_mbox_conf_flat needs backward computation.
I1106 16:38:03.187003 13504 net.cpp:336] ctx_output4/relu_mbox_conf_perm needs backward computation.
I1106 16:38:03.187005 13504 net.cpp:336] ctx_output4/relu_mbox_conf needs backward computation.
I1106 16:38:03.187007 13504 net.cpp:336] ctx_output4/relu_mbox_loc_flat needs backward computation.
I1106 16:38:03.187016 13504 net.cpp:336] ctx_output4/relu_mbox_loc_perm needs backward computation.
I1106 16:38:03.187019 13504 net.cpp:336] ctx_output4/relu_mbox_loc needs backward computation.
I1106 16:38:03.187023 13504 net.cpp:338] ctx_output3/relu_mbox_priorbox does not need backward computation.
I1106 16:38:03.187026 13504 net.cpp:336] ctx_output3/relu_mbox_conf_flat needs backward computation.
I1106 16:38:03.187029 13504 net.cpp:336] ctx_output3/relu_mbox_conf_perm needs backward computation.
I1106 16:38:03.187032 13504 net.cpp:336] ctx_output3/relu_mbox_conf needs backward computation.
I1106 16:38:03.187034 13504 net.cpp:336] ctx_output3/relu_mbox_loc_flat needs backward computation.
I1106 16:38:03.187036 13504 net.cpp:336] ctx_output3/relu_mbox_loc_perm needs backward computation.
I1106 16:38:03.187038 13504 net.cpp:336] ctx_output3/relu_mbox_loc needs backward computation.
I1106 16:38:03.187041 13504 net.cpp:338] ctx_output2/relu_mbox_priorbox does not need backward computation.
I1106 16:38:03.187044 13504 net.cpp:336] ctx_output2/relu_mbox_conf_flat needs backward computation.
I1106 16:38:03.187047 13504 net.cpp:336] ctx_output2/relu_mbox_conf_perm needs backward computation.
I1106 16:38:03.187048 13504 net.cpp:336] ctx_output2/relu_mbox_conf needs backward computation.
I1106 16:38:03.187052 13504 net.cpp:336] ctx_output2/relu_mbox_loc_flat needs backward computation.
I1106 16:38:03.187054 13504 net.cpp:336] ctx_output2/relu_mbox_loc_perm needs backward computation.
I1106 16:38:03.187057 13504 net.cpp:336] ctx_output2/relu_mbox_loc needs backward computation.
I1106 16:38:03.187058 13504 net.cpp:338] ctx_output1/relu_mbox_priorbox does not need backward computation.
I1106 16:38:03.187062 13504 net.cpp:336] ctx_output1/relu_mbox_conf_flat needs backward computation.
I1106 16:38:03.187064 13504 net.cpp:336] ctx_output1/relu_mbox_conf_perm needs backward computation.
I1106 16:38:03.187067 13504 net.cpp:336] ctx_output1/relu_mbox_conf needs backward computation.
I1106 16:38:03.187069 13504 net.cpp:336] ctx_output1/relu_mbox_loc_flat needs backward computation.
I1106 16:38:03.187072 13504 net.cpp:336] ctx_output1/relu_mbox_loc_perm needs backward computation.
I1106 16:38:03.187074 13504 net.cpp:336] ctx_output1/relu_mbox_loc needs backward computation.
I1106 16:38:03.187077 13504 net.cpp:338] ctx_output5/relu does not need backward computation.
I1106 16:38:03.187078 13504 net.cpp:338] ctx_output5 does not need backward computation.
I1106 16:38:03.187081 13504 net.cpp:336] ctx_output4_ctx_output4/relu_0_split needs backward computation.
I1106 16:38:03.187085 13504 net.cpp:336] ctx_output4/relu needs backward computation.
I1106 16:38:03.187088 13504 net.cpp:336] ctx_output4 needs backward computation.
I1106 16:38:03.187090 13504 net.cpp:336] ctx_output3_ctx_output3/relu_0_split needs backward computation.
I1106 16:38:03.187093 13504 net.cpp:336] ctx_output3/relu needs backward computation.
I1106 16:38:03.187095 13504 net.cpp:336] ctx_output3 needs backward computation.
I1106 16:38:03.187098 13504 net.cpp:336] ctx_output2_ctx_output2/relu_0_split needs backward computation.
I1106 16:38:03.187100 13504 net.cpp:336] ctx_output2/relu needs backward computation.
I1106 16:38:03.187103 13504 net.cpp:336] ctx_output2 needs backward computation.
I1106 16:38:03.187105 13504 net.cpp:336] ctx_output1_ctx_output1/relu_0_split needs backward computation.
I1106 16:38:03.187108 13504 net.cpp:336] ctx_output1/relu needs backward computation.
I1106 16:38:03.187109 13504 net.cpp:336] ctx_output1 needs backward computation.
I1106 16:38:03.187114 13504 net.cpp:338] pool8 does not need backward computation.
I1106 16:38:03.187116 13504 net.cpp:336] pool7_pool7_0_split needs backward computation.
I1106 16:38:03.187119 13504 net.cpp:336] pool7 needs backward computation.
I1106 16:38:03.187122 13504 net.cpp:336] pool6_pool6_0_split needs backward computation.
I1106 16:38:03.187125 13504 net.cpp:336] pool6 needs backward computation.
I1106 16:38:03.187127 13504 net.cpp:336] res5a_branch2b_res5a_branch2b/relu_0_split needs backward computation.
I1106 16:38:03.187134 13504 net.cpp:336] res5a_branch2b/relu needs backward computation.
I1106 16:38:03.187137 13504 net.cpp:336] res5a_branch2b/bn needs backward computation.
I1106 16:38:03.187140 13504 net.cpp:336] res5a_branch2b needs backward computation.
I1106 16:38:03.187142 13504 net.cpp:336] res5a_branch2a/relu needs backward computation.
I1106 16:38:03.187145 13504 net.cpp:336] res5a_branch2a/bn needs backward computation.
I1106 16:38:03.187147 13504 net.cpp:336] res5a_branch2a needs backward computation.
I1106 16:38:03.187150 13504 net.cpp:336] pool4 needs backward computation.
I1106 16:38:03.187153 13504 net.cpp:336] res4a_branch2b_res4a_branch2b/relu_0_split needs backward computation.
I1106 16:38:03.187156 13504 net.cpp:336] res4a_branch2b/relu needs backward computation.
I1106 16:38:03.187157 13504 net.cpp:336] res4a_branch2b/bn needs backward computation.
I1106 16:38:03.187160 13504 net.cpp:336] res4a_branch2b needs backward computation.
I1106 16:38:03.187162 13504 net.cpp:336] res4a_branch2a/relu needs backward computation.
I1106 16:38:03.187165 13504 net.cpp:336] res4a_branch2a/bn needs backward computation.
I1106 16:38:03.187168 13504 net.cpp:336] res4a_branch2a needs backward computation.
I1106 16:38:03.187170 13504 net.cpp:336] pool3 needs backward computation.
I1106 16:38:03.187173 13504 net.cpp:336] res3a_branch2b/relu needs backward computation.
I1106 16:38:03.187175 13504 net.cpp:336] res3a_branch2b/bn needs backward computation.
I1106 16:38:03.187177 13504 net.cpp:336] res3a_branch2b needs backward computation.
I1106 16:38:03.187180 13504 net.cpp:336] res3a_branch2a/relu needs backward computation.
I1106 16:38:03.187182 13504 net.cpp:336] res3a_branch2a/bn needs backward computation.
I1106 16:38:03.187184 13504 net.cpp:336] res3a_branch2a needs backward computation.
I1106 16:38:03.187186 13504 net.cpp:336] pool2 needs backward computation.
I1106 16:38:03.187189 13504 net.cpp:336] res2a_branch2b/relu needs backward computation.
I1106 16:38:03.187191 13504 net.cpp:336] res2a_branch2b/bn needs backward computation.
I1106 16:38:03.187193 13504 net.cpp:336] res2a_branch2b needs backward computation.
I1106 16:38:03.187196 13504 net.cpp:336] res2a_branch2a/relu needs backward computation.
I1106 16:38:03.187199 13504 net.cpp:336] res2a_branch2a/bn needs backward computation.
I1106 16:38:03.187201 13504 net.cpp:336] res2a_branch2a needs backward computation.
I1106 16:38:03.187204 13504 net.cpp:336] pool1 needs backward computation.
I1106 16:38:03.187206 13504 net.cpp:336] conv1b/relu needs backward computation.
I1106 16:38:03.187209 13504 net.cpp:336] conv1b/bn needs backward computation.
I1106 16:38:03.187211 13504 net.cpp:336] conv1b needs backward computation.
I1106 16:38:03.187213 13504 net.cpp:336] conv1a/relu needs backward computation.
I1106 16:38:03.187216 13504 net.cpp:336] conv1a/bn needs backward computation.
I1106 16:38:03.187217 13504 net.cpp:336] conv1a needs backward computation.
I1106 16:38:03.187220 13504 net.cpp:338] data/bias does not need backward computation.
I1106 16:38:03.187223 13504 net.cpp:338] data_data_0_split does not need backward computation.
I1106 16:38:03.187227 13504 net.cpp:338] data does not need backward computation.
I1106 16:38:03.187229 13504 net.cpp:380] This network produces output ctx_output5
I1106 16:38:03.187232 13504 net.cpp:380] This network produces output mbox_loss
I1106 16:38:03.187289 13504 net.cpp:403] Top memory (TRAIN) required for data: 505556136 diff: 505556136
I1106 16:38:03.187292 13504 net.cpp:406] Bottom memory (TRAIN) required for data: 505531552 diff: 505531552
I1106 16:38:03.187295 13504 net.cpp:409] Shared (in-place) memory (TRAIN) by data: 249053184 diff: 249053184
I1106 16:38:03.187297 13504 net.cpp:412] Parameters memory (TRAIN) required for data: 11946688 diff: 11946688
I1106 16:38:03.187299 13504 net.cpp:415] Parameters shared memory (TRAIN) by data: 0 diff: 0
I1106 16:38:03.187301 13504 net.cpp:421] Network initialization done.
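The concatenated head widths reported above (mbox_loc 4 22848, mbox_conf 4 11424, mbox_priorbox 1 2 22848) can be reproduced from the head geometry. A quick cross-check, assuming the grid sizes implied by the 768x320 input and the stride of each source blob; the per-cell prior count follows the usual SSD rule (one min_size box, one sqrt(min*max) box, and two boxes per aspect ratio because flip: true):

```python
# Heads remaining after "Chopping heads" (see the setup log above):
# (grid_w, grid_h, aspect_ratios) -- grid sizes inferred from the
# 768x320 input and each source blob's stride (an assumption here).
heads = [
    (48, 20, [2]),     # ctx_output1 <- res4a_branch2b (stride 16)
    (24, 10, [2, 3]),  # ctx_output2 <- res5a_branch2b (stride 32)
    (12, 5,  [2, 3]),  # ctx_output3 <- pool6 (stride 64)
    (6,  3,  [2]),     # ctx_output4 <- pool7 (stride 128)
]
num_classes = 2

total_priors = 0
for w, h, ars in heads:
    # 1 min_size box + 1 sqrt(min*max) box + 2 * len(ars) flipped-AR boxes
    priors_per_cell = 2 + 2 * len(ars)
    total_priors += w * h * priors_per_cell

loc_width = total_priors * 4             # 4 box offsets per prior
conf_width = total_priors * num_classes  # one score per class per prior

print(total_priors, loc_width, conf_width)  # 5712 22848 11424
```

The same arithmetic explains the per-head conv widths: ctx_output1/relu_mbox_loc has num_output 16 because that head uses 4 priors per cell (4 priors x 4 offsets), while ctx_output2 uses 6 priors per cell, hence 24.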
I1106 16:38:03.187847 13504 solver.cpp:175] Creating test net (#0) specified by test_net file: training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/initial/test.prototxt
I1106 16:38:03.188271 13504 net.cpp:80] Initializing net from parameters:
name: "ssdJacintoNetV2_test"
state { phase: TEST }
layer { name: "data" type: "AnnotatedData" top: "data" top: "label" include { phase: TEST } transform_param { mean_value: 0 mean_value: 0 mean_value: 0 force_color: false resize_param { prob: 1 resize_mode: WARP height: 320 width: 768 interp_mode: LINEAR } crop_h: 320 crop_w: 768 } data_param { source: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb" batch_size: 8 backend: LMDB threads: 4 parser_threads: 4 } annotated_data_param { batch_sampler { } label_map_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt" } }
layer { name: "data/bias" type: "Bias" bottom: "data" top: "data/bias" param { lr_mult: 0 decay_mult: 0 } bias_param { filler { type: "constant" value: -128 } } }
layer { name: "conv1a" type: "Convolution" bottom: "data/bias" top: "conv1a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 2 kernel_size: 5 group: 1 stride: 2 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "conv1a/bn" type: "BatchNorm" bottom: "conv1a" top: "conv1a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "conv1a/relu" type: "ReLU" bottom: "conv1a" top: "conv1a" }
layer { name: "conv1b" type: "Convolution" bottom: "conv1a" top: "conv1b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "conv1b/bn" type: "BatchNorm" bottom: "conv1b" top: "conv1b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "conv1b/relu" type: "ReLU" bottom: "conv1b" top: "conv1b" }
layer { name: "pool1" type: "Pooling" bottom: "conv1b" top: "pool1" pooling_param { pool: MAX kernel_size: 2 stride: 2 } }
layer { name: "res2a_branch2a" type: "Convolution" bottom: "pool1" top: "res2a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res2a_branch2a/bn" type: "BatchNorm" bottom: "res2a_branch2a" top: "res2a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res2a_branch2a/relu" type: "ReLU" bottom: "res2a_branch2a" top: "res2a_branch2a" }
layer { name: "res2a_branch2b" type: "Convolution" bottom: "res2a_branch2a" top: "res2a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res2a_branch2b/bn" type: "BatchNorm" bottom: "res2a_branch2b" top: "res2a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res2a_branch2b/relu" type: "ReLU" bottom: "res2a_branch2b" top: "res2a_branch2b" }
layer { name: "pool2" type: "Pooling" bottom: "res2a_branch2b" top: "pool2" pooling_param { pool: MAX kernel_size: 2 stride: 2 } }
layer { name: "res3a_branch2a" type: "Convolution" bottom: "pool2" top: "res3a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res3a_branch2a/bn" type: "BatchNorm" bottom: "res3a_branch2a" top: "res3a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res3a_branch2a/relu" type: "ReLU" bottom: "res3a_branch2a" top: "res3a_branch2a" }
layer { name: "res3a_branch2b" type: "Convolution" bottom: "res3a_branch2a" top: "res3a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res3a_branch2b/bn" type: "BatchNorm" bottom: "res3a_branch2b" top: "res3a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res3a_branch2b/relu" type: "ReLU" bottom: "res3a_branch2b" top: "res3a_branch2b" }
layer { name: "pool3" type: "Pooling" bottom: "res3a_branch2b" top: "pool3" pooling_param { pool: MAX kernel_size: 2 stride: 2 } }
layer { name: "res4a_branch2a" type: "Convolution" bottom: "pool3" top: "res4a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res4a_branch2a/bn" type: "BatchNorm" bottom: "res4a_branch2a" top: "res4a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res4a_branch2a/relu" type: "ReLU" bottom: "res4a_branch2a" top: "res4a_branch2a" }
layer { name: "res4a_branch2b" type: "Convolution" bottom: "res4a_branch2a" top: "res4a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res4a_branch2b/bn" type: "BatchNorm" bottom: "res4a_branch2b" top: "res4a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res4a_branch2b/relu" type: "ReLU" bottom: "res4a_branch2b" top: "res4a_branch2b" }
layer { name: "pool4" type: "Pooling" bottom: "res4a_branch2b" top: "pool4" pooling_param { pool: MAX kernel_size: 2 stride: 2 } }
layer { name: "res5a_branch2a" type: "Convolution" bottom: "pool4" top: "res5a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res5a_branch2a/bn" type: "BatchNorm" bottom: "res5a_branch2a" top: "res5a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res5a_branch2a/relu" type: "ReLU" bottom: "res5a_branch2a" top: "res5a_branch2a" }
layer { name: "res5a_branch2b" type: "Convolution" bottom: "res5a_branch2a" top: "res5a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res5a_branch2b/bn" type: "BatchNorm" bottom: "res5a_branch2b" top: "res5a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res5a_branch2b/relu" type: "ReLU" bottom: "res5a_branch2b" top: "res5a_branch2b" }
layer { name: "pool6" type: "Pooling" bottom: "res5a_branch2b" top: "pool6" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } }
layer { name: "pool7" type: "Pooling" bottom: "pool6" top: "pool7" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } }
layer { name: "pool8" type: "Pooling" bottom: "pool7" top: "pool8" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } }
layer { name: "ctx_output1" type: "Convolution" bottom: "res4a_branch2b" top: "ctx_output1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output1/relu" type: "ReLU" bottom: "ctx_output1" top: "ctx_output1" }
layer { name: "ctx_output2" type: "Convolution" bottom: "res5a_branch2b" top: "ctx_output2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output2/relu" type: "ReLU" bottom: "ctx_output2" top: "ctx_output2" }
layer { name: "ctx_output3" type: "Convolution" bottom: "pool6" top: "ctx_output3" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output3/relu" type: "ReLU" bottom: "ctx_output3" top: "ctx_output3" }
layer { name: "ctx_output4" type: "Convolution" bottom: "pool7" top: "ctx_output4" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output4/relu" type: "ReLU" bottom: "ctx_output4" top: "ctx_output4" }
layer { name: "ctx_output5" type: "Convolution" bottom: "pool8" top: "ctx_output5" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output5/relu" type: "ReLU" bottom: "ctx_output5" top: "ctx_output5" }
layer { name: "ctx_output1/relu_mbox_loc" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output1/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_loc" top: "ctx_output1/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output1/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_loc_perm" top: "ctx_output1/relu_mbox_loc_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output1/relu_mbox_conf" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output1/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_conf" top: "ctx_output1/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output1/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_conf_perm" top: "ctx_output1/relu_mbox_conf_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output1/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output1" bottom: "data" top: "ctx_output1/relu_mbox_priorbox" prior_box_param { min_size: 14.72 max_size: 36.8 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } }
layer { name: "ctx_output2/relu_mbox_loc" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output2/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_loc" top: "ctx_output2/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output2/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_loc_perm" top: "ctx_output2/relu_mbox_loc_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output2/relu_mbox_conf" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output2/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_conf" top: "ctx_output2/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output2/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_conf_perm" top: "ctx_output2/relu_mbox_conf_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output2/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output2" bottom: "data" top: "ctx_output2/relu_mbox_priorbox" prior_box_param { min_size: 36.8 max_size: 132.48 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } }
layer { name: "ctx_output3/relu_mbox_loc" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output3/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_loc" top: "ctx_output3/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output3/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_loc_perm" top: "ctx_output3/relu_mbox_loc_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output3/relu_mbox_conf" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output3/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_conf" top: "ctx_output3/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output3/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_conf_perm" top: "ctx_output3/relu_mbox_conf_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output3/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output3" bottom: "data" top: "ctx_output3/relu_mbox_priorbox" prior_box_param { min_size: 132.48 max_size: 228.16 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } }
layer { name: "ctx_output4/relu_mbox_loc" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output4/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output4/relu_mbox_loc" top: "ctx_output4/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output4/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_loc_perm" top: "ctx_output4/relu_mbox_loc_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output4/relu_mbox_conf" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output4/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output4/relu_mbox_conf" top: "ctx_output4/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output4/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_conf_perm" top: "ctx_output4/relu_mbox_conf_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output4/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output4" bottom: "data" top: "ctx_output4/relu_mbox_priorbox" prior_box_param { min_size: 228.16 max_size: 323.84 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } }
layer { name: "mbox_loc" type: "Concat" bottom: "ctx_output1/relu_mbox_loc_flat" bottom: "ctx_output2/relu_mbox_loc_flat" bottom: "ctx_output3/relu_mbox_loc_flat" bottom: "ctx_output4/relu_mbox_loc_flat" top: "mbox_loc" concat_param { axis: 1 } }
layer { name: "mbox_conf" type: "Concat" bottom: "ctx_output1/relu_mbox_conf_flat" bottom: "ctx_output2/relu_mbox_conf_flat" bottom: "ctx_output3/relu_mbox_conf_flat" bottom: "ctx_output4/relu_mbox_conf_flat" top: "mbox_conf" concat_param { axis: 1 } }
layer { name: "mbox_priorbox" type: "Concat" bottom: "ctx_output1/relu_mbox_priorbox" bottom: "ctx_output2/relu_mbox_priorbox" bottom: "ctx_output3/relu_mbox_priorbox" bottom: "ctx_output4/relu_mbox_priorbox" top: "mbox_priorbox" concat_param { axis: 2 } }
layer { name: "mbox_conf_reshape" type: "Reshape" bottom: "mbox_conf" top: "mbox_conf_reshape" reshape_param { shape { dim: 0 dim: -1 dim: 2 } } }
layer { name: "mbox_conf_softmax" type: "Softmax" bottom: "mbox_conf_reshape" top: "mbox_conf_softmax" softmax_param { axis: 2 } }
layer { name: "mbox_conf_flatten" type: "Flatten" bottom: "mbox_conf_softmax" top: "mbox_conf_flatten" flatten_param { axis: 1 } }
layer { name: "detection_out" type: "DetectionOutput" bottom: "mbox_loc" bottom: "mbox_conf_flatten" bottom: "mbox_priorbox" top: "detection_out" include { phase: TEST } detection_output_param { num_classes: 2 share_location: true background_label_id: 0 nms_param { nms_threshold: 0.45 top_k: 400 } save_output_param { output_directory: "" output_name_prefix: "comp4_det_test_" output_format: "VOC" label_map_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt" name_size_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt" num_test_image: 24 } code_type: CENTER_SIZE keep_top_k: 200 confidence_threshold: 0.01 } }
layer { name: "detection_eval" type: "DetectionEvaluate" bottom: "detection_out" bottom: "label" top: "detection_eval" include { phase: TEST } detection_evaluate_param { num_classes: 2 background_label_id: 0 overlap_threshold: 0.5 evaluate_difficult_gt: false name_size_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt" } }
I1106 16:38:03.188522 13504 net.cpp:110] Using FLOAT as default forward math type
I1106 16:38:03.188527 13504 net.cpp:116] Using FLOAT as default backward math type
I1106 16:38:03.188545 13504 layer_factory.hpp:172] Creating layer 'data' of type 'AnnotatedData'
I1106 16:38:03.188547 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.188560 13504 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0
I1106 16:38:03.188714 13504 net.cpp:200] Created Layer data (0)
I1106 16:38:03.188719 13504 net.cpp:542] data -> data
I1106 16:38:03.188737 13504 net.cpp:542] data -> label
I1106 16:38:03.188742 13504 data_reader.cpp:58] Data Reader threads: 1, out queues: 1, depth: 8
I1106 16:38:03.189071 13504 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0
I1106 16:38:03.189576 13534 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb
I1106 16:38:03.190003 13504 annotated_data_layer.cpp:105] output data size: 8,3,320,768
I1106 16:38:03.190045 13504 annotated_data_layer.cpp:150] (0) Output data size: 8, 3, 320, 768
I1106 16:38:03.190068 13504 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0
I1106 16:38:03.190088 13504 net.cpp:260] Setting up data
I1106 16:38:03.190093 13504 net.cpp:267] TEST Top shape for layer 0 'data' 8 3 320 768 (5898240)
I1106 16:38:03.190129 13504 net.cpp:267] TEST Top shape for layer 0 'data' 1 1 2 8 (16)
I1106 16:38:03.190132 13504 layer_factory.hpp:172] Creating layer 'data_data_0_split' of type 'Split'
I1106 16:38:03.190407 13535 data_layer.cpp:105] (0) Parser threads: 1
I1106 16:38:03.190412 13535 data_layer.cpp:107] (0) Transformer threads: 1
I1106 16:38:03.190434 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.190461 13504 net.cpp:200] Created Layer data_data_0_split (1)
I1106 16:38:03.190465 13504 net.cpp:572] data_data_0_split <- data
I1106 16:38:03.190469 13504 net.cpp:542] data_data_0_split -> data_data_0_split_0
I1106 16:38:03.190474 13504 net.cpp:542] data_data_0_split -> data_data_0_split_1
I1106 16:38:03.190477 13504 net.cpp:542] data_data_0_split -> data_data_0_split_2
I1106 16:38:03.190480 13504 net.cpp:542] data_data_0_split -> data_data_0_split_3
I1106 16:38:03.190484 13504 net.cpp:542] data_data_0_split -> data_data_0_split_4
I1106 16:38:03.190536 13504 net.cpp:260] Setting up data_data_0_split
I1106 16:38:03.190541 13504 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240)
I1106 16:38:03.190543 13504 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240)
I1106 16:38:03.190546 13504 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240)
I1106 16:38:03.190551 13504 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240)
I1106 16:38:03.190552 13504 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240)
I1106 16:38:03.190555 13504 layer_factory.hpp:172] Creating layer 'data/bias' of type 'Bias'
I1106 16:38:03.190557 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.190563 13504 net.cpp:200] Created Layer data/bias (2)
I1106 16:38:03.190567 13504 net.cpp:572] data/bias <- data_data_0_split_0
I1106 16:38:03.190568 13504 net.cpp:542] data/bias -> data/bias
I1106 16:38:03.191195 13504 net.cpp:260] Setting up data/bias
I1106 16:38:03.191205 13504 net.cpp:267] TEST Top shape for layer 2 'data/bias' 8 3 320 768 (5898240)
I1106 16:38:03.191210 13504 layer_factory.hpp:172] Creating layer 'conv1a' of type 'Convolution'
I1106 16:38:03.191212 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.191220 13504 net.cpp:200] Created Layer conv1a (3)
I1106 16:38:03.191223 13504 net.cpp:572] conv1a <- data/bias
I1106 16:38:03.191226 13504 net.cpp:542] conv1a -> conv1a
I1106 16:38:03.195612 13504 net.cpp:260] Setting up conv1a
I1106 16:38:03.195660 13504 net.cpp:267] TEST Top shape for layer 3 'conv1a' 8 32 160 384 (15728640)
I1106 16:38:03.195677 13504 layer_factory.hpp:172] Creating layer 'conv1a/bn' of type 'BatchNorm'
I1106 16:38:03.195700 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.195719 13504 net.cpp:200] Created Layer conv1a/bn (4)
I1106 16:38:03.195722 13504 net.cpp:572] conv1a/bn <- conv1a
I1106 16:38:03.195729 13504 net.cpp:527] conv1a/bn -> conv1a (in-place)
I1106 16:38:03.196069 13504 net.cpp:260] Setting up conv1a/bn
I1106 16:38:03.196076 13504 net.cpp:267] TEST Top shape for layer 4 'conv1a/bn' 8 32 160 384 (15728640)
I1106 16:38:03.196084 13504 layer_factory.hpp:172] Creating layer 'conv1a/relu' of type 'ReLU'
I1106 16:38:03.196089 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.196094 13504 net.cpp:200] Created Layer conv1a/relu (5)
I1106 16:38:03.196095 13504 net.cpp:572] conv1a/relu <- conv1a
I1106 16:38:03.196099 13504 net.cpp:527] conv1a/relu -> conv1a (in-place)
I1106 16:38:03.196117 13504 net.cpp:260] Setting up conv1a/relu
I1106 16:38:03.196125 13504 net.cpp:267] TEST Top shape for layer 5 'conv1a/relu' 8 32 160 384 (15728640)
I1106 16:38:03.196130 13504 layer_factory.hpp:172] Creating layer 'conv1b' of type 'Convolution'
I1106 16:38:03.196161 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.196185 13504 net.cpp:200] Created Layer conv1b (6)
I1106 16:38:03.196187 13504 net.cpp:572] conv1b <- conv1a
I1106 16:38:03.196192 13504 net.cpp:542] conv1b -> conv1b
I1106 16:38:03.196554 13504 net.cpp:260] Setting up conv1b
I1106 16:38:03.196566 13504 net.cpp:267] TEST Top shape for layer 6 'conv1b' 8 32 160 384 (15728640)
I1106 16:38:03.196581 13504 layer_factory.hpp:172] Creating layer 'conv1b/bn' of type 'BatchNorm'
I1106 16:38:03.196585 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:03.196593 13504 net.cpp:200] Created Layer conv1b/bn (7)
I1106 16:38:03.196597 13504 net.cpp:572] conv1b/bn <- conv1b
I1106 16:38:03.196599 13504 net.cpp:527] conv1b/bn -> conv1b (in-place)
I1106 16:38:03.196969 13504 net.cpp:260] Setting up conv1b/bn
I1106 16:38:03.196977 13504 net.cpp:267] TEST Top shape
for layer 7 'conv1b/bn' 8 32 160 384 (15728640) I1106 16:38:03.196986 13504 layer_factory.hpp:172] Creating layer 'conv1b/relu' of type 'ReLU' I1106 16:38:03.196990 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.196997 13504 net.cpp:200] Created Layer conv1b/relu (8) I1106 16:38:03.197000 13504 net.cpp:572] conv1b/relu <- conv1b I1106 16:38:03.197002 13504 net.cpp:527] conv1b/relu -> conv1b (in-place) I1106 16:38:03.197006 13504 net.cpp:260] Setting up conv1b/relu I1106 16:38:03.197008 13504 net.cpp:267] TEST Top shape for layer 8 'conv1b/relu' 8 32 160 384 (15728640) I1106 16:38:03.197011 13504 layer_factory.hpp:172] Creating layer 'pool1' of type 'Pooling' I1106 16:38:03.197029 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.197047 13504 net.cpp:200] Created Layer pool1 (9) I1106 16:38:03.197050 13504 net.cpp:572] pool1 <- conv1b I1106 16:38:03.197053 13504 net.cpp:542] pool1 -> pool1 I1106 16:38:03.197104 13504 net.cpp:260] Setting up pool1 I1106 16:38:03.197108 13504 net.cpp:267] TEST Top shape for layer 9 'pool1' 8 32 80 192 (3932160) I1106 16:38:03.197110 13504 layer_factory.hpp:172] Creating layer 'res2a_branch2a' of type 'Convolution' I1106 16:38:03.197113 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.197131 13504 net.cpp:200] Created Layer res2a_branch2a (10) I1106 16:38:03.197134 13504 net.cpp:572] res2a_branch2a <- pool1 I1106 16:38:03.197136 13504 net.cpp:542] res2a_branch2a -> res2a_branch2a I1106 16:38:03.197546 13504 net.cpp:260] Setting up res2a_branch2a I1106 16:38:03.197574 13504 net.cpp:267] TEST Top shape for layer 10 'res2a_branch2a' 8 64 80 192 (7864320) I1106 16:38:03.197593 13504 layer_factory.hpp:172] Creating layer 'res2a_branch2a/bn' of type 'BatchNorm' I1106 16:38:03.197603 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT 
Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.197624 13504 net.cpp:200] Created Layer res2a_branch2a/bn (11) I1106 16:38:03.197633 13504 net.cpp:572] res2a_branch2a/bn <- res2a_branch2a I1106 16:38:03.197639 13504 net.cpp:527] res2a_branch2a/bn -> res2a_branch2a (in-place) I1106 16:38:03.197927 13504 net.cpp:260] Setting up res2a_branch2a/bn I1106 16:38:03.197942 13504 net.cpp:267] TEST Top shape for layer 11 'res2a_branch2a/bn' 8 64 80 192 (7864320) I1106 16:38:03.197952 13504 layer_factory.hpp:172] Creating layer 'res2a_branch2a/relu' of type 'ReLU' I1106 16:38:03.197957 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.197964 13504 net.cpp:200] Created Layer res2a_branch2a/relu (12) I1106 16:38:03.197970 13504 net.cpp:572] res2a_branch2a/relu <- res2a_branch2a I1106 16:38:03.197976 13504 net.cpp:527] res2a_branch2a/relu -> res2a_branch2a (in-place) I1106 16:38:03.197985 13504 net.cpp:260] Setting up res2a_branch2a/relu I1106 16:38:03.197996 13504 net.cpp:267] TEST Top shape for layer 12 'res2a_branch2a/relu' 8 64 80 192 (7864320) I1106 16:38:03.198012 13504 layer_factory.hpp:172] Creating layer 'res2a_branch2b' of type 'Convolution' I1106 16:38:03.198019 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.198031 13504 net.cpp:200] Created Layer res2a_branch2b (13) I1106 16:38:03.198037 13504 net.cpp:572] res2a_branch2b <- res2a_branch2a I1106 16:38:03.198043 13504 net.cpp:542] res2a_branch2b -> res2a_branch2b I1106 16:38:03.199194 13504 net.cpp:260] Setting up res2a_branch2b I1106 16:38:03.199205 13504 net.cpp:267] TEST Top shape for layer 13 'res2a_branch2b' 8 64 80 192 (7864320) I1106 16:38:03.199210 13504 layer_factory.hpp:172] Creating layer 'res2a_branch2b/bn' of type 'BatchNorm' I1106 16:38:03.199213 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.199218 13504 net.cpp:200] Created Layer 
res2a_branch2b/bn (14) I1106 16:38:03.199220 13504 net.cpp:572] res2a_branch2b/bn <- res2a_branch2b I1106 16:38:03.199223 13504 net.cpp:527] res2a_branch2b/bn -> res2a_branch2b (in-place) I1106 16:38:03.199510 13504 net.cpp:260] Setting up res2a_branch2b/bn I1106 16:38:03.199525 13504 net.cpp:267] TEST Top shape for layer 14 'res2a_branch2b/bn' 8 64 80 192 (7864320) I1106 16:38:03.199535 13504 layer_factory.hpp:172] Creating layer 'res2a_branch2b/relu' of type 'ReLU' I1106 16:38:03.199542 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.199550 13504 net.cpp:200] Created Layer res2a_branch2b/relu (15) I1106 16:38:03.199556 13504 net.cpp:572] res2a_branch2b/relu <- res2a_branch2b I1106 16:38:03.199563 13504 net.cpp:527] res2a_branch2b/relu -> res2a_branch2b (in-place) I1106 16:38:03.199569 13504 net.cpp:260] Setting up res2a_branch2b/relu I1106 16:38:03.199575 13504 net.cpp:267] TEST Top shape for layer 15 'res2a_branch2b/relu' 8 64 80 192 (7864320) I1106 16:38:03.199581 13504 layer_factory.hpp:172] Creating layer 'pool2' of type 'Pooling' I1106 16:38:03.199587 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.199599 13504 net.cpp:200] Created Layer pool2 (16) I1106 16:38:03.199604 13504 net.cpp:572] pool2 <- res2a_branch2b I1106 16:38:03.199610 13504 net.cpp:542] pool2 -> pool2 I1106 16:38:03.199645 13504 net.cpp:260] Setting up pool2 I1106 16:38:03.199653 13504 net.cpp:267] TEST Top shape for layer 16 'pool2' 8 64 40 96 (1966080) I1106 16:38:03.199661 13504 layer_factory.hpp:172] Creating layer 'res3a_branch2a' of type 'Convolution' I1106 16:38:03.199666 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.199712 13504 net.cpp:200] Created Layer res3a_branch2a (17) I1106 16:38:03.199719 13504 net.cpp:572] res3a_branch2a <- pool2 I1106 16:38:03.199726 13504 net.cpp:542] res3a_branch2a -> 
res3a_branch2a I1106 16:38:03.200382 13504 net.cpp:260] Setting up res3a_branch2a I1106 16:38:03.200389 13504 net.cpp:267] TEST Top shape for layer 17 'res3a_branch2a' 8 128 40 96 (3932160) I1106 16:38:03.200393 13504 layer_factory.hpp:172] Creating layer 'res3a_branch2a/bn' of type 'BatchNorm' I1106 16:38:03.200397 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.200402 13504 net.cpp:200] Created Layer res3a_branch2a/bn (18) I1106 16:38:03.200407 13504 net.cpp:572] res3a_branch2a/bn <- res3a_branch2a I1106 16:38:03.200409 13504 net.cpp:527] res3a_branch2a/bn -> res3a_branch2a (in-place) I1106 16:38:03.200598 13504 net.cpp:260] Setting up res3a_branch2a/bn I1106 16:38:03.200603 13504 net.cpp:267] TEST Top shape for layer 18 'res3a_branch2a/bn' 8 128 40 96 (3932160) I1106 16:38:03.200613 13504 layer_factory.hpp:172] Creating layer 'res3a_branch2a/relu' of type 'ReLU' I1106 16:38:03.200623 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.200630 13504 net.cpp:200] Created Layer res3a_branch2a/relu (19) I1106 16:38:03.200634 13504 net.cpp:572] res3a_branch2a/relu <- res3a_branch2a I1106 16:38:03.200649 13504 net.cpp:527] res3a_branch2a/relu -> res3a_branch2a (in-place) I1106 16:38:03.200657 13504 net.cpp:260] Setting up res3a_branch2a/relu I1106 16:38:03.200661 13504 net.cpp:267] TEST Top shape for layer 19 'res3a_branch2a/relu' 8 128 40 96 (3932160) I1106 16:38:03.200668 13504 layer_factory.hpp:172] Creating layer 'res3a_branch2b' of type 'Convolution' I1106 16:38:03.200675 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.200688 13504 net.cpp:200] Created Layer res3a_branch2b (20) I1106 16:38:03.200691 13504 net.cpp:572] res3a_branch2b <- res3a_branch2a I1106 16:38:03.200695 13504 net.cpp:542] res3a_branch2b -> res3a_branch2b I1106 16:38:03.201093 13504 net.cpp:260] Setting up 
res3a_branch2b I1106 16:38:03.201099 13504 net.cpp:267] TEST Top shape for layer 20 'res3a_branch2b' 8 128 40 96 (3932160) I1106 16:38:03.201104 13504 layer_factory.hpp:172] Creating layer 'res3a_branch2b/bn' of type 'BatchNorm' I1106 16:38:03.201107 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.201118 13504 net.cpp:200] Created Layer res3a_branch2b/bn (21) I1106 16:38:03.201126 13504 net.cpp:572] res3a_branch2b/bn <- res3a_branch2b I1106 16:38:03.201131 13504 net.cpp:527] res3a_branch2b/bn -> res3a_branch2b (in-place) I1106 16:38:03.201318 13504 net.cpp:260] Setting up res3a_branch2b/bn I1106 16:38:03.201324 13504 net.cpp:267] TEST Top shape for layer 21 'res3a_branch2b/bn' 8 128 40 96 (3932160) I1106 16:38:03.201329 13504 layer_factory.hpp:172] Creating layer 'res3a_branch2b/relu' of type 'ReLU' I1106 16:38:03.201337 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.201344 13504 net.cpp:200] Created Layer res3a_branch2b/relu (22) I1106 16:38:03.201350 13504 net.cpp:572] res3a_branch2b/relu <- res3a_branch2b I1106 16:38:03.201356 13504 net.cpp:527] res3a_branch2b/relu -> res3a_branch2b (in-place) I1106 16:38:03.201362 13504 net.cpp:260] Setting up res3a_branch2b/relu I1106 16:38:03.201365 13504 net.cpp:267] TEST Top shape for layer 22 'res3a_branch2b/relu' 8 128 40 96 (3932160) I1106 16:38:03.201369 13504 layer_factory.hpp:172] Creating layer 'pool3' of type 'Pooling' I1106 16:38:03.201371 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.201375 13504 net.cpp:200] Created Layer pool3 (23) I1106 16:38:03.201378 13504 net.cpp:572] pool3 <- res3a_branch2b I1106 16:38:03.201380 13504 net.cpp:542] pool3 -> pool3 I1106 16:38:03.201413 13504 net.cpp:260] Setting up pool3 I1106 16:38:03.201418 13504 net.cpp:267] TEST Top shape for layer 23 'pool3' 8 128 20 48 (983040) I1106 16:38:03.201421 
13504 layer_factory.hpp:172] Creating layer 'res4a_branch2a' of type 'Convolution' I1106 16:38:03.201424 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.201434 13504 net.cpp:200] Created Layer res4a_branch2a (24) I1106 16:38:03.201437 13504 net.cpp:572] res4a_branch2a <- pool3 I1106 16:38:03.201440 13504 net.cpp:542] res4a_branch2a -> res4a_branch2a I1106 16:38:03.203651 13504 net.cpp:260] Setting up res4a_branch2a I1106 16:38:03.203673 13504 net.cpp:267] TEST Top shape for layer 24 'res4a_branch2a' 8 256 20 48 (1966080) I1106 16:38:03.203698 13504 layer_factory.hpp:172] Creating layer 'res4a_branch2a/bn' of type 'BatchNorm' I1106 16:38:03.203711 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.203722 13504 net.cpp:200] Created Layer res4a_branch2a/bn (25) I1106 16:38:03.203728 13504 net.cpp:572] res4a_branch2a/bn <- res4a_branch2a I1106 16:38:03.203735 13504 net.cpp:527] res4a_branch2a/bn -> res4a_branch2a (in-place) I1106 16:38:03.203943 13504 net.cpp:260] Setting up res4a_branch2a/bn I1106 16:38:03.203953 13504 net.cpp:267] TEST Top shape for layer 25 'res4a_branch2a/bn' 8 256 20 48 (1966080) I1106 16:38:03.203969 13504 layer_factory.hpp:172] Creating layer 'res4a_branch2a/relu' of type 'ReLU' I1106 16:38:03.203984 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.203991 13504 net.cpp:200] Created Layer res4a_branch2a/relu (26) I1106 16:38:03.203997 13504 net.cpp:572] res4a_branch2a/relu <- res4a_branch2a I1106 16:38:03.204003 13504 net.cpp:527] res4a_branch2a/relu -> res4a_branch2a (in-place) I1106 16:38:03.204012 13504 net.cpp:260] Setting up res4a_branch2a/relu I1106 16:38:03.204020 13504 net.cpp:267] TEST Top shape for layer 26 'res4a_branch2a/relu' 8 256 20 48 (1966080) I1106 16:38:03.204025 13504 layer_factory.hpp:172] Creating layer 'res4a_branch2b' of type 'Convolution' 
I1106 16:38:03.204030 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.204047 13504 net.cpp:200] Created Layer res4a_branch2b (27) I1106 16:38:03.204054 13504 net.cpp:572] res4a_branch2b <- res4a_branch2a I1106 16:38:03.204061 13504 net.cpp:542] res4a_branch2b -> res4a_branch2b I1106 16:38:03.205796 13504 net.cpp:260] Setting up res4a_branch2b I1106 16:38:03.205807 13504 net.cpp:267] TEST Top shape for layer 27 'res4a_branch2b' 8 256 20 48 (1966080) I1106 16:38:03.205813 13504 layer_factory.hpp:172] Creating layer 'res4a_branch2b/bn' of type 'BatchNorm' I1106 16:38:03.205824 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.205835 13504 net.cpp:200] Created Layer res4a_branch2b/bn (28) I1106 16:38:03.205838 13504 net.cpp:572] res4a_branch2b/bn <- res4a_branch2b I1106 16:38:03.205842 13504 net.cpp:527] res4a_branch2b/bn -> res4a_branch2b (in-place) I1106 16:38:03.206039 13504 net.cpp:260] Setting up res4a_branch2b/bn I1106 16:38:03.206045 13504 net.cpp:267] TEST Top shape for layer 28 'res4a_branch2b/bn' 8 256 20 48 (1966080) I1106 16:38:03.206051 13504 layer_factory.hpp:172] Creating layer 'res4a_branch2b/relu' of type 'ReLU' I1106 16:38:03.206054 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.206058 13504 net.cpp:200] Created Layer res4a_branch2b/relu (29) I1106 16:38:03.206059 13504 net.cpp:572] res4a_branch2b/relu <- res4a_branch2b I1106 16:38:03.206061 13504 net.cpp:527] res4a_branch2b/relu -> res4a_branch2b (in-place) I1106 16:38:03.206066 13504 net.cpp:260] Setting up res4a_branch2b/relu I1106 16:38:03.206070 13504 net.cpp:267] TEST Top shape for layer 29 'res4a_branch2b/relu' 8 256 20 48 (1966080) I1106 16:38:03.206073 13504 layer_factory.hpp:172] Creating layer 'res4a_branch2b_res4a_branch2b/relu_0_split' of type 'Split' I1106 16:38:03.206075 13504 layer_factory.hpp:184] 
Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.206079 13504 net.cpp:200] Created Layer res4a_branch2b_res4a_branch2b/relu_0_split (30) I1106 16:38:03.206080 13504 net.cpp:572] res4a_branch2b_res4a_branch2b/relu_0_split <- res4a_branch2b I1106 16:38:03.206082 13504 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_0 I1106 16:38:03.206087 13504 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_1 I1106 16:38:03.206110 13504 net.cpp:260] Setting up res4a_branch2b_res4a_branch2b/relu_0_split I1106 16:38:03.206120 13504 net.cpp:267] TEST Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:03.206126 13504 net.cpp:267] TEST Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:03.206132 13504 layer_factory.hpp:172] Creating layer 'pool4' of type 'Pooling' I1106 16:38:03.206136 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.206141 13504 net.cpp:200] Created Layer pool4 (31) I1106 16:38:03.206144 13504 net.cpp:572] pool4 <- res4a_branch2b_res4a_branch2b/relu_0_split_0 I1106 16:38:03.206147 13504 net.cpp:542] pool4 -> pool4 I1106 16:38:03.206187 13504 net.cpp:260] Setting up pool4 I1106 16:38:03.206195 13504 net.cpp:267] TEST Top shape for layer 31 'pool4' 8 256 10 24 (491520) I1106 16:38:03.206209 13504 layer_factory.hpp:172] Creating layer 'res5a_branch2a' of type 'Convolution' I1106 16:38:03.206215 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.206235 13504 net.cpp:200] Created Layer res5a_branch2a (32) I1106 16:38:03.206243 13504 net.cpp:572] res5a_branch2a <- pool4 I1106 16:38:03.206248 13504 net.cpp:542] res5a_branch2a -> res5a_branch2a I1106 16:38:03.215463 13504 net.cpp:260] Setting up res5a_branch2a I1106 
16:38:03.215502 13504 net.cpp:267] TEST Top shape for layer 32 'res5a_branch2a' 8 512 10 24 (983040) I1106 16:38:03.215513 13504 layer_factory.hpp:172] Creating layer 'res5a_branch2a/bn' of type 'BatchNorm' I1106 16:38:03.215523 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.215534 13504 net.cpp:200] Created Layer res5a_branch2a/bn (33) I1106 16:38:03.215543 13504 net.cpp:572] res5a_branch2a/bn <- res5a_branch2a I1106 16:38:03.215549 13504 net.cpp:527] res5a_branch2a/bn -> res5a_branch2a (in-place) I1106 16:38:03.215764 13504 net.cpp:260] Setting up res5a_branch2a/bn I1106 16:38:03.215770 13504 net.cpp:267] TEST Top shape for layer 33 'res5a_branch2a/bn' 8 512 10 24 (983040) I1106 16:38:03.215776 13504 layer_factory.hpp:172] Creating layer 'res5a_branch2a/relu' of type 'ReLU' I1106 16:38:03.215778 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.215782 13504 net.cpp:200] Created Layer res5a_branch2a/relu (34) I1106 16:38:03.215785 13504 net.cpp:572] res5a_branch2a/relu <- res5a_branch2a I1106 16:38:03.215787 13504 net.cpp:527] res5a_branch2a/relu -> res5a_branch2a (in-place) I1106 16:38:03.215791 13504 net.cpp:260] Setting up res5a_branch2a/relu I1106 16:38:03.215795 13504 net.cpp:267] TEST Top shape for layer 34 'res5a_branch2a/relu' 8 512 10 24 (983040) I1106 16:38:03.215797 13504 layer_factory.hpp:172] Creating layer 'res5a_branch2b' of type 'Convolution' I1106 16:38:03.215800 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.215809 13504 net.cpp:200] Created Layer res5a_branch2b (35) I1106 16:38:03.215817 13504 net.cpp:572] res5a_branch2b <- res5a_branch2a I1106 16:38:03.215824 13504 net.cpp:542] res5a_branch2b -> res5a_branch2b I1106 16:38:03.220901 13504 net.cpp:260] Setting up res5a_branch2b I1106 16:38:03.220975 13504 net.cpp:267] TEST Top shape for layer 35 'res5a_branch2b' 8 
512 10 24 (983040) I1106 16:38:03.221010 13504 layer_factory.hpp:172] Creating layer 'res5a_branch2b/bn' of type 'BatchNorm' I1106 16:38:03.221019 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.221032 13504 net.cpp:200] Created Layer res5a_branch2b/bn (36) I1106 16:38:03.221040 13504 net.cpp:572] res5a_branch2b/bn <- res5a_branch2b I1106 16:38:03.221048 13504 net.cpp:527] res5a_branch2b/bn -> res5a_branch2b (in-place) I1106 16:38:03.221256 13504 net.cpp:260] Setting up res5a_branch2b/bn I1106 16:38:03.221262 13504 net.cpp:267] TEST Top shape for layer 36 'res5a_branch2b/bn' 8 512 10 24 (983040) I1106 16:38:03.221268 13504 layer_factory.hpp:172] Creating layer 'res5a_branch2b/relu' of type 'ReLU' I1106 16:38:03.221271 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.221276 13504 net.cpp:200] Created Layer res5a_branch2b/relu (37) I1106 16:38:03.221278 13504 net.cpp:572] res5a_branch2b/relu <- res5a_branch2b I1106 16:38:03.221280 13504 net.cpp:527] res5a_branch2b/relu -> res5a_branch2b (in-place) I1106 16:38:03.221284 13504 net.cpp:260] Setting up res5a_branch2b/relu I1106 16:38:03.221287 13504 net.cpp:267] TEST Top shape for layer 37 'res5a_branch2b/relu' 8 512 10 24 (983040) I1106 16:38:03.221290 13504 layer_factory.hpp:172] Creating layer 'res5a_branch2b_res5a_branch2b/relu_0_split' of type 'Split' I1106 16:38:03.221292 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.221315 13504 net.cpp:200] Created Layer res5a_branch2b_res5a_branch2b/relu_0_split (38) I1106 16:38:03.221321 13504 net.cpp:572] res5a_branch2b_res5a_branch2b/relu_0_split <- res5a_branch2b I1106 16:38:03.221328 13504 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> res5a_branch2b_res5a_branch2b/relu_0_split_0 I1106 16:38:03.221334 13504 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> 
res5a_branch2b_res5a_branch2b/relu_0_split_1 I1106 16:38:03.221364 13504 net.cpp:260] Setting up res5a_branch2b_res5a_branch2b/relu_0_split I1106 16:38:03.221372 13504 net.cpp:267] TEST Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 8 512 10 24 (983040) I1106 16:38:03.221379 13504 net.cpp:267] TEST Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 8 512 10 24 (983040) I1106 16:38:03.221385 13504 layer_factory.hpp:172] Creating layer 'pool6' of type 'Pooling' I1106 16:38:03.221390 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.221398 13504 net.cpp:200] Created Layer pool6 (39) I1106 16:38:03.221405 13504 net.cpp:572] pool6 <- res5a_branch2b_res5a_branch2b/relu_0_split_0 I1106 16:38:03.221410 13504 net.cpp:542] pool6 -> pool6 I1106 16:38:03.221446 13504 net.cpp:260] Setting up pool6 I1106 16:38:03.221453 13504 net.cpp:267] TEST Top shape for layer 39 'pool6' 8 512 5 12 (245760) I1106 16:38:03.221459 13504 layer_factory.hpp:172] Creating layer 'pool6_pool6_0_split' of type 'Split' I1106 16:38:03.221464 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.221472 13504 net.cpp:200] Created Layer pool6_pool6_0_split (40) I1106 16:38:03.221477 13504 net.cpp:572] pool6_pool6_0_split <- pool6 I1106 16:38:03.221483 13504 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_0 I1106 16:38:03.221489 13504 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_1 I1106 16:38:03.221513 13504 net.cpp:260] Setting up pool6_pool6_0_split I1106 16:38:03.221521 13504 net.cpp:267] TEST Top shape for layer 40 'pool6_pool6_0_split' 8 512 5 12 (245760) I1106 16:38:03.221527 13504 net.cpp:267] TEST Top shape for layer 40 'pool6_pool6_0_split' 8 512 5 12 (245760) I1106 16:38:03.221532 13504 layer_factory.hpp:172] Creating layer 'pool7' of type 'Pooling' I1106 16:38:03.221539 13504 layer_factory.hpp:184] Layer's types are 
Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.221545 13504 net.cpp:200] Created Layer pool7 (41) I1106 16:38:03.221550 13504 net.cpp:572] pool7 <- pool6_pool6_0_split_0 I1106 16:38:03.221556 13504 net.cpp:542] pool7 -> pool7 I1106 16:38:03.221585 13504 net.cpp:260] Setting up pool7 I1106 16:38:03.221593 13504 net.cpp:267] TEST Top shape for layer 41 'pool7' 8 512 3 6 (73728) I1106 16:38:03.221599 13504 layer_factory.hpp:172] Creating layer 'pool7_pool7_0_split' of type 'Split' I1106 16:38:03.221604 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.221612 13504 net.cpp:200] Created Layer pool7_pool7_0_split (42) I1106 16:38:03.221617 13504 net.cpp:572] pool7_pool7_0_split <- pool7 I1106 16:38:03.221623 13504 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_0 I1106 16:38:03.221629 13504 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_1 I1106 16:38:03.221652 13504 net.cpp:260] Setting up pool7_pool7_0_split I1106 16:38:03.221660 13504 net.cpp:267] TEST Top shape for layer 42 'pool7_pool7_0_split' 8 512 3 6 (73728) I1106 16:38:03.221666 13504 net.cpp:267] TEST Top shape for layer 42 'pool7_pool7_0_split' 8 512 3 6 (73728) I1106 16:38:03.221671 13504 layer_factory.hpp:172] Creating layer 'pool8' of type 'Pooling' I1106 16:38:03.221676 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.221683 13504 net.cpp:200] Created Layer pool8 (43) I1106 16:38:03.221689 13504 net.cpp:572] pool8 <- pool7_pool7_0_split_0 I1106 16:38:03.221698 13504 net.cpp:542] pool8 -> pool8 I1106 16:38:03.221734 13504 net.cpp:260] Setting up pool8 I1106 16:38:03.221742 13504 net.cpp:267] TEST Top shape for layer 43 'pool8' 8 512 2 3 (24576) I1106 16:38:03.221748 13504 layer_factory.hpp:172] Creating layer 'ctx_output1' of type 'Convolution' I1106 16:38:03.221753 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT 
Bmath:FLOAT I1106 16:38:03.221767 13504 net.cpp:200] Created Layer ctx_output1 (44) I1106 16:38:03.221774 13504 net.cpp:572] ctx_output1 <- res4a_branch2b_res4a_branch2b/relu_0_split_1 I1106 16:38:03.221781 13504 net.cpp:542] ctx_output1 -> ctx_output1 I1106 16:38:03.222389 13504 net.cpp:260] Setting up ctx_output1 I1106 16:38:03.222402 13504 net.cpp:267] TEST Top shape for layer 44 'ctx_output1' 8 256 20 48 (1966080) I1106 16:38:03.222410 13504 layer_factory.hpp:172] Creating layer 'ctx_output1/relu' of type 'ReLU' I1106 16:38:03.222416 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.222424 13504 net.cpp:200] Created Layer ctx_output1/relu (45) I1106 16:38:03.222430 13504 net.cpp:572] ctx_output1/relu <- ctx_output1 I1106 16:38:03.222435 13504 net.cpp:527] ctx_output1/relu -> ctx_output1 (in-place) I1106 16:38:03.222443 13504 net.cpp:260] Setting up ctx_output1/relu I1106 16:38:03.222450 13504 net.cpp:267] TEST Top shape for layer 45 'ctx_output1/relu' 8 256 20 48 (1966080) I1106 16:38:03.222455 13504 layer_factory.hpp:172] Creating layer 'ctx_output1_ctx_output1/relu_0_split' of type 'Split' I1106 16:38:03.222460 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.222467 13504 net.cpp:200] Created Layer ctx_output1_ctx_output1/relu_0_split (46) I1106 16:38:03.222472 13504 net.cpp:572] ctx_output1_ctx_output1/relu_0_split <- ctx_output1 I1106 16:38:03.222478 13504 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_0 I1106 16:38:03.222484 13504 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_1 I1106 16:38:03.222492 13504 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_2 I1106 16:38:03.222529 13504 net.cpp:260] Setting up ctx_output1_ctx_output1/relu_0_split I1106 16:38:03.222537 13504 net.cpp:267] TEST Top shape for layer 46 
'ctx_output1_ctx_output1/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:03.222543 13504 net.cpp:267] TEST Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:03.222549 13504 net.cpp:267] TEST Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:03.222555 13504 layer_factory.hpp:172] Creating layer 'ctx_output2' of type 'Convolution' I1106 16:38:03.222560 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.222577 13504 net.cpp:200] Created Layer ctx_output2 (47) I1106 16:38:03.222584 13504 net.cpp:572] ctx_output2 <- res5a_branch2b_res5a_branch2b/relu_0_split_1 I1106 16:38:03.222590 13504 net.cpp:542] ctx_output2 -> ctx_output2 I1106 16:38:03.223644 13504 net.cpp:260] Setting up ctx_output2 I1106 16:38:03.223650 13504 net.cpp:267] TEST Top shape for layer 47 'ctx_output2' 8 256 10 24 (491520) I1106 16:38:03.223655 13504 layer_factory.hpp:172] Creating layer 'ctx_output2/relu' of type 'ReLU' I1106 16:38:03.223657 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.223661 13504 net.cpp:200] Created Layer ctx_output2/relu (48) I1106 16:38:03.223670 13504 net.cpp:572] ctx_output2/relu <- ctx_output2 I1106 16:38:03.223675 13504 net.cpp:527] ctx_output2/relu -> ctx_output2 (in-place) I1106 16:38:03.223686 13504 net.cpp:260] Setting up ctx_output2/relu I1106 16:38:03.223695 13504 net.cpp:267] TEST Top shape for layer 48 'ctx_output2/relu' 8 256 10 24 (491520) I1106 16:38:03.223701 13504 layer_factory.hpp:172] Creating layer 'ctx_output2_ctx_output2/relu_0_split' of type 'Split' I1106 16:38:03.223712 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.223727 13504 net.cpp:200] Created Layer ctx_output2_ctx_output2/relu_0_split (49) I1106 16:38:03.223734 13504 net.cpp:572] ctx_output2_ctx_output2/relu_0_split <- 
ctx_output2 I1106 16:38:03.223742 13504 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_0 I1106 16:38:03.223747 13504 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_1 I1106 16:38:03.223750 13504 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_2 I1106 16:38:03.223790 13504 net.cpp:260] Setting up ctx_output2_ctx_output2/relu_0_split I1106 16:38:03.223795 13504 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 8 256 10 24 (491520) I1106 16:38:03.223803 13504 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 8 256 10 24 (491520) I1106 16:38:03.223810 13504 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 8 256 10 24 (491520) I1106 16:38:03.223817 13504 layer_factory.hpp:172] Creating layer 'ctx_output3' of type 'Convolution' I1106 16:38:03.223824 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.223839 13504 net.cpp:200] Created Layer ctx_output3 (50) I1106 16:38:03.223842 13504 net.cpp:572] ctx_output3 <- pool6_pool6_0_split_1 I1106 16:38:03.223845 13504 net.cpp:542] ctx_output3 -> ctx_output3 I1106 16:38:03.224970 13504 net.cpp:260] Setting up ctx_output3 I1106 16:38:03.224978 13504 net.cpp:267] TEST Top shape for layer 50 'ctx_output3' 8 256 5 12 (122880) I1106 16:38:03.224983 13504 layer_factory.hpp:172] Creating layer 'ctx_output3/relu' of type 'ReLU' I1106 16:38:03.224985 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.224989 13504 net.cpp:200] Created Layer ctx_output3/relu (51) I1106 16:38:03.224992 13504 net.cpp:572] ctx_output3/relu <- ctx_output3 I1106 16:38:03.224995 13504 net.cpp:527] ctx_output3/relu -> ctx_output3 (in-place) I1106 16:38:03.224999 13504 net.cpp:260] Setting up ctx_output3/relu I1106 16:38:03.225003 13504 
net.cpp:267] TEST Top shape for layer 51 'ctx_output3/relu' 8 256 5 12 (122880) I1106 16:38:03.225004 13504 layer_factory.hpp:172] Creating layer 'ctx_output3_ctx_output3/relu_0_split' of type 'Split' I1106 16:38:03.225008 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.225013 13504 net.cpp:200] Created Layer ctx_output3_ctx_output3/relu_0_split (52) I1106 16:38:03.225013 13504 net.cpp:572] ctx_output3_ctx_output3/relu_0_split <- ctx_output3 I1106 16:38:03.225018 13504 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_0 I1106 16:38:03.225021 13504 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_1 I1106 16:38:03.225026 13504 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_2 I1106 16:38:03.225065 13504 net.cpp:260] Setting up ctx_output3_ctx_output3/relu_0_split I1106 16:38:03.225070 13504 net.cpp:267] TEST Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 8 256 5 12 (122880) I1106 16:38:03.225073 13504 net.cpp:267] TEST Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 8 256 5 12 (122880) I1106 16:38:03.225075 13504 net.cpp:267] TEST Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 8 256 5 12 (122880) I1106 16:38:03.225077 13504 layer_factory.hpp:172] Creating layer 'ctx_output4' of type 'Convolution' I1106 16:38:03.225081 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.225092 13504 net.cpp:200] Created Layer ctx_output4 (53) I1106 16:38:03.225095 13504 net.cpp:572] ctx_output4 <- pool7_pool7_0_split_1 I1106 16:38:03.225098 13504 net.cpp:542] ctx_output4 -> ctx_output4 I1106 16:38:03.226842 13504 net.cpp:260] Setting up ctx_output4 I1106 16:38:03.226873 13504 net.cpp:267] TEST Top shape for layer 53 'ctx_output4' 8 256 3 6 (36864) I1106 16:38:03.226881 13504 layer_factory.hpp:172] 
Creating layer 'ctx_output4/relu' of type 'ReLU' I1106 16:38:03.226883 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.226888 13504 net.cpp:200] Created Layer ctx_output4/relu (54) I1106 16:38:03.226892 13504 net.cpp:572] ctx_output4/relu <- ctx_output4 I1106 16:38:03.226897 13504 net.cpp:527] ctx_output4/relu -> ctx_output4 (in-place) I1106 16:38:03.226902 13504 net.cpp:260] Setting up ctx_output4/relu I1106 16:38:03.226905 13504 net.cpp:267] TEST Top shape for layer 54 'ctx_output4/relu' 8 256 3 6 (36864) I1106 16:38:03.226908 13504 layer_factory.hpp:172] Creating layer 'ctx_output4_ctx_output4/relu_0_split' of type 'Split' I1106 16:38:03.226912 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.226915 13504 net.cpp:200] Created Layer ctx_output4_ctx_output4/relu_0_split (55) I1106 16:38:03.226927 13504 net.cpp:572] ctx_output4_ctx_output4/relu_0_split <- ctx_output4 I1106 16:38:03.226961 13504 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_0 I1106 16:38:03.226972 13504 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_1 I1106 16:38:03.226976 13504 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_2 I1106 16:38:03.227010 13504 net.cpp:260] Setting up ctx_output4_ctx_output4/relu_0_split I1106 16:38:03.227015 13504 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 8 256 3 6 (36864) I1106 16:38:03.227018 13504 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 8 256 3 6 (36864) I1106 16:38:03.227021 13504 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 8 256 3 6 (36864) I1106 16:38:03.227025 13504 layer_factory.hpp:172] Creating layer 'ctx_output5' of type 'Convolution' I1106 16:38:03.227028 13504 layer_factory.hpp:184] Layer's types 
are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.227038 13504 net.cpp:200] Created Layer ctx_output5 (56) I1106 16:38:03.227042 13504 net.cpp:572] ctx_output5 <- pool8 I1106 16:38:03.227046 13504 net.cpp:542] ctx_output5 -> ctx_output5 I1106 16:38:03.228180 13504 net.cpp:260] Setting up ctx_output5 I1106 16:38:03.228206 13504 net.cpp:267] TEST Top shape for layer 56 'ctx_output5' 8 256 2 3 (12288) I1106 16:38:03.228219 13504 layer_factory.hpp:172] Creating layer 'ctx_output5/relu' of type 'ReLU' I1106 16:38:03.228226 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.228235 13504 net.cpp:200] Created Layer ctx_output5/relu (57) I1106 16:38:03.228240 13504 net.cpp:572] ctx_output5/relu <- ctx_output5 I1106 16:38:03.228247 13504 net.cpp:527] ctx_output5/relu -> ctx_output5 (in-place) I1106 16:38:03.228255 13504 net.cpp:260] Setting up ctx_output5/relu I1106 16:38:03.228265 13504 net.cpp:267] TEST Top shape for layer 57 'ctx_output5/relu' 8 256 2 3 (12288) I1106 16:38:03.228269 13504 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc' of type 'Convolution' I1106 16:38:03.228276 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.228291 13504 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc (58) I1106 16:38:03.228298 13504 net.cpp:572] ctx_output1/relu_mbox_loc <- ctx_output1_ctx_output1/relu_0_split_0 I1106 16:38:03.228305 13504 net.cpp:542] ctx_output1/relu_mbox_loc -> ctx_output1/relu_mbox_loc I1106 16:38:03.228529 13504 net.cpp:260] Setting up ctx_output1/relu_mbox_loc I1106 16:38:03.228543 13504 net.cpp:267] TEST Top shape for layer 58 'ctx_output1/relu_mbox_loc' 8 16 20 48 (122880) I1106 16:38:03.228551 13504 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:03.228569 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT 
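The five `ctx_output*` head inputs logged above (20×48, 10×24, 5×12, 3×6, 2×3 for the 320×768 input) follow from repeated stride-2 downsampling starting at stride 16, with odd sizes rounding up. A minimal sketch reproducing those sizes (the function name and the assumption of a fixed first stride are mine, not from the log):

```python
import math

def head_sizes(h, w, first_stride=16, num_heads=5):
    # Each SSD head sees the input downsampled by a power-of-two stride;
    # odd spatial sizes round up, matching the pooled maps in the log.
    sizes = []
    s = first_stride
    for _ in range(num_heads):
        sizes.append((math.ceil(h / s), math.ceil(w / s)))
        s *= 2
    return sizes

print(head_sizes(320, 768))
# [(20, 48), (10, 24), (5, 12), (3, 6), (2, 3)]
```

These match the `ctx_output1`..`ctx_output5` top shapes reported by `net.cpp:267`.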
I1106 16:38:03.228600 13504 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_perm (59) I1106 16:38:03.228606 13504 net.cpp:572] ctx_output1/relu_mbox_loc_perm <- ctx_output1/relu_mbox_loc I1106 16:38:03.228612 13504 net.cpp:542] ctx_output1/relu_mbox_loc_perm -> ctx_output1/relu_mbox_loc_perm I1106 16:38:03.228698 13504 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_perm I1106 16:38:03.228706 13504 net.cpp:267] TEST Top shape for layer 59 'ctx_output1/relu_mbox_loc_perm' 8 20 48 16 (122880) I1106 16:38:03.228713 13504 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:03.228719 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.228729 13504 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_flat (60) I1106 16:38:03.228735 13504 net.cpp:572] ctx_output1/relu_mbox_loc_flat <- ctx_output1/relu_mbox_loc_perm I1106 16:38:03.228741 13504 net.cpp:542] ctx_output1/relu_mbox_loc_flat -> ctx_output1/relu_mbox_loc_flat I1106 16:38:03.228963 13504 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_flat I1106 16:38:03.228976 13504 net.cpp:267] TEST Top shape for layer 60 'ctx_output1/relu_mbox_loc_flat' 8 15360 (122880) I1106 16:38:03.228981 13504 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf' of type 'Convolution' I1106 16:38:03.228987 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.229002 13504 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf (61) I1106 16:38:03.229009 13504 net.cpp:572] ctx_output1/relu_mbox_conf <- ctx_output1_ctx_output1/relu_0_split_1 I1106 16:38:03.229017 13504 net.cpp:542] ctx_output1/relu_mbox_conf -> ctx_output1/relu_mbox_conf I1106 16:38:03.230139 13504 net.cpp:260] Setting up ctx_output1/relu_mbox_conf I1106 16:38:03.230161 13504 net.cpp:267] TEST Top shape for layer 61 'ctx_output1/relu_mbox_conf' 8 8 20 48 (61440) I1106 16:38:03.230170 13504 
layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:03.230175 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.230185 13504 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_perm (62) I1106 16:38:03.230188 13504 net.cpp:572] ctx_output1/relu_mbox_conf_perm <- ctx_output1/relu_mbox_conf I1106 16:38:03.230193 13504 net.cpp:542] ctx_output1/relu_mbox_conf_perm -> ctx_output1/relu_mbox_conf_perm I1106 16:38:03.230285 13504 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_perm I1106 16:38:03.230290 13504 net.cpp:267] TEST Top shape for layer 62 'ctx_output1/relu_mbox_conf_perm' 8 20 48 8 (61440) I1106 16:38:03.230293 13504 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:03.230296 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.230301 13504 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_flat (63) I1106 16:38:03.230304 13504 net.cpp:572] ctx_output1/relu_mbox_conf_flat <- ctx_output1/relu_mbox_conf_perm I1106 16:38:03.230307 13504 net.cpp:542] ctx_output1/relu_mbox_conf_flat -> ctx_output1/relu_mbox_conf_flat I1106 16:38:03.230368 13504 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_flat I1106 16:38:03.230374 13504 net.cpp:267] TEST Top shape for layer 63 'ctx_output1/relu_mbox_conf_flat' 8 7680 (61440) I1106 16:38:03.230378 13504 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:03.230381 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.230391 13504 net.cpp:200] Created Layer ctx_output1/relu_mbox_priorbox (64) I1106 16:38:03.230396 13504 net.cpp:572] ctx_output1/relu_mbox_priorbox <- ctx_output1_ctx_output1/relu_0_split_2 I1106 16:38:03.230399 13504 net.cpp:572] ctx_output1/relu_mbox_priorbox <- data_data_0_split_1 
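The Permute + Flatten pair logged above turns each head's N×C×H×W output into one row per image: `ctx_output1/relu_mbox_loc` 8×16×20×48 becomes 8×15360 after permuting to N×H×W×C and flattening. A NumPy sketch of the same shape transformation (an illustration, not the Caffe implementation):

```python
import numpy as np

# Loc head output is N x C x H x W; Permute reorders to N x H x W x C,
# then Flatten collapses to one row per image (here 20*48*16 == 15360),
# matching 'ctx_output1/relu_mbox_loc_flat' shape 8 x 15360 in the log.
n, c, h, w = 8, 16, 20, 48          # shapes taken from the log above
x = np.zeros((n, c, h, w), dtype=np.float32)
flat = x.transpose(0, 2, 3, 1).reshape(n, -1)
print(flat.shape)   # (8, 15360)
```

The channel-last ordering matters: it keeps the 4 box coordinates of each prior contiguous before the heads are concatenated.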
I1106 16:38:03.230403 13504 net.cpp:542] ctx_output1/relu_mbox_priorbox -> ctx_output1/relu_mbox_priorbox I1106 16:38:03.230433 13504 net.cpp:260] Setting up ctx_output1/relu_mbox_priorbox I1106 16:38:03.230437 13504 net.cpp:267] TEST Top shape for layer 64 'ctx_output1/relu_mbox_priorbox' 1 2 15360 (30720) I1106 16:38:03.230442 13504 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc' of type 'Convolution' I1106 16:38:03.230444 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.230454 13504 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc (65) I1106 16:38:03.230458 13504 net.cpp:572] ctx_output2/relu_mbox_loc <- ctx_output2_ctx_output2/relu_0_split_0 I1106 16:38:03.230463 13504 net.cpp:542] ctx_output2/relu_mbox_loc -> ctx_output2/relu_mbox_loc I1106 16:38:03.230671 13504 net.cpp:260] Setting up ctx_output2/relu_mbox_loc I1106 16:38:03.230677 13504 net.cpp:267] TEST Top shape for layer 65 'ctx_output2/relu_mbox_loc' 8 24 10 24 (46080) I1106 16:38:03.230682 13504 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:03.230685 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.230691 13504 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_perm (66) I1106 16:38:03.230695 13504 net.cpp:572] ctx_output2/relu_mbox_loc_perm <- ctx_output2/relu_mbox_loc I1106 16:38:03.230697 13504 net.cpp:542] ctx_output2/relu_mbox_loc_perm -> ctx_output2/relu_mbox_loc_perm I1106 16:38:03.230754 13504 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_perm I1106 16:38:03.230758 13504 net.cpp:267] TEST Top shape for layer 66 'ctx_output2/relu_mbox_loc_perm' 8 10 24 24 (46080) I1106 16:38:03.230762 13504 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:03.230764 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT 
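The per-head channel counts in the log (loc 16/24, conf 8/12) follow from the SSD prior-box convention applied to the aspect-ratio list printed earlier (`[[2], [2, 3], [2, 3], [2]]`): one box at the min size, one at the geometric mean of min and max, and one per aspect ratio plus its flip. A sketch of that count (function name and keyword arguments are my own labels for the convention, not log output):

```python
def priors_per_cell(aspect_ratios, flip=True, extra_scale=True):
    # SSD convention: one box at min_size, optionally one at
    # sqrt(min_size * max_size), plus one per aspect ratio (and its flip).
    n = 1 + (1 if extra_scale else 0)
    n += len(aspect_ratios) * (2 if flip else 1)
    return n

for ars in [[2], [2, 3], [2, 3], [2]]:
    n = priors_per_cell(ars)
    # priors, loc channels (4 coords each), conf channels (2 classes each)
    print(n, n * 4, n * 2)
```

This gives 4 priors (loc 16, conf 8) for the `[2]` heads and 6 priors (loc 24, conf 12) for the `[2, 3]` heads, matching the convolution top shapes above.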
I1106 16:38:03.230769 13504 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_flat (67) I1106 16:38:03.230772 13504 net.cpp:572] ctx_output2/relu_mbox_loc_flat <- ctx_output2/relu_mbox_loc_perm I1106 16:38:03.230774 13504 net.cpp:542] ctx_output2/relu_mbox_loc_flat -> ctx_output2/relu_mbox_loc_flat I1106 16:38:03.231616 13504 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_flat I1106 16:38:03.231626 13504 net.cpp:267] TEST Top shape for layer 67 'ctx_output2/relu_mbox_loc_flat' 8 5760 (46080) I1106 16:38:03.231628 13504 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf' of type 'Convolution' I1106 16:38:03.231631 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.231642 13504 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf (68) I1106 16:38:03.231645 13504 net.cpp:572] ctx_output2/relu_mbox_conf <- ctx_output2_ctx_output2/relu_0_split_1 I1106 16:38:03.231649 13504 net.cpp:542] ctx_output2/relu_mbox_conf -> ctx_output2/relu_mbox_conf I1106 16:38:03.231874 13504 net.cpp:260] Setting up ctx_output2/relu_mbox_conf I1106 16:38:03.231880 13504 net.cpp:267] TEST Top shape for layer 68 'ctx_output2/relu_mbox_conf' 8 12 10 24 (23040) I1106 16:38:03.231886 13504 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:03.231889 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.231896 13504 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_perm (69) I1106 16:38:03.231900 13504 net.cpp:572] ctx_output2/relu_mbox_conf_perm <- ctx_output2/relu_mbox_conf I1106 16:38:03.231904 13504 net.cpp:542] ctx_output2/relu_mbox_conf_perm -> ctx_output2/relu_mbox_conf_perm I1106 16:38:03.231958 13504 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_perm I1106 16:38:03.231964 13504 net.cpp:267] TEST Top shape for layer 69 'ctx_output2/relu_mbox_conf_perm' 8 10 24 12 (23040) I1106 16:38:03.231967 13504 
layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:03.231971 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.231982 13504 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_flat (70) I1106 16:38:03.231986 13504 net.cpp:572] ctx_output2/relu_mbox_conf_flat <- ctx_output2/relu_mbox_conf_perm I1106 16:38:03.231988 13504 net.cpp:542] ctx_output2/relu_mbox_conf_flat -> ctx_output2/relu_mbox_conf_flat I1106 16:38:03.232645 13504 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_flat I1106 16:38:03.232653 13504 net.cpp:267] TEST Top shape for layer 70 'ctx_output2/relu_mbox_conf_flat' 8 2880 (23040) I1106 16:38:03.232657 13504 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:03.232661 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.232667 13504 net.cpp:200] Created Layer ctx_output2/relu_mbox_priorbox (71) I1106 16:38:03.232671 13504 net.cpp:572] ctx_output2/relu_mbox_priorbox <- ctx_output2_ctx_output2/relu_0_split_2 I1106 16:38:03.232674 13504 net.cpp:572] ctx_output2/relu_mbox_priorbox <- data_data_0_split_2 I1106 16:38:03.232678 13504 net.cpp:542] ctx_output2/relu_mbox_priorbox -> ctx_output2/relu_mbox_priorbox I1106 16:38:03.232695 13504 net.cpp:260] Setting up ctx_output2/relu_mbox_priorbox I1106 16:38:03.232699 13504 net.cpp:267] TEST Top shape for layer 71 'ctx_output2/relu_mbox_priorbox' 1 2 5760 (11520) I1106 16:38:03.232702 13504 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc' of type 'Convolution' I1106 16:38:03.232704 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.232713 13504 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc (72) I1106 16:38:03.232717 13504 net.cpp:572] ctx_output3/relu_mbox_loc <- ctx_output3_ctx_output3/relu_0_split_0 I1106 
16:38:03.232719 13504 net.cpp:542] ctx_output3/relu_mbox_loc -> ctx_output3/relu_mbox_loc I1106 16:38:03.232919 13504 net.cpp:260] Setting up ctx_output3/relu_mbox_loc I1106 16:38:03.232925 13504 net.cpp:267] TEST Top shape for layer 72 'ctx_output3/relu_mbox_loc' 8 24 5 12 (11520) I1106 16:38:03.232930 13504 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:03.232933 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.232939 13504 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_perm (73) I1106 16:38:03.232942 13504 net.cpp:572] ctx_output3/relu_mbox_loc_perm <- ctx_output3/relu_mbox_loc I1106 16:38:03.232945 13504 net.cpp:542] ctx_output3/relu_mbox_loc_perm -> ctx_output3/relu_mbox_loc_perm I1106 16:38:03.233008 13504 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_perm I1106 16:38:03.233013 13504 net.cpp:267] TEST Top shape for layer 73 'ctx_output3/relu_mbox_loc_perm' 8 5 12 24 (11520) I1106 16:38:03.233016 13504 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:03.233019 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.233022 13504 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_flat (74) I1106 16:38:03.233026 13504 net.cpp:572] ctx_output3/relu_mbox_loc_flat <- ctx_output3/relu_mbox_loc_perm I1106 16:38:03.233028 13504 net.cpp:542] ctx_output3/relu_mbox_loc_flat -> ctx_output3/relu_mbox_loc_flat I1106 16:38:03.233431 13504 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_flat I1106 16:38:03.233438 13504 net.cpp:267] TEST Top shape for layer 74 'ctx_output3/relu_mbox_loc_flat' 8 1440 (11520) I1106 16:38:03.233443 13504 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf' of type 'Convolution' I1106 16:38:03.233445 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 
16:38:03.233453 13504 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf (75) I1106 16:38:03.233458 13504 net.cpp:572] ctx_output3/relu_mbox_conf <- ctx_output3_ctx_output3/relu_0_split_1 I1106 16:38:03.233460 13504 net.cpp:542] ctx_output3/relu_mbox_conf -> ctx_output3/relu_mbox_conf I1106 16:38:03.233641 13504 net.cpp:260] Setting up ctx_output3/relu_mbox_conf I1106 16:38:03.233649 13504 net.cpp:267] TEST Top shape for layer 75 'ctx_output3/relu_mbox_conf' 8 12 5 12 (5760) I1106 16:38:03.233652 13504 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:03.233654 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.233664 13504 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_perm (76) I1106 16:38:03.233665 13504 net.cpp:572] ctx_output3/relu_mbox_conf_perm <- ctx_output3/relu_mbox_conf I1106 16:38:03.233669 13504 net.cpp:542] ctx_output3/relu_mbox_conf_perm -> ctx_output3/relu_mbox_conf_perm I1106 16:38:03.233724 13504 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_perm I1106 16:38:03.233729 13504 net.cpp:267] TEST Top shape for layer 76 'ctx_output3/relu_mbox_conf_perm' 8 5 12 12 (5760) I1106 16:38:03.233731 13504 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:03.233733 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.233737 13504 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_flat (77) I1106 16:38:03.233741 13504 net.cpp:572] ctx_output3/relu_mbox_conf_flat <- ctx_output3/relu_mbox_conf_perm I1106 16:38:03.233742 13504 net.cpp:542] ctx_output3/relu_mbox_conf_flat -> ctx_output3/relu_mbox_conf_flat I1106 16:38:03.233773 13504 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_flat I1106 16:38:03.233778 13504 net.cpp:267] TEST Top shape for layer 77 'ctx_output3/relu_mbox_conf_flat' 8 720 (5760) I1106 16:38:03.233781 13504 
layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:03.233784 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.233789 13504 net.cpp:200] Created Layer ctx_output3/relu_mbox_priorbox (78) I1106 16:38:03.233793 13504 net.cpp:572] ctx_output3/relu_mbox_priorbox <- ctx_output3_ctx_output3/relu_0_split_2 I1106 16:38:03.233795 13504 net.cpp:572] ctx_output3/relu_mbox_priorbox <- data_data_0_split_3 I1106 16:38:03.233799 13504 net.cpp:542] ctx_output3/relu_mbox_priorbox -> ctx_output3/relu_mbox_priorbox I1106 16:38:03.233816 13504 net.cpp:260] Setting up ctx_output3/relu_mbox_priorbox I1106 16:38:03.233821 13504 net.cpp:267] TEST Top shape for layer 78 'ctx_output3/relu_mbox_priorbox' 1 2 1440 (2880) I1106 16:38:03.233824 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc' of type 'Convolution' I1106 16:38:03.233826 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.233834 13504 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc (79) I1106 16:38:03.233837 13504 net.cpp:572] ctx_output4/relu_mbox_loc <- ctx_output4_ctx_output4/relu_0_split_0 I1106 16:38:03.233840 13504 net.cpp:542] ctx_output4/relu_mbox_loc -> ctx_output4/relu_mbox_loc I1106 16:38:03.234016 13504 net.cpp:260] Setting up ctx_output4/relu_mbox_loc I1106 16:38:03.234030 13504 net.cpp:267] TEST Top shape for layer 79 'ctx_output4/relu_mbox_loc' 8 16 3 6 (2304) I1106 16:38:03.234040 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:03.234045 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.234055 13504 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_perm (80) I1106 16:38:03.234061 13504 net.cpp:572] ctx_output4/relu_mbox_loc_perm <- ctx_output4/relu_mbox_loc I1106 16:38:03.234067 13504 net.cpp:542] 
ctx_output4/relu_mbox_loc_perm -> ctx_output4/relu_mbox_loc_perm I1106 16:38:03.234127 13504 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_perm I1106 16:38:03.234134 13504 net.cpp:267] TEST Top shape for layer 80 'ctx_output4/relu_mbox_loc_perm' 8 3 6 16 (2304) I1106 16:38:03.234143 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:03.234155 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.234163 13504 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_flat (81) I1106 16:38:03.234169 13504 net.cpp:572] ctx_output4/relu_mbox_loc_flat <- ctx_output4/relu_mbox_loc_perm I1106 16:38:03.234174 13504 net.cpp:542] ctx_output4/relu_mbox_loc_flat -> ctx_output4/relu_mbox_loc_flat I1106 16:38:03.234210 13504 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_flat I1106 16:38:03.234220 13504 net.cpp:267] TEST Top shape for layer 81 'ctx_output4/relu_mbox_loc_flat' 8 288 (2304) I1106 16:38:03.234225 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf' of type 'Convolution' I1106 16:38:03.234231 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.234242 13504 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf (82) I1106 16:38:03.234249 13504 net.cpp:572] ctx_output4/relu_mbox_conf <- ctx_output4_ctx_output4/relu_0_split_1 I1106 16:38:03.234256 13504 net.cpp:542] ctx_output4/relu_mbox_conf -> ctx_output4/relu_mbox_conf I1106 16:38:03.234416 13504 net.cpp:260] Setting up ctx_output4/relu_mbox_conf I1106 16:38:03.234427 13504 net.cpp:267] TEST Top shape for layer 82 'ctx_output4/relu_mbox_conf' 8 8 3 6 (1152) I1106 16:38:03.234436 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:03.234443 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.234453 13504 net.cpp:200] 
Created Layer ctx_output4/relu_mbox_conf_perm (83) I1106 16:38:03.234460 13504 net.cpp:572] ctx_output4/relu_mbox_conf_perm <- ctx_output4/relu_mbox_conf I1106 16:38:03.234467 13504 net.cpp:542] ctx_output4/relu_mbox_conf_perm -> ctx_output4/relu_mbox_conf_perm I1106 16:38:03.234530 13504 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_perm I1106 16:38:03.234539 13504 net.cpp:267] TEST Top shape for layer 83 'ctx_output4/relu_mbox_conf_perm' 8 3 6 8 (1152) I1106 16:38:03.234546 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:03.234552 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.234560 13504 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf_flat (84) I1106 16:38:03.234566 13504 net.cpp:572] ctx_output4/relu_mbox_conf_flat <- ctx_output4/relu_mbox_conf_perm I1106 16:38:03.234573 13504 net.cpp:542] ctx_output4/relu_mbox_conf_flat -> ctx_output4/relu_mbox_conf_flat I1106 16:38:03.234611 13504 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_flat I1106 16:38:03.234621 13504 net.cpp:267] TEST Top shape for layer 84 'ctx_output4/relu_mbox_conf_flat' 8 144 (1152) I1106 16:38:03.234627 13504 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:03.234633 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.234642 13504 net.cpp:200] Created Layer ctx_output4/relu_mbox_priorbox (85) I1106 16:38:03.234648 13504 net.cpp:572] ctx_output4/relu_mbox_priorbox <- ctx_output4_ctx_output4/relu_0_split_2 I1106 16:38:03.234654 13504 net.cpp:572] ctx_output4/relu_mbox_priorbox <- data_data_0_split_4 I1106 16:38:03.234660 13504 net.cpp:542] ctx_output4/relu_mbox_priorbox -> ctx_output4/relu_mbox_priorbox I1106 16:38:03.234675 13504 net.cpp:260] Setting up ctx_output4/relu_mbox_priorbox I1106 16:38:03.234683 13504 net.cpp:267] TEST Top shape for layer 85 
'ctx_output4/relu_mbox_priorbox' 1 2 288 (576) I1106 16:38:03.234688 13504 layer_factory.hpp:172] Creating layer 'mbox_loc' of type 'Concat' I1106 16:38:03.234694 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.234702 13504 net.cpp:200] Created Layer mbox_loc (86) I1106 16:38:03.234707 13504 net.cpp:572] mbox_loc <- ctx_output1/relu_mbox_loc_flat I1106 16:38:03.234716 13504 net.cpp:572] mbox_loc <- ctx_output2/relu_mbox_loc_flat I1106 16:38:03.234727 13504 net.cpp:572] mbox_loc <- ctx_output3/relu_mbox_loc_flat I1106 16:38:03.234735 13504 net.cpp:572] mbox_loc <- ctx_output4/relu_mbox_loc_flat I1106 16:38:03.234740 13504 net.cpp:542] mbox_loc -> mbox_loc I1106 16:38:03.234757 13504 net.cpp:260] Setting up mbox_loc I1106 16:38:03.234764 13504 net.cpp:267] TEST Top shape for layer 86 'mbox_loc' 8 22848 (182784) I1106 16:38:03.234771 13504 layer_factory.hpp:172] Creating layer 'mbox_conf' of type 'Concat' I1106 16:38:03.234776 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.234783 13504 net.cpp:200] Created Layer mbox_conf (87) I1106 16:38:03.234789 13504 net.cpp:572] mbox_conf <- ctx_output1/relu_mbox_conf_flat I1106 16:38:03.234796 13504 net.cpp:572] mbox_conf <- ctx_output2/relu_mbox_conf_flat I1106 16:38:03.234802 13504 net.cpp:572] mbox_conf <- ctx_output3/relu_mbox_conf_flat I1106 16:38:03.234807 13504 net.cpp:572] mbox_conf <- ctx_output4/relu_mbox_conf_flat I1106 16:38:03.234812 13504 net.cpp:542] mbox_conf -> mbox_conf I1106 16:38:03.234828 13504 net.cpp:260] Setting up mbox_conf I1106 16:38:03.234835 13504 net.cpp:267] TEST Top shape for layer 87 'mbox_conf' 8 11424 (91392) I1106 16:38:03.234840 13504 layer_factory.hpp:172] Creating layer 'mbox_priorbox' of type 'Concat' I1106 16:38:03.234845 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.234853 13504 net.cpp:200] Created 
Layer mbox_priorbox (88) I1106 16:38:03.234859 13504 net.cpp:572] mbox_priorbox <- ctx_output1/relu_mbox_priorbox I1106 16:38:03.234865 13504 net.cpp:572] mbox_priorbox <- ctx_output2/relu_mbox_priorbox I1106 16:38:03.234870 13504 net.cpp:572] mbox_priorbox <- ctx_output3/relu_mbox_priorbox I1106 16:38:03.234876 13504 net.cpp:572] mbox_priorbox <- ctx_output4/relu_mbox_priorbox I1106 16:38:03.234882 13504 net.cpp:542] mbox_priorbox -> mbox_priorbox I1106 16:38:03.234899 13504 net.cpp:260] Setting up mbox_priorbox I1106 16:38:03.234906 13504 net.cpp:267] TEST Top shape for layer 88 'mbox_priorbox' 1 2 22848 (45696) I1106 16:38:03.234912 13504 layer_factory.hpp:172] Creating layer 'mbox_conf_reshape' of type 'Reshape' I1106 16:38:03.234917 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.234927 13504 net.cpp:200] Created Layer mbox_conf_reshape (89) I1106 16:38:03.234933 13504 net.cpp:572] mbox_conf_reshape <- mbox_conf I1106 16:38:03.234939 13504 net.cpp:542] mbox_conf_reshape -> mbox_conf_reshape I1106 16:38:03.234959 13504 net.cpp:260] Setting up mbox_conf_reshape I1106 16:38:03.234966 13504 net.cpp:267] TEST Top shape for layer 89 'mbox_conf_reshape' 8 5712 2 (91392) I1106 16:38:03.234972 13504 layer_factory.hpp:172] Creating layer 'mbox_conf_softmax' of type 'Softmax' I1106 16:38:03.234978 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.234989 13504 net.cpp:200] Created Layer mbox_conf_softmax (90) I1106 16:38:03.234992 13504 net.cpp:572] mbox_conf_softmax <- mbox_conf_reshape I1106 16:38:03.234997 13504 net.cpp:542] mbox_conf_softmax -> mbox_conf_softmax I1106 16:38:03.235033 13504 net.cpp:260] Setting up mbox_conf_softmax I1106 16:38:03.235036 13504 net.cpp:267] TEST Top shape for layer 90 'mbox_conf_softmax' 8 5712 2 (91392) I1106 16:38:03.235039 13504 layer_factory.hpp:172] Creating layer 'mbox_conf_flatten' of type 'Flatten' I1106 
16:38:03.235042 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.235046 13504 net.cpp:200] Created Layer mbox_conf_flatten (91) I1106 16:38:03.235049 13504 net.cpp:572] mbox_conf_flatten <- mbox_conf_softmax I1106 16:38:03.235050 13504 net.cpp:542] mbox_conf_flatten -> mbox_conf_flatten I1106 16:38:03.235095 13504 net.cpp:260] Setting up mbox_conf_flatten I1106 16:38:03.235100 13504 net.cpp:267] TEST Top shape for layer 91 'mbox_conf_flatten' 8 11424 (91392) I1106 16:38:03.235103 13504 layer_factory.hpp:172] Creating layer 'detection_out' of type 'DetectionOutput' I1106 16:38:03.235111 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.235126 13504 net.cpp:200] Created Layer detection_out (92) I1106 16:38:03.235129 13504 net.cpp:572] detection_out <- mbox_loc I1106 16:38:03.235132 13504 net.cpp:572] detection_out <- mbox_conf_flatten I1106 16:38:03.235136 13504 net.cpp:572] detection_out <- mbox_priorbox I1106 16:38:03.235138 13504 net.cpp:542] detection_out -> detection_out I1106 16:38:03.235229 13504 net.cpp:260] Setting up detection_out I1106 16:38:03.235234 13504 net.cpp:267] TEST Top shape for layer 92 'detection_out' 1 1 1 7 (7) I1106 16:38:03.235236 13504 layer_factory.hpp:172] Creating layer 'detection_eval' of type 'DetectionEvaluate' I1106 16:38:03.235239 13504 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.235249 13504 net.cpp:200] Created Layer detection_eval (93) I1106 16:38:03.235251 13504 net.cpp:572] detection_eval <- detection_out I1106 16:38:03.235253 13504 net.cpp:572] detection_eval <- label I1106 16:38:03.235256 13504 net.cpp:542] detection_eval -> detection_eval I1106 16:38:03.235291 13504 net.cpp:260] Setting up detection_eval I1106 16:38:03.235296 13504 net.cpp:267] TEST Top shape for layer 93 'detection_eval' 1 1 2 5 (10) I1106 16:38:03.235299 13504 net.cpp:338] 
detection_eval does not need backward computation. I1106 16:38:03.235302 13504 net.cpp:338] detection_out does not need backward computation. I1106 16:38:03.235306 13504 net.cpp:338] mbox_conf_flatten does not need backward computation. I1106 16:38:03.235308 13504 net.cpp:338] mbox_conf_softmax does not need backward computation. I1106 16:38:03.235309 13504 net.cpp:338] mbox_conf_reshape does not need backward computation. I1106 16:38:03.235312 13504 net.cpp:338] mbox_priorbox does not need backward computation. I1106 16:38:03.235317 13504 net.cpp:338] mbox_conf does not need backward computation. I1106 16:38:03.235321 13504 net.cpp:338] mbox_loc does not need backward computation. I1106 16:38:03.235323 13504 net.cpp:338] ctx_output4/relu_mbox_priorbox does not need backward computation. I1106 16:38:03.235327 13504 net.cpp:338] ctx_output4/relu_mbox_conf_flat does not need backward computation. I1106 16:38:03.235329 13504 net.cpp:338] ctx_output4/relu_mbox_conf_perm does not need backward computation. I1106 16:38:03.235332 13504 net.cpp:338] ctx_output4/relu_mbox_conf does not need backward computation. I1106 16:38:03.235334 13504 net.cpp:338] ctx_output4/relu_mbox_loc_flat does not need backward computation. I1106 16:38:03.235335 13504 net.cpp:338] ctx_output4/relu_mbox_loc_perm does not need backward computation. I1106 16:38:03.235338 13504 net.cpp:338] ctx_output4/relu_mbox_loc does not need backward computation. I1106 16:38:03.235340 13504 net.cpp:338] ctx_output3/relu_mbox_priorbox does not need backward computation. I1106 16:38:03.235343 13504 net.cpp:338] ctx_output3/relu_mbox_conf_flat does not need backward computation. I1106 16:38:03.235345 13504 net.cpp:338] ctx_output3/relu_mbox_conf_perm does not need backward computation. I1106 16:38:03.235348 13504 net.cpp:338] ctx_output3/relu_mbox_conf does not need backward computation. I1106 16:38:03.235350 13504 net.cpp:338] ctx_output3/relu_mbox_loc_flat does not need backward computation. 
I1106 16:38:03.235352 13504 net.cpp:338] ctx_output3/relu_mbox_loc_perm does not need backward computation. I1106 16:38:03.235353 13504 net.cpp:338] ctx_output3/relu_mbox_loc does not need backward computation. I1106 16:38:03.235357 13504 net.cpp:338] ctx_output2/relu_mbox_priorbox does not need backward computation. I1106 16:38:03.235358 13504 net.cpp:338] ctx_output2/relu_mbox_conf_flat does not need backward computation. I1106 16:38:03.235360 13504 net.cpp:338] ctx_output2/relu_mbox_conf_perm does not need backward computation. I1106 16:38:03.235363 13504 net.cpp:338] ctx_output2/relu_mbox_conf does not need backward computation. I1106 16:38:03.235364 13504 net.cpp:338] ctx_output2/relu_mbox_loc_flat does not need backward computation. I1106 16:38:03.235373 13504 net.cpp:338] ctx_output2/relu_mbox_loc_perm does not need backward computation. I1106 16:38:03.235374 13504 net.cpp:338] ctx_output2/relu_mbox_loc does not need backward computation. I1106 16:38:03.235376 13504 net.cpp:338] ctx_output1/relu_mbox_priorbox does not need backward computation. I1106 16:38:03.235379 13504 net.cpp:338] ctx_output1/relu_mbox_conf_flat does not need backward computation. I1106 16:38:03.235381 13504 net.cpp:338] ctx_output1/relu_mbox_conf_perm does not need backward computation. I1106 16:38:03.235383 13504 net.cpp:338] ctx_output1/relu_mbox_conf does not need backward computation. I1106 16:38:03.235384 13504 net.cpp:338] ctx_output1/relu_mbox_loc_flat does not need backward computation. I1106 16:38:03.235388 13504 net.cpp:338] ctx_output1/relu_mbox_loc_perm does not need backward computation. I1106 16:38:03.235389 13504 net.cpp:338] ctx_output1/relu_mbox_loc does not need backward computation. I1106 16:38:03.235391 13504 net.cpp:338] ctx_output5/relu does not need backward computation. I1106 16:38:03.235394 13504 net.cpp:338] ctx_output5 does not need backward computation. 
I1106 16:38:03.235396 13504 net.cpp:338] ctx_output4_ctx_output4/relu_0_split does not need backward computation. I1106 16:38:03.235399 13504 net.cpp:338] ctx_output4/relu does not need backward computation. I1106 16:38:03.235400 13504 net.cpp:338] ctx_output4 does not need backward computation. I1106 16:38:03.235402 13504 net.cpp:338] ctx_output3_ctx_output3/relu_0_split does not need backward computation. I1106 16:38:03.235405 13504 net.cpp:338] ctx_output3/relu does not need backward computation. I1106 16:38:03.235407 13504 net.cpp:338] ctx_output3 does not need backward computation. I1106 16:38:03.235409 13504 net.cpp:338] ctx_output2_ctx_output2/relu_0_split does not need backward computation. I1106 16:38:03.235411 13504 net.cpp:338] ctx_output2/relu does not need backward computation. I1106 16:38:03.235414 13504 net.cpp:338] ctx_output2 does not need backward computation. I1106 16:38:03.235416 13504 net.cpp:338] ctx_output1_ctx_output1/relu_0_split does not need backward computation. I1106 16:38:03.235419 13504 net.cpp:338] ctx_output1/relu does not need backward computation. I1106 16:38:03.235420 13504 net.cpp:338] ctx_output1 does not need backward computation. I1106 16:38:03.235424 13504 net.cpp:338] pool8 does not need backward computation. I1106 16:38:03.235425 13504 net.cpp:338] pool7_pool7_0_split does not need backward computation. I1106 16:38:03.235428 13504 net.cpp:338] pool7 does not need backward computation. I1106 16:38:03.235431 13504 net.cpp:338] pool6_pool6_0_split does not need backward computation. I1106 16:38:03.235435 13504 net.cpp:338] pool6 does not need backward computation. I1106 16:38:03.235436 13504 net.cpp:338] res5a_branch2b_res5a_branch2b/relu_0_split does not need backward computation. I1106 16:38:03.235438 13504 net.cpp:338] res5a_branch2b/relu does not need backward computation. I1106 16:38:03.235440 13504 net.cpp:338] res5a_branch2b/bn does not need backward computation. 
I1106 16:38:03.235442 13504 net.cpp:338] res5a_branch2b does not need backward computation. I1106 16:38:03.235445 13504 net.cpp:338] res5a_branch2a/relu does not need backward computation. I1106 16:38:03.235446 13504 net.cpp:338] res5a_branch2a/bn does not need backward computation. I1106 16:38:03.235448 13504 net.cpp:338] res5a_branch2a does not need backward computation. I1106 16:38:03.235450 13504 net.cpp:338] pool4 does not need backward computation. I1106 16:38:03.235452 13504 net.cpp:338] res4a_branch2b_res4a_branch2b/relu_0_split does not need backward computation. I1106 16:38:03.235455 13504 net.cpp:338] res4a_branch2b/relu does not need backward computation. I1106 16:38:03.235456 13504 net.cpp:338] res4a_branch2b/bn does not need backward computation. I1106 16:38:03.235458 13504 net.cpp:338] res4a_branch2b does not need backward computation. I1106 16:38:03.235461 13504 net.cpp:338] res4a_branch2a/relu does not need backward computation. I1106 16:38:03.235467 13504 net.cpp:338] res4a_branch2a/bn does not need backward computation. I1106 16:38:03.235469 13504 net.cpp:338] res4a_branch2a does not need backward computation. I1106 16:38:03.235471 13504 net.cpp:338] pool3 does not need backward computation. I1106 16:38:03.235473 13504 net.cpp:338] res3a_branch2b/relu does not need backward computation. I1106 16:38:03.235476 13504 net.cpp:338] res3a_branch2b/bn does not need backward computation. I1106 16:38:03.235477 13504 net.cpp:338] res3a_branch2b does not need backward computation. I1106 16:38:03.235479 13504 net.cpp:338] res3a_branch2a/relu does not need backward computation. I1106 16:38:03.235482 13504 net.cpp:338] res3a_branch2a/bn does not need backward computation. I1106 16:38:03.235483 13504 net.cpp:338] res3a_branch2a does not need backward computation. I1106 16:38:03.235486 13504 net.cpp:338] pool2 does not need backward computation. I1106 16:38:03.235488 13504 net.cpp:338] res2a_branch2b/relu does not need backward computation. 
I1106 16:38:03.235491 13504 net.cpp:338] res2a_branch2b/bn does not need backward computation. I1106 16:38:03.235492 13504 net.cpp:338] res2a_branch2b does not need backward computation. I1106 16:38:03.235494 13504 net.cpp:338] res2a_branch2a/relu does not need backward computation. I1106 16:38:03.235496 13504 net.cpp:338] res2a_branch2a/bn does not need backward computation. I1106 16:38:03.235498 13504 net.cpp:338] res2a_branch2a does not need backward computation. I1106 16:38:03.235502 13504 net.cpp:338] pool1 does not need backward computation. I1106 16:38:03.235502 13504 net.cpp:338] conv1b/relu does not need backward computation. I1106 16:38:03.235505 13504 net.cpp:338] conv1b/bn does not need backward computation. I1106 16:38:03.235507 13504 net.cpp:338] conv1b does not need backward computation. I1106 16:38:03.235508 13504 net.cpp:338] conv1a/relu does not need backward computation. I1106 16:38:03.235510 13504 net.cpp:338] conv1a/bn does not need backward computation. I1106 16:38:03.235512 13504 net.cpp:338] conv1a does not need backward computation. I1106 16:38:03.235514 13504 net.cpp:338] data/bias does not need backward computation. I1106 16:38:03.235520 13504 net.cpp:338] data_data_0_split does not need backward computation. I1106 16:38:03.235522 13504 net.cpp:338] data does not need backward computation. 
I1106 16:38:03.235525 13504 net.cpp:380] This network produces output ctx_output5 I1106 16:38:03.235528 13504 net.cpp:380] This network produces output detection_eval I1106 16:38:03.235610 13504 net.cpp:403] Top memory (TEST) required for data: 1011843208 diff: 1011843208 I1106 16:38:03.235620 13504 net.cpp:406] Bottom memory (TEST) required for data: 1011794016 diff: 1011794016 I1106 16:38:03.235627 13504 net.cpp:409] Shared (in-place) memory (TEST) by data: 498106368 diff: 498106368 I1106 16:38:03.235632 13504 net.cpp:412] Parameters memory (TEST) required for data: 11946688 diff: 11946688 I1106 16:38:03.235637 13504 net.cpp:415] Parameters shared memory (TEST) by data: 0 diff: 0 I1106 16:38:03.235641 13504 net.cpp:421] Network initialization done. I1106 16:38:03.236054 13504 solver.cpp:55] Solver scaffolding done. I1106 16:38:03.238921 13504 caffe.cpp:158] Finetuning from training/imagenet_jacintonet11v2_iter_320000.caffemodel I1106 16:38:03.247213 13504 net.cpp:1153] Copying source layer data Type:Data #blobs=0 I1106 16:38:03.247318 13504 net.cpp:1153] Copying source layer data/bias Type:Bias #blobs=1 I1106 16:38:03.247339 13504 net.cpp:1153] Copying source layer conv1a Type:Convolution #blobs=2 I1106 16:38:03.247366 13504 net.cpp:1153] Copying source layer conv1a/bn Type:BatchNorm #blobs=5 I1106 16:38:03.247624 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.247639 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.247648 13504 net.cpp:1153] Copying source layer conv1a/relu Type:ReLU #blobs=0 I1106 16:38:03.247658 13504 net.cpp:1153] Copying source layer conv1b Type:Convolution #blobs=2 I1106 16:38:03.247725 13504 net.cpp:1153] Copying source layer conv1b/bn Type:BatchNorm #blobs=5 I1106 16:38:03.247853 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.247865 13504 net.cpp:1172] BN Transforming to new format completed. 
I1106 16:38:03.247874 13504 net.cpp:1153] Copying source layer conv1b/relu Type:ReLU #blobs=0 I1106 16:38:03.247884 13504 net.cpp:1153] Copying source layer pool1 Type:Pooling #blobs=0 I1106 16:38:03.247892 13504 net.cpp:1153] Copying source layer res2a_branch2a Type:Convolution #blobs=2 I1106 16:38:03.248704 13504 net.cpp:1153] Copying source layer res2a_branch2a/bn Type:BatchNorm #blobs=5 I1106 16:38:03.248828 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.248831 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.248833 13504 net.cpp:1153] Copying source layer res2a_branch2a/relu Type:ReLU #blobs=0 I1106 16:38:03.248842 13504 net.cpp:1153] Copying source layer res2a_branch2b Type:Convolution #blobs=2 I1106 16:38:03.248908 13504 net.cpp:1153] Copying source layer res2a_branch2b/bn Type:BatchNorm #blobs=5 I1106 16:38:03.249063 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.249068 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.249069 13504 net.cpp:1153] Copying source layer res2a_branch2b/relu Type:ReLU #blobs=0 I1106 16:38:03.249071 13504 net.cpp:1153] Copying source layer pool2 Type:Pooling #blobs=0 I1106 16:38:03.249080 13504 net.cpp:1153] Copying source layer res3a_branch2a Type:Convolution #blobs=2 I1106 16:38:03.249554 13504 net.cpp:1153] Copying source layer res3a_branch2a/bn Type:BatchNorm #blobs=5 I1106 16:38:03.249646 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.249655 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.249660 13504 net.cpp:1153] Copying source layer res3a_branch2a/relu Type:ReLU #blobs=0 I1106 16:38:03.249665 13504 net.cpp:1153] Copying source layer res3a_branch2b Type:Convolution #blobs=2 I1106 16:38:03.249899 13504 net.cpp:1153] Copying source layer res3a_branch2b/bn Type:BatchNorm #blobs=5 I1106 16:38:03.249985 13504 net.cpp:1166] BN legacy DIGITS format detected ... 
I1106 16:38:03.249989 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.249990 13504 net.cpp:1153] Copying source layer res3a_branch2b/relu Type:ReLU #blobs=0 I1106 16:38:03.249994 13504 net.cpp:1153] Copying source layer pool3 Type:Pooling #blobs=0 I1106 16:38:03.250000 13504 net.cpp:1153] Copying source layer res4a_branch2a Type:Convolution #blobs=2 I1106 16:38:03.251884 13504 net.cpp:1153] Copying source layer res4a_branch2a/bn Type:BatchNorm #blobs=5 I1106 16:38:03.252149 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.252163 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.252169 13504 net.cpp:1153] Copying source layer res4a_branch2a/relu Type:ReLU #blobs=0 I1106 16:38:03.252176 13504 net.cpp:1153] Copying source layer res4a_branch2b Type:Convolution #blobs=2 I1106 16:38:03.253118 13504 net.cpp:1153] Copying source layer res4a_branch2b/bn Type:BatchNorm #blobs=5 I1106 16:38:03.253253 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.253258 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.253260 13504 net.cpp:1153] Copying source layer res4a_branch2b/relu Type:ReLU #blobs=0 I1106 16:38:03.253262 13504 net.cpp:1153] Copying source layer pool4 Type:Pooling #blobs=0 I1106 16:38:03.253265 13504 net.cpp:1153] Copying source layer res5a_branch2a Type:Convolution #blobs=2 I1106 16:38:03.261678 13504 net.cpp:1153] Copying source layer res5a_branch2a/bn Type:BatchNorm #blobs=5 I1106 16:38:03.261842 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.261845 13504 net.cpp:1172] BN Transforming to new format completed. 
I1106 16:38:03.261848 13504 net.cpp:1153] Copying source layer res5a_branch2a/relu Type:ReLU #blobs=0 I1106 16:38:03.261850 13504 net.cpp:1153] Copying source layer res5a_branch2b Type:Convolution #blobs=2 I1106 16:38:03.265882 13504 net.cpp:1153] Copying source layer res5a_branch2b/bn Type:BatchNorm #blobs=5 I1106 16:38:03.266013 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.266017 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.266019 13504 net.cpp:1153] Copying source layer res5a_branch2b/relu Type:ReLU #blobs=0 I1106 16:38:03.266021 13504 net.cpp:1137] Ignoring source layer pool5 I1106 16:38:03.266022 13504 net.cpp:1137] Ignoring source layer fc1000 I1106 16:38:03.266024 13504 net.cpp:1137] Ignoring source layer loss I1106 16:38:03.270052 13504 net.cpp:1153] Copying source layer data Type:Data #blobs=0 I1106 16:38:03.270078 13504 net.cpp:1153] Copying source layer data/bias Type:Bias #blobs=1 I1106 16:38:03.270087 13504 net.cpp:1153] Copying source layer conv1a Type:Convolution #blobs=2 I1106 16:38:03.270107 13504 net.cpp:1153] Copying source layer conv1a/bn Type:BatchNorm #blobs=5 I1106 16:38:03.270259 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.270268 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.270273 13504 net.cpp:1153] Copying source layer conv1a/relu Type:ReLU #blobs=0 I1106 16:38:03.270280 13504 net.cpp:1153] Copying source layer conv1b Type:Convolution #blobs=2 I1106 16:38:03.270301 13504 net.cpp:1153] Copying source layer conv1b/bn Type:BatchNorm #blobs=5 I1106 16:38:03.270407 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.270416 13504 net.cpp:1172] BN Transforming to new format completed. 
I1106 16:38:03.270421 13504 net.cpp:1153] Copying source layer conv1b/relu Type:ReLU #blobs=0 I1106 16:38:03.270426 13504 net.cpp:1153] Copying source layer pool1 Type:Pooling #blobs=0 I1106 16:38:03.270432 13504 net.cpp:1153] Copying source layer res2a_branch2a Type:Convolution #blobs=2 I1106 16:38:03.270552 13504 net.cpp:1153] Copying source layer res2a_branch2a/bn Type:BatchNorm #blobs=5 I1106 16:38:03.270668 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.270674 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.270679 13504 net.cpp:1153] Copying source layer res2a_branch2a/relu Type:ReLU #blobs=0 I1106 16:38:03.270685 13504 net.cpp:1153] Copying source layer res2a_branch2b Type:Convolution #blobs=2 I1106 16:38:03.270750 13504 net.cpp:1153] Copying source layer res2a_branch2b/bn Type:BatchNorm #blobs=5 I1106 16:38:03.270859 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.270861 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.270864 13504 net.cpp:1153] Copying source layer res2a_branch2b/relu Type:ReLU #blobs=0 I1106 16:38:03.270866 13504 net.cpp:1153] Copying source layer pool2 Type:Pooling #blobs=0 I1106 16:38:03.270869 13504 net.cpp:1153] Copying source layer res3a_branch2a Type:Convolution #blobs=2 I1106 16:38:03.271329 13504 net.cpp:1153] Copying source layer res3a_branch2a/bn Type:BatchNorm #blobs=5 I1106 16:38:03.271471 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.271476 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.271477 13504 net.cpp:1153] Copying source layer res3a_branch2a/relu Type:ReLU #blobs=0 I1106 16:38:03.271478 13504 net.cpp:1153] Copying source layer res3a_branch2b Type:Convolution #blobs=2 I1106 16:38:03.271715 13504 net.cpp:1153] Copying source layer res3a_branch2b/bn Type:BatchNorm #blobs=5 I1106 16:38:03.271798 13504 net.cpp:1166] BN legacy DIGITS format detected ... 
I1106 16:38:03.271801 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.271803 13504 net.cpp:1153] Copying source layer res3a_branch2b/relu Type:ReLU #blobs=0 I1106 16:38:03.271806 13504 net.cpp:1153] Copying source layer pool3 Type:Pooling #blobs=0 I1106 16:38:03.271807 13504 net.cpp:1153] Copying source layer res4a_branch2a Type:Convolution #blobs=2 I1106 16:38:03.273629 13504 net.cpp:1153] Copying source layer res4a_branch2a/bn Type:BatchNorm #blobs=5 I1106 16:38:03.273753 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.273767 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.273769 13504 net.cpp:1153] Copying source layer res4a_branch2a/relu Type:ReLU #blobs=0 I1106 16:38:03.273772 13504 net.cpp:1153] Copying source layer res4a_branch2b Type:Convolution #blobs=2 I1106 16:38:03.274688 13534 data_reader.cpp:320] Restarting data pre-fetching I1106 16:38:03.274713 13504 net.cpp:1153] Copying source layer res4a_branch2b/bn Type:BatchNorm #blobs=5 I1106 16:38:03.274823 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.274827 13504 net.cpp:1172] BN Transforming to new format completed. I1106 16:38:03.274828 13504 net.cpp:1153] Copying source layer res4a_branch2b/relu Type:ReLU #blobs=0 I1106 16:38:03.274845 13504 net.cpp:1153] Copying source layer pool4 Type:Pooling #blobs=0 I1106 16:38:03.274848 13504 net.cpp:1153] Copying source layer res5a_branch2a Type:Convolution #blobs=2 I1106 16:38:03.282217 13504 net.cpp:1153] Copying source layer res5a_branch2a/bn Type:BatchNorm #blobs=5 I1106 16:38:03.282354 13504 net.cpp:1166] BN legacy DIGITS format detected ... I1106 16:38:03.282358 13504 net.cpp:1172] BN Transforming to new format completed. 
I1106 16:38:03.282361 13504 net.cpp:1153] Copying source layer res5a_branch2a/relu Type:ReLU #blobs=0
I1106 16:38:03.282363 13504 net.cpp:1153] Copying source layer res5a_branch2b Type:Convolution #blobs=2
I1106 16:38:03.286025 13504 net.cpp:1153] Copying source layer res5a_branch2b/bn Type:BatchNorm #blobs=5
I1106 16:38:03.286170 13504 net.cpp:1166] BN legacy DIGITS format detected ...
I1106 16:38:03.286180 13504 net.cpp:1172] BN Transforming to new format completed.
I1106 16:38:03.286185 13504 net.cpp:1153] Copying source layer res5a_branch2b/relu Type:ReLU #blobs=0
I1106 16:38:03.286190 13504 net.cpp:1137] Ignoring source layer pool5
I1106 16:38:03.286195 13504 net.cpp:1137] Ignoring source layer fc1000
I1106 16:38:03.286200 13504 net.cpp:1137] Ignoring source layer loss
I1106 16:38:03.286268 13504 caffe.cpp:260] Starting Optimization
I1106 16:38:03.286281 13504 solver.cpp:453] Solving ssdJacintoNetV2
I1106 16:38:03.286286 13504 solver.cpp:454] Learning Rate Policy: poly
I1106 16:38:03.286314 13504 net.cpp:1483] [0] Reserving 11927808 bytes of shared learnable space for type FLOAT
I1106 16:38:03.302136 13504 solver.cpp:269] Initial Test started...
I1106 16:38:03.302162 13504 solver.cpp:635] Iteration 0, Testing net (#0)
I1106 16:38:03.303028 13536 common.cpp:528] NVML initialized, thread 13536
I1106 16:38:03.303880 13504 net.cpp:1071] Ignoring source layer mbox_loss
I1106 16:38:03.345955 13536 common.cpp:550] NVML succeeded to set CPU affinity on device 0, thread 13536
F1106 16:38:03.368181 13504 solver.cpp:668] Check failed: result[j]->width() == 5 (3 vs. 5)
*** Check failure stack trace: ***
    @     0x7f3070c535cd  google::LogMessage::Fail()
    @     0x7f3070c55433  google::LogMessage::SendToLog()
    @     0x7f3070c5315b  google::LogMessage::Flush()
    @     0x7f3070c55e1e  google::LogMessageFatal::~LogMessageFatal()
    @     0x7f3071cff728  caffe::Solver::TestDetection()
    @     0x7f3071d00547  caffe::Solver::TestAll()
    @     0x7f3071d010ac  caffe::Solver::Step()
    @     0x7f3071d03202  caffe::Solver::Solve()
    @           0x41053d  train()
    @           0x40d1f0  main
    @     0x7f306f3d5830  __libc_start_main
    @           0x40de89  _start
    @              (nil)  (unknown)
I1106 16:38:03.848222 13539 caffe.cpp:902] This is NVCaffe 0.17.0 started at Wed Nov 6 16:38:03 2019
I1106 16:38:03.848366 13539 caffe.cpp:904] CuDNN version: 7601
I1106 16:38:03.848371 13539 caffe.cpp:905] CuBLAS version: 10201
I1106 16:38:03.848371 13539 caffe.cpp:906] CUDA version: 10010
I1106 16:38:03.848373 13539 caffe.cpp:907] CUDA driver version: 10010
I1106 16:38:03.848390 13539 caffe.cpp:908] Arguments: [0]: /home/liuyuyuan/caffe-jacinto/build/tools/caffe.bin [1]: train [2]: --solver=training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/l1reg/solver.prototxt [3]: --weights=training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/initial/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel [4]: --gpu [5]: 0
I1106 16:38:03.880937 13539 gpu_memory.cpp:105] GPUMemory::Manager initialized
I1106 16:38:03.881264 13539 gpu_memory.cpp:107] Total memory: 6193479680, Free: 3154903040, dev_info[0]: total=6193479680 free=3154903040
I1106 16:38:03.881269 13539 caffe.cpp:226] Using GPUs 0
I1106 16:38:03.881496 13539 caffe.cpp:230] GPU 0: GeForce GTX 1660 Ti
I1106 16:38:03.881542 13539 solver.cpp:41] Solver data type: FLOAT
I1106 16:38:03.891990 13539 solver.cpp:44] Initializing solver from parameters: train_net: "training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/l1reg/train.prototxt" test_net: "training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/l1reg/test.prototxt" test_iter: 3 test_interval: 2000
base_lr: 0.001 display: 100 max_iter: 120000 lr_policy: "poly" gamma: 0.1 power: 4 momentum: 0.9 weight_decay: 1e-05 snapshot: 2000 snapshot_prefix: "training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/l1reg/ti-custom-cfg1_ssdJacintoNetV2" solver_mode: GPU device_id: 0 random_seed: 33 debug_info: false train_state { level: 0 stage: "" } snapshot_after_train: true regularization_type: "L1" test_initialization: true average_loss: 10 stepvalue: 60000 stepvalue: 9000 stepvalue: 300000 iter_size: 8 type: "SGD" eval_type: "detection" ap_version: "11point" show_per_class_result: true I1106 16:38:03.892076 13539 solver.cpp:76] Creating training net from train_net file: training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/l1reg/train.prototxt I1106 16:38:03.893046 13539 net.cpp:80] Initializing net from parameters: name: "ssdJacintoNetV2" state { phase: TRAIN level: 0 stage: "" } layer { name: "data" type: "AnnotatedData" top: "data" top: "label" include { phase: TRAIN } transform_param { mirror: true mean_value: 0 mean_value: 0 mean_value: 0 force_color: false resize_param { prob: 1 resize_mode: WARP height: 320 width: 768 interp_mode: LINEAR interp_mode: AREA interp_mode: NEAREST interp_mode: CUBIC interp_mode: LANCZOS4 } emit_constraint { emit_type: CENTER } crop_h: 320 crop_w: 768 distort_param { brightness_prob: 0.5 brightness_delta: 32 contrast_prob: 0.5 contrast_lower: 0.5 contrast_upper: 1.5 hue_prob: 0.5 hue_delta: 18 saturation_prob: 0.5 saturation_lower: 0.5 saturation_upper: 1.5 random_order_prob: 0 } expand_param { prob: 0.5 max_expand_ratio: 4 } } data_param { source: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb" batch_size: 4 backend: LMDB threads: 4 parser_threads: 4 } annotated_data_param { batch_sampler { max_sample: 1 max_trials: 1 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 0.1 } 
max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 0.3 } max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 0.5 } max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 0.7 } max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 0.9 } max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { max_jaccard_overlap: 1 } max_sample: 1 max_trials: 50 } label_map_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt" } } layer { name: "data/bias" type: "Bias" bottom: "data" top: "data/bias" param { lr_mult: 0 decay_mult: 0 } bias_param { filler { type: "constant" value: -128 } } } layer { name: "conv1a" type: "Convolution" bottom: "data/bias" top: "conv1a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 2 kernel_size: 5 group: 1 stride: 2 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "conv1a/bn" type: "BatchNorm" bottom: "conv1a" top: "conv1a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "conv1a/relu" type: "ReLU" bottom: "conv1a" top: "conv1a" } layer { name: "conv1b" type: "Convolution" bottom: "conv1a" top: "conv1b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } 
bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "conv1b/bn" type: "BatchNorm" bottom: "conv1b" top: "conv1b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "conv1b/relu" type: "ReLU" bottom: "conv1b" top: "conv1b" } layer { name: "pool1" type: "Pooling" bottom: "conv1b" top: "pool1" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res2a_branch2a" type: "Convolution" bottom: "pool1" top: "res2a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res2a_branch2a/bn" type: "BatchNorm" bottom: "res2a_branch2a" top: "res2a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res2a_branch2a/relu" type: "ReLU" bottom: "res2a_branch2a" top: "res2a_branch2a" } layer { name: "res2a_branch2b" type: "Convolution" bottom: "res2a_branch2a" top: "res2a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res2a_branch2b/bn" type: "BatchNorm" bottom: "res2a_branch2b" top: "res2a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res2a_branch2b/relu" type: "ReLU" bottom: "res2a_branch2b" top: "res2a_branch2b" } layer { name: "pool2" type: "Pooling" bottom: "res2a_branch2b" top: "pool2" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res3a_branch2a" type: "Convolution" bottom: "pool2" top: "res3a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: 
true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res3a_branch2a/bn" type: "BatchNorm" bottom: "res3a_branch2a" top: "res3a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res3a_branch2a/relu" type: "ReLU" bottom: "res3a_branch2a" top: "res3a_branch2a" } layer { name: "res3a_branch2b" type: "Convolution" bottom: "res3a_branch2a" top: "res3a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res3a_branch2b/bn" type: "BatchNorm" bottom: "res3a_branch2b" top: "res3a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res3a_branch2b/relu" type: "ReLU" bottom: "res3a_branch2b" top: "res3a_branch2b" } layer { name: "pool3" type: "Pooling" bottom: "res3a_branch2b" top: "pool3" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res4a_branch2a" type: "Convolution" bottom: "pool3" top: "res4a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res4a_branch2a/bn" type: "BatchNorm" bottom: "res4a_branch2a" top: "res4a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res4a_branch2a/relu" type: "ReLU" bottom: "res4a_branch2a" top: "res4a_branch2a" } layer { name: "res4a_branch2b" type: "Convolution" bottom: "res4a_branch2a" top: "res4a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 
bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res4a_branch2b/bn" type: "BatchNorm" bottom: "res4a_branch2b" top: "res4a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res4a_branch2b/relu" type: "ReLU" bottom: "res4a_branch2b" top: "res4a_branch2b" } layer { name: "pool4" type: "Pooling" bottom: "res4a_branch2b" top: "pool4" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res5a_branch2a" type: "Convolution" bottom: "pool4" top: "res5a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res5a_branch2a/bn" type: "BatchNorm" bottom: "res5a_branch2a" top: "res5a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res5a_branch2a/relu" type: "ReLU" bottom: "res5a_branch2a" top: "res5a_branch2a" } layer { name: "res5a_branch2b" type: "Convolution" bottom: "res5a_branch2a" top: "res5a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res5a_branch2b/bn" type: "BatchNorm" bottom: "res5a_branch2b" top: "res5a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res5a_branch2b/relu" type: "ReLU" bottom: "res5a_branch2b" top: "res5a_branch2b" } layer { name: "pool6" type: "Pooling" bottom: "res5a_branch2b" top: "pool6" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "pool7" type: "Pooling" bottom: "pool6" 
top: "pool7" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "pool8" type: "Pooling" bottom: "pool7" top: "pool8" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "ctx_output1" type: "Convolution" bottom: "res4a_branch2b" top: "ctx_output1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu" type: "ReLU" bottom: "ctx_output1" top: "ctx_output1" } layer { name: "ctx_output2" type: "Convolution" bottom: "res5a_branch2b" top: "ctx_output2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu" type: "ReLU" bottom: "ctx_output2" top: "ctx_output2" } layer { name: "ctx_output3" type: "Convolution" bottom: "pool6" top: "ctx_output3" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu" type: "ReLU" bottom: "ctx_output3" top: "ctx_output3" } layer { name: "ctx_output4" type: "Convolution" bottom: "pool7" top: "ctx_output4" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu" type: "ReLU" bottom: "ctx_output4" top: "ctx_output4" } layer { name: "ctx_output5" type: "Convolution" bottom: "pool8" top: 
"ctx_output5" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output5/relu" type: "ReLU" bottom: "ctx_output5" top: "ctx_output5" } layer { name: "ctx_output1/relu_mbox_loc" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_loc" top: "ctx_output1/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output1/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_loc_perm" top: "ctx_output1/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output1/relu_mbox_conf" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_conf" top: "ctx_output1/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output1/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_conf_perm" top: "ctx_output1/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output1/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output1" bottom: "data" top: "ctx_output1/relu_mbox_priorbox" prior_box_param { min_size: 14.72 
max_size: 36.8 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output2/relu_mbox_loc" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_loc" top: "ctx_output2/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output2/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_loc_perm" top: "ctx_output2/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output2/relu_mbox_conf" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_conf" top: "ctx_output2/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output2/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_conf_perm" top: "ctx_output2/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output2/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output2" bottom: "data" top: "ctx_output2/relu_mbox_priorbox" prior_box_param { min_size: 36.8 max_size: 132.48 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output3/relu_mbox_loc" type: "Convolution" bottom: "ctx_output3" 
top: "ctx_output3/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_loc" top: "ctx_output3/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output3/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_loc_perm" top: "ctx_output3/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output3/relu_mbox_conf" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_conf" top: "ctx_output3/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output3/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_conf_perm" top: "ctx_output3/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output3/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output3" bottom: "data" top: "ctx_output3/relu_mbox_priorbox" prior_box_param { min_size: 132.48 max_size: 228.16 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output4/relu_mbox_loc" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { 
type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output4/relu_mbox_loc" top: "ctx_output4/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output4/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_loc_perm" top: "ctx_output4/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output4/relu_mbox_conf" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output4/relu_mbox_conf" top: "ctx_output4/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output4/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_conf_perm" top: "ctx_output4/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output4/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output4" bottom: "data" top: "ctx_output4/relu_mbox_priorbox" prior_box_param { min_size: 228.16 max_size: 323.84 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "mbox_loc" type: "Concat" bottom: "ctx_output1/relu_mbox_loc_flat" bottom: "ctx_output2/relu_mbox_loc_flat" bottom: "ctx_output3/relu_mbox_loc_flat" bottom: "ctx_output4/relu_mbox_loc_flat" top: "mbox_loc" concat_param { axis: 1 } } layer { name: "mbox_conf" type: "Concat" bottom: "ctx_output1/relu_mbox_conf_flat" bottom: "ctx_output2/relu_mbox_conf_flat" bottom: "ctx_output3/relu_mbox_conf_flat" bottom: "ctx_output4/relu_mbox_conf_flat" top: "mbox_conf" concat_param { axis: 1 } } layer { name: 
"mbox_priorbox" type: "Concat" bottom: "ctx_output1/relu_mbox_priorbox" bottom: "ctx_output2/relu_mbox_priorbox" bottom: "ctx_output3/relu_mbox_priorbox" bottom: "ctx_output4/relu_mbox_priorbox" top: "mbox_priorbox" concat_param { axis: 2 } } layer { name: "mbox_loss" type: "MultiBoxLoss" bottom: "mbox_loc" bottom: "mbox_conf" bottom: "mbox_priorbox" bottom: "label" top: "mbox_loss" include { phase: TRAIN } propagate_down: true propagate_down: true propagate_down: false propagate_down: false loss_param { normalization: VALID } multibox_loss_param { loc_loss_type: SMOOTH_L1 conf_loss_type: SOFTMAX loc_weight: 1 num_classes: 2 share_location: true match_type: PER_PREDICTION overlap_threshold: 0.5 use_prior_for_matching: true background_label_id: 0 use_difficult_gt: true neg_pos_ratio: 3 neg_overlap: 0.5 code_type: CENTER_SIZE ignore_cross_boundary_bbox: false mining_type: MAX_NEGATIVE ignore_difficult_gt: false } } I1106 16:38:03.893316 13539 net.cpp:110] Using FLOAT as default forward math type I1106 16:38:03.893329 13539 net.cpp:116] Using FLOAT as default backward math type I1106 16:38:03.893337 13539 layer_factory.hpp:172] Creating layer 'data' of type 'AnnotatedData' I1106 16:38:03.893343 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.893415 13539 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0 I1106 16:38:03.893780 13552 blocking_queue.cpp:40] Data layer prefetch queue empty I1106 16:38:03.893796 13539 net.cpp:200] Created Layer data (0) I1106 16:38:03.893802 13539 net.cpp:542] data -> data I1106 16:38:03.893813 13539 net.cpp:542] data -> label I1106 16:38:03.893832 13539 data_reader.cpp:58] Data Reader threads: 4, out queues: 16, depth: 4 I1106 16:38:03.893867 13539 internal_thread.cpp:19] Starting 4 internal thread(s) on device 0 I1106 16:38:03.894215 13554 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb I1106 
16:38:03.894457 13555 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb I1106 16:38:03.894711 13556 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb I1106 16:38:03.895006 13553 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb I1106 16:38:03.895813 13539 annotated_data_layer.cpp:105] output data size: 4,3,320,768 I1106 16:38:03.896001 13539 annotated_data_layer.cpp:150] [0] Output data size: 4, 3, 320, 768 I1106 16:38:03.896045 13539 internal_thread.cpp:19] Starting 4 internal thread(s) on device 0 I1106 16:38:03.896394 13558 data_layer.cpp:105] [0] Parser threads: 4 I1106 16:38:03.896401 13558 data_layer.cpp:107] [0] Transformer threads: 4 I1106 16:38:03.896929 13539 net.cpp:260] Setting up data I1106 16:38:03.899550 13539 net.cpp:267] TRAIN Top shape for layer 0 'data' 4 3 320 768 (2949120) I1106 16:38:03.899636 13539 net.cpp:267] TRAIN Top shape for layer 0 'data' 1 1 5 8 (40) I1106 16:38:03.899657 13539 layer_factory.hpp:172] Creating layer 'data_data_0_split' of type 'Split' I1106 16:38:03.899668 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.899716 13539 net.cpp:200] Created Layer data_data_0_split (1) I1106 16:38:03.899727 13539 net.cpp:572] data_data_0_split <- data I1106 16:38:03.899776 13539 net.cpp:542] data_data_0_split -> data_data_0_split_0 I1106 16:38:03.899785 13539 net.cpp:542] data_data_0_split -> data_data_0_split_1 I1106 16:38:03.899791 13539 net.cpp:542] data_data_0_split -> data_data_0_split_2 I1106 16:38:03.899796 13539 net.cpp:542] data_data_0_split -> data_data_0_split_3 I1106 16:38:03.899801 13539 net.cpp:542] data_data_0_split -> data_data_0_split_4 I1106 16:38:03.908958 13539 net.cpp:260] Setting up data_data_0_split I1106 16:38:03.909003 13539 net.cpp:267] TRAIN Top shape for layer 1 
'data_data_0_split' 4 3 320 768 (2949120) I1106 16:38:03.909008 13539 net.cpp:267] TRAIN Top shape for layer 1 'data_data_0_split' 4 3 320 768 (2949120) I1106 16:38:03.909009 13539 net.cpp:267] TRAIN Top shape for layer 1 'data_data_0_split' 4 3 320 768 (2949120) I1106 16:38:03.909013 13539 net.cpp:267] TRAIN Top shape for layer 1 'data_data_0_split' 4 3 320 768 (2949120) I1106 16:38:03.909014 13539 net.cpp:267] TRAIN Top shape for layer 1 'data_data_0_split' 4 3 320 768 (2949120) I1106 16:38:03.909021 13539 layer_factory.hpp:172] Creating layer 'data/bias' of type 'Bias' I1106 16:38:03.909029 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.909051 13539 net.cpp:200] Created Layer data/bias (2) I1106 16:38:03.909055 13539 net.cpp:572] data/bias <- data_data_0_split_0 I1106 16:38:03.909062 13539 net.cpp:542] data/bias -> data/bias I1106 16:38:03.909353 13539 net.cpp:260] Setting up data/bias I1106 16:38:03.909368 13539 net.cpp:267] TRAIN Top shape for layer 2 'data/bias' 4 3 320 768 (2949120) I1106 16:38:03.909397 13539 layer_factory.hpp:172] Creating layer 'conv1a' of type 'Convolution' I1106 16:38:03.909401 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:03.909458 13539 net.cpp:200] Created Layer conv1a (3) I1106 16:38:03.909461 13539 net.cpp:572] conv1a <- data/bias I1106 16:38:03.909464 13539 net.cpp:542] conv1a -> conv1a I1106 16:38:04.017482 13553 data_reader.cpp:320] Restarting data pre-fetching I1106 16:38:05.304039 13539 net.cpp:260] Setting up conv1a I1106 16:38:05.304095 13539 net.cpp:267] TRAIN Top shape for layer 3 'conv1a' 4 32 160 384 (7864320) I1106 16:38:05.304113 13539 layer_factory.hpp:172] Creating layer 'conv1a/bn' of type 'BatchNorm' I1106 16:38:05.304122 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.304139 13539 net.cpp:200] Created Layer conv1a/bn (4) I1106 
16:38:05.304147 13539 net.cpp:572] conv1a/bn <- conv1a I1106 16:38:05.304153 13539 net.cpp:527] conv1a/bn -> conv1a (in-place) I1106 16:38:05.304447 13539 net.cpp:260] Setting up conv1a/bn I1106 16:38:05.304461 13539 net.cpp:267] TRAIN Top shape for layer 4 'conv1a/bn' 4 32 160 384 (7864320) I1106 16:38:05.304473 13539 layer_factory.hpp:172] Creating layer 'conv1a/relu' of type 'ReLU' I1106 16:38:05.304481 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.304488 13539 net.cpp:200] Created Layer conv1a/relu (5) I1106 16:38:05.304493 13539 net.cpp:572] conv1a/relu <- conv1a I1106 16:38:05.304499 13539 net.cpp:527] conv1a/relu -> conv1a (in-place) I1106 16:38:05.304515 13539 net.cpp:260] Setting up conv1a/relu I1106 16:38:05.304522 13539 net.cpp:267] TRAIN Top shape for layer 5 'conv1a/relu' 4 32 160 384 (7864320) I1106 16:38:05.304528 13539 layer_factory.hpp:172] Creating layer 'conv1b' of type 'Convolution' I1106 16:38:05.304545 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.304579 13539 net.cpp:200] Created Layer conv1b (6) I1106 16:38:05.304585 13539 net.cpp:572] conv1b <- conv1a I1106 16:38:05.304591 13539 net.cpp:542] conv1b -> conv1b I1106 16:38:05.305315 13539 net.cpp:260] Setting up conv1b I1106 16:38:05.305325 13539 net.cpp:267] TRAIN Top shape for layer 6 'conv1b' 4 32 160 384 (7864320) I1106 16:38:05.305331 13539 layer_factory.hpp:172] Creating layer 'conv1b/bn' of type 'BatchNorm' I1106 16:38:05.305333 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.305341 13539 net.cpp:200] Created Layer conv1b/bn (7) I1106 16:38:05.305342 13539 net.cpp:572] conv1b/bn <- conv1b I1106 16:38:05.305346 13539 net.cpp:527] conv1b/bn -> conv1b (in-place) I1106 16:38:05.305580 13539 net.cpp:260] Setting up conv1b/bn I1106 16:38:05.305585 13539 net.cpp:267] TRAIN Top shape for layer 7 'conv1b/bn' 
4 32 160 384 (7864320) I1106 16:38:05.305590 13539 layer_factory.hpp:172] Creating layer 'conv1b/relu' of type 'ReLU' I1106 16:38:05.305603 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.305610 13539 net.cpp:200] Created Layer conv1b/relu (8) I1106 16:38:05.305616 13539 net.cpp:572] conv1b/relu <- conv1b I1106 16:38:05.305622 13539 net.cpp:527] conv1b/relu -> conv1b (in-place) I1106 16:38:05.305629 13539 net.cpp:260] Setting up conv1b/relu I1106 16:38:05.305636 13539 net.cpp:267] TRAIN Top shape for layer 8 'conv1b/relu' 4 32 160 384 (7864320) I1106 16:38:05.305642 13539 layer_factory.hpp:172] Creating layer 'pool1' of type 'Pooling' I1106 16:38:05.305647 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.305658 13539 net.cpp:200] Created Layer pool1 (9) I1106 16:38:05.305665 13539 net.cpp:572] pool1 <- conv1b I1106 16:38:05.305671 13539 net.cpp:542] pool1 -> pool1 I1106 16:38:05.305716 13539 net.cpp:260] Setting up pool1 I1106 16:38:05.305721 13539 net.cpp:267] TRAIN Top shape for layer 9 'pool1' 4 32 80 192 (1966080) I1106 16:38:05.305722 13539 layer_factory.hpp:172] Creating layer 'res2a_branch2a' of type 'Convolution' I1106 16:38:05.305730 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.305742 13539 net.cpp:200] Created Layer res2a_branch2a (10) I1106 16:38:05.305749 13539 net.cpp:572] res2a_branch2a <- pool1 I1106 16:38:05.305755 13539 net.cpp:542] res2a_branch2a -> res2a_branch2a I1106 16:38:05.306567 13539 net.cpp:260] Setting up res2a_branch2a I1106 16:38:05.306584 13539 net.cpp:267] TRAIN Top shape for layer 10 'res2a_branch2a' 4 64 80 192 (3932160) I1106 16:38:05.306596 13539 layer_factory.hpp:172] Creating layer 'res2a_branch2a/bn' of type 'BatchNorm' I1106 16:38:05.306602 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 
16:38:05.306612 13539 net.cpp:200] Created Layer res2a_branch2a/bn (11) I1106 16:38:05.306617 13539 net.cpp:572] res2a_branch2a/bn <- res2a_branch2a I1106 16:38:05.306625 13539 net.cpp:527] res2a_branch2a/bn -> res2a_branch2a (in-place) I1106 16:38:05.306840 13539 net.cpp:260] Setting up res2a_branch2a/bn I1106 16:38:05.306851 13539 net.cpp:267] TRAIN Top shape for layer 11 'res2a_branch2a/bn' 4 64 80 192 (3932160) I1106 16:38:05.306862 13539 layer_factory.hpp:172] Creating layer 'res2a_branch2a/relu' of type 'ReLU' I1106 16:38:05.306869 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.306877 13539 net.cpp:200] Created Layer res2a_branch2a/relu (12) I1106 16:38:05.306882 13539 net.cpp:572] res2a_branch2a/relu <- res2a_branch2a I1106 16:38:05.306890 13539 net.cpp:527] res2a_branch2a/relu -> res2a_branch2a (in-place) I1106 16:38:05.306896 13539 net.cpp:260] Setting up res2a_branch2a/relu I1106 16:38:05.306903 13539 net.cpp:267] TRAIN Top shape for layer 12 'res2a_branch2a/relu' 4 64 80 192 (3932160) I1106 16:38:05.306910 13539 layer_factory.hpp:172] Creating layer 'res2a_branch2b' of type 'Convolution' I1106 16:38:05.306926 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.306941 13539 net.cpp:200] Created Layer res2a_branch2b (13) I1106 16:38:05.306946 13539 net.cpp:572] res2a_branch2b <- res2a_branch2a I1106 16:38:05.306947 13539 net.cpp:542] res2a_branch2b -> res2a_branch2b I1106 16:38:05.307173 13539 net.cpp:260] Setting up res2a_branch2b I1106 16:38:05.307179 13539 net.cpp:267] TRAIN Top shape for layer 13 'res2a_branch2b' 4 64 80 192 (3932160) I1106 16:38:05.307183 13539 layer_factory.hpp:172] Creating layer 'res2a_branch2b/bn' of type 'BatchNorm' I1106 16:38:05.307193 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.307202 13539 net.cpp:200] Created Layer res2a_branch2b/bn (14) 
I1106 16:38:05.307209 13539 net.cpp:572] res2a_branch2b/bn <- res2a_branch2b I1106 16:38:05.307214 13539 net.cpp:527] res2a_branch2b/bn -> res2a_branch2b (in-place) I1106 16:38:05.307425 13539 net.cpp:260] Setting up res2a_branch2b/bn I1106 16:38:05.307430 13539 net.cpp:267] TRAIN Top shape for layer 14 'res2a_branch2b/bn' 4 64 80 192 (3932160) I1106 16:38:05.307435 13539 layer_factory.hpp:172] Creating layer 'res2a_branch2b/relu' of type 'ReLU' I1106 16:38:05.307444 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.307451 13539 net.cpp:200] Created Layer res2a_branch2b/relu (15) I1106 16:38:05.307456 13539 net.cpp:572] res2a_branch2b/relu <- res2a_branch2b I1106 16:38:05.307463 13539 net.cpp:527] res2a_branch2b/relu -> res2a_branch2b (in-place) I1106 16:38:05.307471 13539 net.cpp:260] Setting up res2a_branch2b/relu I1106 16:38:05.307476 13539 net.cpp:267] TRAIN Top shape for layer 15 'res2a_branch2b/relu' 4 64 80 192 (3932160) I1106 16:38:05.307482 13539 layer_factory.hpp:172] Creating layer 'pool2' of type 'Pooling' I1106 16:38:05.307488 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.307497 13539 net.cpp:200] Created Layer pool2 (16) I1106 16:38:05.307503 13539 net.cpp:572] pool2 <- res2a_branch2b I1106 16:38:05.307509 13539 net.cpp:542] pool2 -> pool2 I1106 16:38:05.307541 13539 net.cpp:260] Setting up pool2 I1106 16:38:05.307546 13539 net.cpp:267] TRAIN Top shape for layer 16 'pool2' 4 64 40 96 (983040) I1106 16:38:05.307549 13539 layer_factory.hpp:172] Creating layer 'res3a_branch2a' of type 'Convolution' I1106 16:38:05.307556 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.307569 13539 net.cpp:200] Created Layer res3a_branch2a (17) I1106 16:38:05.307576 13539 net.cpp:572] res3a_branch2a <- pool2 I1106 16:38:05.307581 13539 net.cpp:542] res3a_branch2a -> res3a_branch2a I1106 
16:38:05.308228 13539 net.cpp:260] Setting up res3a_branch2a I1106 16:38:05.308236 13539 net.cpp:267] TRAIN Top shape for layer 17 'res3a_branch2a' 4 128 40 96 (1966080) I1106 16:38:05.308243 13539 layer_factory.hpp:172] Creating layer 'res3a_branch2a/bn' of type 'BatchNorm' I1106 16:38:05.308245 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.308249 13539 net.cpp:200] Created Layer res3a_branch2a/bn (18) I1106 16:38:05.308259 13539 net.cpp:572] res3a_branch2a/bn <- res3a_branch2a I1106 16:38:05.308262 13539 net.cpp:527] res3a_branch2a/bn -> res3a_branch2a (in-place) I1106 16:38:05.308441 13539 net.cpp:260] Setting up res3a_branch2a/bn I1106 16:38:05.308446 13539 net.cpp:267] TRAIN Top shape for layer 18 'res3a_branch2a/bn' 4 128 40 96 (1966080) I1106 16:38:05.308455 13539 layer_factory.hpp:172] Creating layer 'res3a_branch2a/relu' of type 'ReLU' I1106 16:38:05.308457 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.308460 13539 net.cpp:200] Created Layer res3a_branch2a/relu (19) I1106 16:38:05.308462 13539 net.cpp:572] res3a_branch2a/relu <- res3a_branch2a I1106 16:38:05.308465 13539 net.cpp:527] res3a_branch2a/relu -> res3a_branch2a (in-place) I1106 16:38:05.308475 13539 net.cpp:260] Setting up res3a_branch2a/relu I1106 16:38:05.308478 13539 net.cpp:267] TRAIN Top shape for layer 19 'res3a_branch2a/relu' 4 128 40 96 (1966080) I1106 16:38:05.308480 13539 layer_factory.hpp:172] Creating layer 'res3a_branch2b' of type 'Convolution' I1106 16:38:05.308490 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.308502 13539 net.cpp:200] Created Layer res3a_branch2b (20) I1106 16:38:05.308506 13539 net.cpp:572] res3a_branch2b <- res3a_branch2a I1106 16:38:05.308508 13539 net.cpp:542] res3a_branch2b -> res3a_branch2b I1106 16:38:05.308897 13539 net.cpp:260] Setting up res3a_branch2b I1106 
16:38:05.308903 13539 net.cpp:267] TRAIN Top shape for layer 20 'res3a_branch2b' 4 128 40 96 (1966080) I1106 16:38:05.308907 13539 layer_factory.hpp:172] Creating layer 'res3a_branch2b/bn' of type 'BatchNorm' I1106 16:38:05.308910 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.308917 13539 net.cpp:200] Created Layer res3a_branch2b/bn (21) I1106 16:38:05.308919 13539 net.cpp:572] res3a_branch2b/bn <- res3a_branch2b I1106 16:38:05.308921 13539 net.cpp:527] res3a_branch2b/bn -> res3a_branch2b (in-place) I1106 16:38:05.309103 13539 net.cpp:260] Setting up res3a_branch2b/bn I1106 16:38:05.309108 13539 net.cpp:267] TRAIN Top shape for layer 21 'res3a_branch2b/bn' 4 128 40 96 (1966080) I1106 16:38:05.309114 13539 layer_factory.hpp:172] Creating layer 'res3a_branch2b/relu' of type 'ReLU' I1106 16:38:05.309118 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.309120 13539 net.cpp:200] Created Layer res3a_branch2b/relu (22) I1106 16:38:05.309123 13539 net.cpp:572] res3a_branch2b/relu <- res3a_branch2b I1106 16:38:05.309125 13539 net.cpp:527] res3a_branch2b/relu -> res3a_branch2b (in-place) I1106 16:38:05.309129 13539 net.cpp:260] Setting up res3a_branch2b/relu I1106 16:38:05.309132 13539 net.cpp:267] TRAIN Top shape for layer 22 'res3a_branch2b/relu' 4 128 40 96 (1966080) I1106 16:38:05.309135 13539 layer_factory.hpp:172] Creating layer 'pool3' of type 'Pooling' I1106 16:38:05.309137 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.309142 13539 net.cpp:200] Created Layer pool3 (23) I1106 16:38:05.309144 13539 net.cpp:572] pool3 <- res3a_branch2b I1106 16:38:05.309147 13539 net.cpp:542] pool3 -> pool3 I1106 16:38:05.309175 13539 net.cpp:260] Setting up pool3 I1106 16:38:05.309180 13539 net.cpp:267] TRAIN Top shape for layer 23 'pool3' 4 128 20 48 (491520) I1106 16:38:05.309182 13539 
layer_factory.hpp:172] Creating layer 'res4a_branch2a' of type 'Convolution' I1106 16:38:05.309185 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.309193 13539 net.cpp:200] Created Layer res4a_branch2a (24) I1106 16:38:05.309196 13539 net.cpp:572] res4a_branch2a <- pool3 I1106 16:38:05.309198 13539 net.cpp:542] res4a_branch2a -> res4a_branch2a I1106 16:38:05.311892 13539 net.cpp:260] Setting up res4a_branch2a I1106 16:38:05.311908 13539 net.cpp:267] TRAIN Top shape for layer 24 'res4a_branch2a' 4 256 20 48 (983040) I1106 16:38:05.311913 13539 layer_factory.hpp:172] Creating layer 'res4a_branch2a/bn' of type 'BatchNorm' I1106 16:38:05.311918 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.311925 13539 net.cpp:200] Created Layer res4a_branch2a/bn (25) I1106 16:38:05.311928 13539 net.cpp:572] res4a_branch2a/bn <- res4a_branch2a I1106 16:38:05.311933 13539 net.cpp:527] res4a_branch2a/bn -> res4a_branch2a (in-place) I1106 16:38:05.312136 13539 net.cpp:260] Setting up res4a_branch2a/bn I1106 16:38:05.312142 13539 net.cpp:267] TRAIN Top shape for layer 25 'res4a_branch2a/bn' 4 256 20 48 (983040) I1106 16:38:05.312150 13539 layer_factory.hpp:172] Creating layer 'res4a_branch2a/relu' of type 'ReLU' I1106 16:38:05.312152 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.312170 13539 net.cpp:200] Created Layer res4a_branch2a/relu (26) I1106 16:38:05.312175 13539 net.cpp:572] res4a_branch2a/relu <- res4a_branch2a I1106 16:38:05.312176 13539 net.cpp:527] res4a_branch2a/relu -> res4a_branch2a (in-place) I1106 16:38:05.312182 13539 net.cpp:260] Setting up res4a_branch2a/relu I1106 16:38:05.312186 13539 net.cpp:267] TRAIN Top shape for layer 26 'res4a_branch2a/relu' 4 256 20 48 (983040) I1106 16:38:05.312188 13539 layer_factory.hpp:172] Creating layer 'res4a_branch2b' of type 'Convolution' I1106 
16:38:05.312192 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.312203 13539 net.cpp:200] Created Layer res4a_branch2b (27) I1106 16:38:05.312206 13539 net.cpp:572] res4a_branch2b <- res4a_branch2a I1106 16:38:05.312209 13539 net.cpp:542] res4a_branch2b -> res4a_branch2b I1106 16:38:05.313380 13539 net.cpp:260] Setting up res4a_branch2b I1106 16:38:05.313386 13539 net.cpp:267] TRAIN Top shape for layer 27 'res4a_branch2b' 4 256 20 48 (983040) I1106 16:38:05.313390 13539 layer_factory.hpp:172] Creating layer 'res4a_branch2b/bn' of type 'BatchNorm' I1106 16:38:05.313393 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.313398 13539 net.cpp:200] Created Layer res4a_branch2b/bn (28) I1106 16:38:05.313401 13539 net.cpp:572] res4a_branch2b/bn <- res4a_branch2b I1106 16:38:05.313403 13539 net.cpp:527] res4a_branch2b/bn -> res4a_branch2b (in-place) I1106 16:38:05.313601 13539 net.cpp:260] Setting up res4a_branch2b/bn I1106 16:38:05.313607 13539 net.cpp:267] TRAIN Top shape for layer 28 'res4a_branch2b/bn' 4 256 20 48 (983040) I1106 16:38:05.313613 13539 layer_factory.hpp:172] Creating layer 'res4a_branch2b/relu' of type 'ReLU' I1106 16:38:05.313616 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.313621 13539 net.cpp:200] Created Layer res4a_branch2b/relu (29) I1106 16:38:05.313622 13539 net.cpp:572] res4a_branch2b/relu <- res4a_branch2b I1106 16:38:05.313625 13539 net.cpp:527] res4a_branch2b/relu -> res4a_branch2b (in-place) I1106 16:38:05.313628 13539 net.cpp:260] Setting up res4a_branch2b/relu I1106 16:38:05.313632 13539 net.cpp:267] TRAIN Top shape for layer 29 'res4a_branch2b/relu' 4 256 20 48 (983040) I1106 16:38:05.313634 13539 layer_factory.hpp:172] Creating layer 'res4a_branch2b_res4a_branch2b/relu_0_split' of type 'Split' I1106 16:38:05.313637 13539 layer_factory.hpp:184] Layer's 
types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.313642 13539 net.cpp:200] Created Layer res4a_branch2b_res4a_branch2b/relu_0_split (30) I1106 16:38:05.313644 13539 net.cpp:572] res4a_branch2b_res4a_branch2b/relu_0_split <- res4a_branch2b I1106 16:38:05.313647 13539 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_0 I1106 16:38:05.313652 13539 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_1 I1106 16:38:05.313676 13539 net.cpp:260] Setting up res4a_branch2b_res4a_branch2b/relu_0_split I1106 16:38:05.313680 13539 net.cpp:267] TRAIN Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 4 256 20 48 (983040) I1106 16:38:05.313684 13539 net.cpp:267] TRAIN Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 4 256 20 48 (983040) I1106 16:38:05.313686 13539 layer_factory.hpp:172] Creating layer 'pool4' of type 'Pooling' I1106 16:38:05.313689 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.313697 13539 net.cpp:200] Created Layer pool4 (31) I1106 16:38:05.313700 13539 net.cpp:572] pool4 <- res4a_branch2b_res4a_branch2b/relu_0_split_0 I1106 16:38:05.313704 13539 net.cpp:542] pool4 -> pool4 I1106 16:38:05.313735 13539 net.cpp:260] Setting up pool4 I1106 16:38:05.313740 13539 net.cpp:267] TRAIN Top shape for layer 31 'pool4' 4 256 10 24 (245760) I1106 16:38:05.313741 13539 layer_factory.hpp:172] Creating layer 'res5a_branch2a' of type 'Convolution' I1106 16:38:05.313750 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.313762 13539 net.cpp:200] Created Layer res5a_branch2a (32) I1106 16:38:05.313766 13539 net.cpp:572] res5a_branch2a <- pool4 I1106 16:38:05.313769 13539 net.cpp:542] res5a_branch2a -> res5a_branch2a I1106 16:38:05.323120 13539 net.cpp:260] Setting up res5a_branch2a I1106 
16:38:05.323210 13539 net.cpp:267] TRAIN Top shape for layer 32 'res5a_branch2a' 4 512 10 24 (491520) I1106 16:38:05.323228 13539 layer_factory.hpp:172] Creating layer 'res5a_branch2a/bn' of type 'BatchNorm' I1106 16:38:05.323241 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.323261 13539 net.cpp:200] Created Layer res5a_branch2a/bn (33) I1106 16:38:05.323271 13539 net.cpp:572] res5a_branch2a/bn <- res5a_branch2a I1106 16:38:05.323280 13539 net.cpp:527] res5a_branch2a/bn -> res5a_branch2a (in-place) I1106 16:38:05.323527 13539 net.cpp:260] Setting up res5a_branch2a/bn I1106 16:38:05.323534 13539 net.cpp:267] TRAIN Top shape for layer 33 'res5a_branch2a/bn' 4 512 10 24 (491520) I1106 16:38:05.323539 13539 layer_factory.hpp:172] Creating layer 'res5a_branch2a/relu' of type 'ReLU' I1106 16:38:05.323542 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.323549 13539 net.cpp:200] Created Layer res5a_branch2a/relu (34) I1106 16:38:05.323554 13539 net.cpp:572] res5a_branch2a/relu <- res5a_branch2a I1106 16:38:05.323555 13539 net.cpp:527] res5a_branch2a/relu -> res5a_branch2a (in-place) I1106 16:38:05.323561 13539 net.cpp:260] Setting up res5a_branch2a/relu I1106 16:38:05.323565 13539 net.cpp:267] TRAIN Top shape for layer 34 'res5a_branch2a/relu' 4 512 10 24 (491520) I1106 16:38:05.323566 13539 layer_factory.hpp:172] Creating layer 'res5a_branch2b' of type 'Convolution' I1106 16:38:05.323570 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.323590 13539 net.cpp:200] Created Layer res5a_branch2b (35) I1106 16:38:05.323595 13539 net.cpp:572] res5a_branch2b <- res5a_branch2a I1106 16:38:05.323596 13539 net.cpp:542] res5a_branch2b -> res5a_branch2b I1106 16:38:05.328366 13539 net.cpp:260] Setting up res5a_branch2b I1106 16:38:05.328392 13539 net.cpp:267] TRAIN Top shape for layer 35 
'res5a_branch2b' 4 512 10 24 (491520) I1106 16:38:05.328408 13539 layer_factory.hpp:172] Creating layer 'res5a_branch2b/bn' of type 'BatchNorm' I1106 16:38:05.328418 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.328428 13539 net.cpp:200] Created Layer res5a_branch2b/bn (36) I1106 16:38:05.328435 13539 net.cpp:572] res5a_branch2b/bn <- res5a_branch2b I1106 16:38:05.328442 13539 net.cpp:527] res5a_branch2b/bn -> res5a_branch2b (in-place) I1106 16:38:05.328652 13539 net.cpp:260] Setting up res5a_branch2b/bn I1106 16:38:05.328662 13539 net.cpp:267] TRAIN Top shape for layer 36 'res5a_branch2b/bn' 4 512 10 24 (491520) I1106 16:38:05.328672 13539 layer_factory.hpp:172] Creating layer 'res5a_branch2b/relu' of type 'ReLU' I1106 16:38:05.328680 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.328687 13539 net.cpp:200] Created Layer res5a_branch2b/relu (37) I1106 16:38:05.328694 13539 net.cpp:572] res5a_branch2b/relu <- res5a_branch2b I1106 16:38:05.328701 13539 net.cpp:527] res5a_branch2b/relu -> res5a_branch2b (in-place) I1106 16:38:05.328709 13539 net.cpp:260] Setting up res5a_branch2b/relu I1106 16:38:05.328717 13539 net.cpp:267] TRAIN Top shape for layer 37 'res5a_branch2b/relu' 4 512 10 24 (491520) I1106 16:38:05.328723 13539 layer_factory.hpp:172] Creating layer 'res5a_branch2b_res5a_branch2b/relu_0_split' of type 'Split' I1106 16:38:05.328729 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.328742 13539 net.cpp:200] Created Layer res5a_branch2b_res5a_branch2b/relu_0_split (38) I1106 16:38:05.328758 13539 net.cpp:572] res5a_branch2b_res5a_branch2b/relu_0_split <- res5a_branch2b I1106 16:38:05.328765 13539 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> res5a_branch2b_res5a_branch2b/relu_0_split_0 I1106 16:38:05.328774 13539 net.cpp:542] 
res5a_branch2b_res5a_branch2b/relu_0_split -> res5a_branch2b_res5a_branch2b/relu_0_split_1 I1106 16:38:05.328804 13539 net.cpp:260] Setting up res5a_branch2b_res5a_branch2b/relu_0_split I1106 16:38:05.328812 13539 net.cpp:267] TRAIN Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 4 512 10 24 (491520) I1106 16:38:05.328819 13539 net.cpp:267] TRAIN Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 4 512 10 24 (491520) I1106 16:38:05.328825 13539 layer_factory.hpp:172] Creating layer 'pool6' of type 'Pooling' I1106 16:38:05.328832 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.328842 13539 net.cpp:200] Created Layer pool6 (39) I1106 16:38:05.328848 13539 net.cpp:572] pool6 <- res5a_branch2b_res5a_branch2b/relu_0_split_0 I1106 16:38:05.328855 13539 net.cpp:542] pool6 -> pool6 I1106 16:38:05.328891 13539 net.cpp:260] Setting up pool6 I1106 16:38:05.328900 13539 net.cpp:267] TRAIN Top shape for layer 39 'pool6' 4 512 5 12 (122880) I1106 16:38:05.328907 13539 layer_factory.hpp:172] Creating layer 'pool6_pool6_0_split' of type 'Split' I1106 16:38:05.328912 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.328920 13539 net.cpp:200] Created Layer pool6_pool6_0_split (40) I1106 16:38:05.328925 13539 net.cpp:572] pool6_pool6_0_split <- pool6 I1106 16:38:05.328933 13539 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_0 I1106 16:38:05.328940 13539 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_1 I1106 16:38:05.328967 13539 net.cpp:260] Setting up pool6_pool6_0_split I1106 16:38:05.328975 13539 net.cpp:267] TRAIN Top shape for layer 40 'pool6_pool6_0_split' 4 512 5 12 (122880) I1106 16:38:05.328982 13539 net.cpp:267] TRAIN Top shape for layer 40 'pool6_pool6_0_split' 4 512 5 12 (122880) I1106 16:38:05.328989 13539 layer_factory.hpp:172] Creating layer 'pool7' of type 'Pooling' I1106 16:38:05.328995 13539 
layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.329003 13539 net.cpp:200] Created Layer pool7 (41) I1106 16:38:05.329010 13539 net.cpp:572] pool7 <- pool6_pool6_0_split_0 I1106 16:38:05.329016 13539 net.cpp:542] pool7 -> pool7 I1106 16:38:05.329048 13539 net.cpp:260] Setting up pool7 I1106 16:38:05.329057 13539 net.cpp:267] TRAIN Top shape for layer 41 'pool7' 4 512 3 6 (36864) I1106 16:38:05.329064 13539 layer_factory.hpp:172] Creating layer 'pool7_pool7_0_split' of type 'Split' I1106 16:38:05.329071 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.329077 13539 net.cpp:200] Created Layer pool7_pool7_0_split (42) I1106 16:38:05.329083 13539 net.cpp:572] pool7_pool7_0_split <- pool7 I1106 16:38:05.329089 13539 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_0 I1106 16:38:05.329097 13539 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_1 I1106 16:38:05.329121 13539 net.cpp:260] Setting up pool7_pool7_0_split I1106 16:38:05.329126 13539 net.cpp:267] TRAIN Top shape for layer 42 'pool7_pool7_0_split' 4 512 3 6 (36864) I1106 16:38:05.329128 13539 net.cpp:267] TRAIN Top shape for layer 42 'pool7_pool7_0_split' 4 512 3 6 (36864) I1106 16:38:05.329130 13539 layer_factory.hpp:172] Creating layer 'pool8' of type 'Pooling' I1106 16:38:05.329138 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.329147 13539 net.cpp:200] Created Layer pool8 (43) I1106 16:38:05.329151 13539 net.cpp:572] pool8 <- pool7_pool7_0_split_0 I1106 16:38:05.329154 13539 net.cpp:542] pool8 -> pool8 I1106 16:38:05.329190 13539 net.cpp:260] Setting up pool8 I1106 16:38:05.329195 13539 net.cpp:267] TRAIN Top shape for layer 43 'pool8' 4 512 2 3 (12288) I1106 16:38:05.329205 13539 layer_factory.hpp:172] Creating layer 'ctx_output1' of type 'Convolution' I1106 16:38:05.329212 13539 layer_factory.hpp:184] Layer's types are 
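The pool6/pool7/pool8 shapes above (5x12, 3x6, 2x3) follow from repeatedly halving the res5a feature map (10x24, i.e. the 320x768 input downsampled 32x) with ceil-mode output sizing. A minimal sketch of that arithmetic, assuming kernel 2 / stride 2 pooling (the exact kernel and stride come from the generated prototxt, not this log):

```python
import math

def pool_out(h, w):
    # Ceil-mode output size for stride-2 pooling (assumed kernel 2, stride 2)
    return math.ceil(h / 2), math.ceil(w / 2)

# 320x768 input downsampled 32x by the backbone -> res5a_branch2b at 10x24
sizes = [(320 // 32, 768 // 32)]
for _ in range(3):  # pool6, pool7, pool8
    sizes.append(pool_out(*sizes[-1]))
print(sizes)  # [(10, 24), (5, 12), (3, 6), (2, 3)]
```

These four resolutions are exactly the spatial dims reported for ctx_output2 through ctx_output5 in the log.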
Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.329224 13539 net.cpp:200] Created Layer ctx_output1 (44) I1106 16:38:05.329227 13539 net.cpp:572] ctx_output1 <- res4a_branch2b_res4a_branch2b/relu_0_split_1 I1106 16:38:05.329231 13539 net.cpp:542] ctx_output1 -> ctx_output1 I1106 16:38:05.329855 13539 net.cpp:260] Setting up ctx_output1 I1106 16:38:05.329864 13539 net.cpp:267] TRAIN Top shape for layer 44 'ctx_output1' 4 256 20 48 (983040) I1106 16:38:05.329869 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu' of type 'ReLU' I1106 16:38:05.329870 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.329874 13539 net.cpp:200] Created Layer ctx_output1/relu (45) I1106 16:38:05.329879 13539 net.cpp:572] ctx_output1/relu <- ctx_output1 I1106 16:38:05.329880 13539 net.cpp:527] ctx_output1/relu -> ctx_output1 (in-place) I1106 16:38:05.329885 13539 net.cpp:260] Setting up ctx_output1/relu I1106 16:38:05.329888 13539 net.cpp:267] TRAIN Top shape for layer 45 'ctx_output1/relu' 4 256 20 48 (983040) I1106 16:38:05.329891 13539 layer_factory.hpp:172] Creating layer 'ctx_output1_ctx_output1/relu_0_split' of type 'Split' I1106 16:38:05.329895 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.329897 13539 net.cpp:200] Created Layer ctx_output1_ctx_output1/relu_0_split (46) I1106 16:38:05.329900 13539 net.cpp:572] ctx_output1_ctx_output1/relu_0_split <- ctx_output1 I1106 16:38:05.329903 13539 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_0 I1106 16:38:05.329908 13539 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_1 I1106 16:38:05.329911 13539 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_2 I1106 16:38:05.329946 13539 net.cpp:260] Setting up ctx_output1_ctx_output1/relu_0_split I1106 16:38:05.329952 13539 
net.cpp:267] TRAIN Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 4 256 20 48 (983040) I1106 16:38:05.329953 13539 net.cpp:267] TRAIN Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 4 256 20 48 (983040) I1106 16:38:05.329957 13539 net.cpp:267] TRAIN Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 4 256 20 48 (983040) I1106 16:38:05.329960 13539 layer_factory.hpp:172] Creating layer 'ctx_output2' of type 'Convolution' I1106 16:38:05.329963 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.329972 13539 net.cpp:200] Created Layer ctx_output2 (47) I1106 16:38:05.329977 13539 net.cpp:572] ctx_output2 <- res5a_branch2b_res5a_branch2b/relu_0_split_1 I1106 16:38:05.329979 13539 net.cpp:542] ctx_output2 -> ctx_output2 I1106 16:38:05.331041 13539 net.cpp:260] Setting up ctx_output2 I1106 16:38:05.331048 13539 net.cpp:267] TRAIN Top shape for layer 47 'ctx_output2' 4 256 10 24 (245760) I1106 16:38:05.331053 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu' of type 'ReLU' I1106 16:38:05.331064 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.331069 13539 net.cpp:200] Created Layer ctx_output2/relu (48) I1106 16:38:05.331073 13539 net.cpp:572] ctx_output2/relu <- ctx_output2 I1106 16:38:05.331079 13539 net.cpp:527] ctx_output2/relu -> ctx_output2 (in-place) I1106 16:38:05.331085 13539 net.cpp:260] Setting up ctx_output2/relu I1106 16:38:05.331089 13539 net.cpp:267] TRAIN Top shape for layer 48 'ctx_output2/relu' 4 256 10 24 (245760) I1106 16:38:05.331091 13539 layer_factory.hpp:172] Creating layer 'ctx_output2_ctx_output2/relu_0_split' of type 'Split' I1106 16:38:05.331094 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.331097 13539 net.cpp:200] Created Layer ctx_output2_ctx_output2/relu_0_split (49) I1106 16:38:05.331106 13539 net.cpp:572] 
ctx_output2_ctx_output2/relu_0_split <- ctx_output2 I1106 16:38:05.331110 13539 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_0 I1106 16:38:05.331112 13539 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_1 I1106 16:38:05.331115 13539 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_2 I1106 16:38:05.331146 13539 net.cpp:260] Setting up ctx_output2_ctx_output2/relu_0_split I1106 16:38:05.331151 13539 net.cpp:267] TRAIN Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 4 256 10 24 (245760) I1106 16:38:05.331152 13539 net.cpp:267] TRAIN Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 4 256 10 24 (245760) I1106 16:38:05.331156 13539 net.cpp:267] TRAIN Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 4 256 10 24 (245760) I1106 16:38:05.331157 13539 layer_factory.hpp:172] Creating layer 'ctx_output3' of type 'Convolution' I1106 16:38:05.331161 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.331171 13539 net.cpp:200] Created Layer ctx_output3 (50) I1106 16:38:05.331173 13539 net.cpp:572] ctx_output3 <- pool6_pool6_0_split_1 I1106 16:38:05.331176 13539 net.cpp:542] ctx_output3 -> ctx_output3 I1106 16:38:05.332711 13539 net.cpp:260] Setting up ctx_output3 I1106 16:38:05.332722 13539 net.cpp:267] TRAIN Top shape for layer 50 'ctx_output3' 4 256 5 12 (61440) I1106 16:38:05.332727 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu' of type 'ReLU' I1106 16:38:05.332738 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.332744 13539 net.cpp:200] Created Layer ctx_output3/relu (51) I1106 16:38:05.332748 13539 net.cpp:572] ctx_output3/relu <- ctx_output3 I1106 16:38:05.332751 13539 net.cpp:527] ctx_output3/relu -> ctx_output3 (in-place) I1106 16:38:05.332762 13539 net.cpp:260] Setting up 
ctx_output3/relu I1106 16:38:05.332765 13539 net.cpp:267] TRAIN Top shape for layer 51 'ctx_output3/relu' 4 256 5 12 (61440) I1106 16:38:05.332768 13539 layer_factory.hpp:172] Creating layer 'ctx_output3_ctx_output3/relu_0_split' of type 'Split' I1106 16:38:05.332770 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.332780 13539 net.cpp:200] Created Layer ctx_output3_ctx_output3/relu_0_split (52) I1106 16:38:05.332783 13539 net.cpp:572] ctx_output3_ctx_output3/relu_0_split <- ctx_output3 I1106 16:38:05.332787 13539 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_0 I1106 16:38:05.332793 13539 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_1 I1106 16:38:05.332801 13539 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_2 I1106 16:38:05.332844 13539 net.cpp:260] Setting up ctx_output3_ctx_output3/relu_0_split I1106 16:38:05.332847 13539 net.cpp:267] TRAIN Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 4 256 5 12 (61440) I1106 16:38:05.332856 13539 net.cpp:267] TRAIN Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 4 256 5 12 (61440) I1106 16:38:05.332859 13539 net.cpp:267] TRAIN Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 4 256 5 12 (61440) I1106 16:38:05.332862 13539 layer_factory.hpp:172] Creating layer 'ctx_output4' of type 'Convolution' I1106 16:38:05.332868 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.332878 13539 net.cpp:200] Created Layer ctx_output4 (53) I1106 16:38:05.332881 13539 net.cpp:572] ctx_output4 <- pool7_pool7_0_split_1 I1106 16:38:05.332885 13539 net.cpp:542] ctx_output4 -> ctx_output4 I1106 16:38:05.333933 13539 net.cpp:260] Setting up ctx_output4 I1106 16:38:05.333940 13539 net.cpp:267] TRAIN Top shape for layer 53 'ctx_output4' 4 256 3 6 (18432) I1106 
16:38:05.333953 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu' of type 'ReLU' I1106 16:38:05.333956 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.333959 13539 net.cpp:200] Created Layer ctx_output4/relu (54) I1106 16:38:05.333961 13539 net.cpp:572] ctx_output4/relu <- ctx_output4 I1106 16:38:05.333964 13539 net.cpp:527] ctx_output4/relu -> ctx_output4 (in-place) I1106 16:38:05.333967 13539 net.cpp:260] Setting up ctx_output4/relu I1106 16:38:05.333971 13539 net.cpp:267] TRAIN Top shape for layer 54 'ctx_output4/relu' 4 256 3 6 (18432) I1106 16:38:05.333972 13539 layer_factory.hpp:172] Creating layer 'ctx_output4_ctx_output4/relu_0_split' of type 'Split' I1106 16:38:05.333974 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.333979 13539 net.cpp:200] Created Layer ctx_output4_ctx_output4/relu_0_split (55) I1106 16:38:05.333982 13539 net.cpp:572] ctx_output4_ctx_output4/relu_0_split <- ctx_output4 I1106 16:38:05.333984 13539 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_0 I1106 16:38:05.333988 13539 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_1 I1106 16:38:05.333992 13539 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_2 I1106 16:38:05.334024 13539 net.cpp:260] Setting up ctx_output4_ctx_output4/relu_0_split I1106 16:38:05.334028 13539 net.cpp:267] TRAIN Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 4 256 3 6 (18432) I1106 16:38:05.334031 13539 net.cpp:267] TRAIN Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 4 256 3 6 (18432) I1106 16:38:05.334033 13539 net.cpp:267] TRAIN Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 4 256 3 6 (18432) I1106 16:38:05.334036 13539 layer_factory.hpp:172] Creating layer 'ctx_output5' of type 'Convolution' I1106 
16:38:05.334038 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.334046 13539 net.cpp:200] Created Layer ctx_output5 (56) I1106 16:38:05.334049 13539 net.cpp:572] ctx_output5 <- pool8 I1106 16:38:05.334053 13539 net.cpp:542] ctx_output5 -> ctx_output5 I1106 16:38:05.335086 13539 net.cpp:260] Setting up ctx_output5 I1106 16:38:05.335093 13539 net.cpp:267] TRAIN Top shape for layer 56 'ctx_output5' 4 256 2 3 (6144) I1106 16:38:05.335098 13539 layer_factory.hpp:172] Creating layer 'ctx_output5/relu' of type 'ReLU' I1106 16:38:05.335100 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.335104 13539 net.cpp:200] Created Layer ctx_output5/relu (57) I1106 16:38:05.335106 13539 net.cpp:572] ctx_output5/relu <- ctx_output5 I1106 16:38:05.335109 13539 net.cpp:527] ctx_output5/relu -> ctx_output5 (in-place) I1106 16:38:05.335114 13539 net.cpp:260] Setting up ctx_output5/relu I1106 16:38:05.335117 13539 net.cpp:267] TRAIN Top shape for layer 57 'ctx_output5/relu' 4 256 2 3 (6144) I1106 16:38:05.335120 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc' of type 'Convolution' I1106 16:38:05.335122 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.335130 13539 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc (58) I1106 16:38:05.335134 13539 net.cpp:572] ctx_output1/relu_mbox_loc <- ctx_output1_ctx_output1/relu_0_split_0 I1106 16:38:05.335136 13539 net.cpp:542] ctx_output1/relu_mbox_loc -> ctx_output1/relu_mbox_loc I1106 16:38:05.335306 13539 net.cpp:260] Setting up ctx_output1/relu_mbox_loc I1106 16:38:05.335311 13539 net.cpp:267] TRAIN Top shape for layer 58 'ctx_output1/relu_mbox_loc' 4 16 20 48 (61440) I1106 16:38:05.335316 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:05.335320 13539 layer_factory.hpp:184] Layer's 
types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.335330 13539 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_perm (59) I1106 16:38:05.335338 13539 net.cpp:572] ctx_output1/relu_mbox_loc_perm <- ctx_output1/relu_mbox_loc I1106 16:38:05.335341 13539 net.cpp:542] ctx_output1/relu_mbox_loc_perm -> ctx_output1/relu_mbox_loc_perm I1106 16:38:05.335402 13539 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_perm I1106 16:38:05.335407 13539 net.cpp:267] TRAIN Top shape for layer 59 'ctx_output1/relu_mbox_loc_perm' 4 20 48 16 (61440) I1106 16:38:05.335410 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:05.335413 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.335418 13539 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_flat (60) I1106 16:38:05.335422 13539 net.cpp:572] ctx_output1/relu_mbox_loc_flat <- ctx_output1/relu_mbox_loc_perm I1106 16:38:05.335423 13539 net.cpp:542] ctx_output1/relu_mbox_loc_flat -> ctx_output1/relu_mbox_loc_flat I1106 16:38:05.335477 13539 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_flat I1106 16:38:05.335482 13539 net.cpp:267] TRAIN Top shape for layer 60 'ctx_output1/relu_mbox_loc_flat' 4 15360 (61440) I1106 16:38:05.335485 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf' of type 'Convolution' I1106 16:38:05.335487 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.335496 13539 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf (61) I1106 16:38:05.335500 13539 net.cpp:572] ctx_output1/relu_mbox_conf <- ctx_output1_ctx_output1/relu_0_split_1 I1106 16:38:05.335502 13539 net.cpp:542] ctx_output1/relu_mbox_conf -> ctx_output1/relu_mbox_conf I1106 16:38:05.335659 13539 net.cpp:260] Setting up ctx_output1/relu_mbox_conf I1106 16:38:05.335665 13539 net.cpp:267] TRAIN Top shape for layer 61 
'ctx_output1/relu_mbox_conf' 4 8 20 48 (30720) I1106 16:38:05.335670 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:05.335674 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.335682 13539 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_perm (62) I1106 16:38:05.335686 13539 net.cpp:572] ctx_output1/relu_mbox_conf_perm <- ctx_output1/relu_mbox_conf I1106 16:38:05.335690 13539 net.cpp:542] ctx_output1/relu_mbox_conf_perm -> ctx_output1/relu_mbox_conf_perm I1106 16:38:05.335743 13539 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_perm I1106 16:38:05.335748 13539 net.cpp:267] TRAIN Top shape for layer 62 'ctx_output1/relu_mbox_conf_perm' 4 20 48 8 (30720) I1106 16:38:05.335750 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:05.335753 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.335757 13539 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_flat (63) I1106 16:38:05.335759 13539 net.cpp:572] ctx_output1/relu_mbox_conf_flat <- ctx_output1/relu_mbox_conf_perm I1106 16:38:05.335762 13539 net.cpp:542] ctx_output1/relu_mbox_conf_flat -> ctx_output1/relu_mbox_conf_flat I1106 16:38:05.335799 13539 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_flat I1106 16:38:05.335803 13539 net.cpp:267] TRAIN Top shape for layer 63 'ctx_output1/relu_mbox_conf_flat' 4 7680 (30720) I1106 16:38:05.335806 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:05.335808 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.335819 13539 net.cpp:200] Created Layer ctx_output1/relu_mbox_priorbox (64) I1106 16:38:05.335820 13539 net.cpp:572] ctx_output1/relu_mbox_priorbox <- ctx_output1_ctx_output1/relu_0_split_2 I1106 16:38:05.335824 
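The Permute/Flatten pair after each loc and conf convolution just reorders NCHW to NHWC and then collapses everything after the batch axis, which is why the log shows 4 16 20 48 becoming 4 20 48 16 and then 4 15360. A sketch of that shape trace (numpy stands in for the Caffe layers):

```python
import numpy as np

x = np.zeros((4, 16, 20, 48))   # ctx_output1/relu_mbox_loc output (N, C, H, W)
x = x.transpose(0, 2, 3, 1)     # Permute with order 0,2,3,1 -> (4, 20, 48, 16)
x = x.reshape(x.shape[0], -1)   # Flatten from axis 1 -> (4, 15360)
print(x.shape)  # (4, 15360)
```

The NHWC ordering puts the per-prior box offsets contiguous per spatial location, which the downstream mbox concat and loss layers rely on.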
13539 net.cpp:572] ctx_output1/relu_mbox_priorbox <- data_data_0_split_1 I1106 16:38:05.335826 13539 net.cpp:542] ctx_output1/relu_mbox_priorbox -> ctx_output1/relu_mbox_priorbox I1106 16:38:05.335839 13539 net.cpp:260] Setting up ctx_output1/relu_mbox_priorbox I1106 16:38:05.335850 13539 net.cpp:267] TRAIN Top shape for layer 64 'ctx_output1/relu_mbox_priorbox' 1 2 15360 (30720) I1106 16:38:05.335853 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc' of type 'Convolution' I1106 16:38:05.335855 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.335865 13539 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc (65) I1106 16:38:05.335868 13539 net.cpp:572] ctx_output2/relu_mbox_loc <- ctx_output2_ctx_output2/relu_0_split_0 I1106 16:38:05.335870 13539 net.cpp:542] ctx_output2/relu_mbox_loc -> ctx_output2/relu_mbox_loc I1106 16:38:05.336072 13539 net.cpp:260] Setting up ctx_output2/relu_mbox_loc I1106 16:38:05.336078 13539 net.cpp:267] TRAIN Top shape for layer 65 'ctx_output2/relu_mbox_loc' 4 24 10 24 (23040) I1106 16:38:05.336082 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:05.336086 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.336091 13539 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_perm (66) I1106 16:38:05.336093 13539 net.cpp:572] ctx_output2/relu_mbox_loc_perm <- ctx_output2/relu_mbox_loc I1106 16:38:05.336095 13539 net.cpp:542] ctx_output2/relu_mbox_loc_perm -> ctx_output2/relu_mbox_loc_perm I1106 16:38:05.336156 13539 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_perm I1106 16:38:05.336160 13539 net.cpp:267] TRAIN Top shape for layer 66 'ctx_output2/relu_mbox_loc_perm' 4 10 24 24 (23040) I1106 16:38:05.336163 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:05.336165 13539 
layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.336170 13539 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_flat (67) I1106 16:38:05.336172 13539 net.cpp:572] ctx_output2/relu_mbox_loc_flat <- ctx_output2/relu_mbox_loc_perm I1106 16:38:05.336175 13539 net.cpp:542] ctx_output2/relu_mbox_loc_flat -> ctx_output2/relu_mbox_loc_flat I1106 16:38:05.336208 13539 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_flat I1106 16:38:05.336212 13539 net.cpp:267] TRAIN Top shape for layer 67 'ctx_output2/relu_mbox_loc_flat' 4 5760 (23040) I1106 16:38:05.336215 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf' of type 'Convolution' I1106 16:38:05.336217 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.336226 13539 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf (68) I1106 16:38:05.336228 13539 net.cpp:572] ctx_output2/relu_mbox_conf <- ctx_output2_ctx_output2/relu_0_split_1 I1106 16:38:05.336231 13539 net.cpp:542] ctx_output2/relu_mbox_conf -> ctx_output2/relu_mbox_conf I1106 16:38:05.336397 13539 net.cpp:260] Setting up ctx_output2/relu_mbox_conf I1106 16:38:05.336403 13539 net.cpp:267] TRAIN Top shape for layer 68 'ctx_output2/relu_mbox_conf' 4 12 10 24 (11520) I1106 16:38:05.336407 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:05.336410 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.336417 13539 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_perm (69) I1106 16:38:05.336421 13539 net.cpp:572] ctx_output2/relu_mbox_conf_perm <- ctx_output2/relu_mbox_conf I1106 16:38:05.336424 13539 net.cpp:542] ctx_output2/relu_mbox_conf_perm -> ctx_output2/relu_mbox_conf_perm I1106 16:38:05.336484 13539 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_perm I1106 16:38:05.336489 13539 net.cpp:267] TRAIN Top shape 
for layer 69 'ctx_output2/relu_mbox_conf_perm' 4 10 24 12 (11520) I1106 16:38:05.336493 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:05.336495 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.336499 13539 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_flat (70) I1106 16:38:05.336508 13539 net.cpp:572] ctx_output2/relu_mbox_conf_flat <- ctx_output2/relu_mbox_conf_perm I1106 16:38:05.336513 13539 net.cpp:542] ctx_output2/relu_mbox_conf_flat -> ctx_output2/relu_mbox_conf_flat I1106 16:38:05.336549 13539 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_flat I1106 16:38:05.336553 13539 net.cpp:267] TRAIN Top shape for layer 70 'ctx_output2/relu_mbox_conf_flat' 4 2880 (11520) I1106 16:38:05.336556 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:05.336560 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.336565 13539 net.cpp:200] Created Layer ctx_output2/relu_mbox_priorbox (71) I1106 16:38:05.336567 13539 net.cpp:572] ctx_output2/relu_mbox_priorbox <- ctx_output2_ctx_output2/relu_0_split_2 I1106 16:38:05.336571 13539 net.cpp:572] ctx_output2/relu_mbox_priorbox <- data_data_0_split_2 I1106 16:38:05.336575 13539 net.cpp:542] ctx_output2/relu_mbox_priorbox -> ctx_output2/relu_mbox_priorbox I1106 16:38:05.336587 13539 net.cpp:260] Setting up ctx_output2/relu_mbox_priorbox I1106 16:38:05.336591 13539 net.cpp:267] TRAIN Top shape for layer 71 'ctx_output2/relu_mbox_priorbox' 1 2 5760 (11520) I1106 16:38:05.336594 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc' of type 'Convolution' I1106 16:38:05.336597 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.336604 13539 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc (72) I1106 16:38:05.336607 
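The loc/conf channel counts and priorbox lengths for each head can be reproduced from the header's head configuration (minsizes, maxsizes, ARs = [[2], [2, 3], [2, 3], [2]]). A sketch, assuming the standard SSD PriorBox setup (flip=True plus one extra prior at sqrt(min*max)); `ssd_head_shapes` is an illustrative name, not a function from the codebase:

```python
def ssd_head_shapes(layer_h, layer_w, aspect_ratios, num_classes=2):
    # Priors per location: min_size, sqrt(min*max), and each AR with its flip
    num_priors = 2 + 2 * len(aspect_ratios)
    loc_ch = num_priors * 4                       # 4 box offsets per prior
    conf_ch = num_priors * num_classes
    prior_len = layer_h * layer_w * num_priors * 4  # PriorBox top: 1 x 2 x prior_len
    return loc_ch, conf_ch, prior_len

print(ssd_head_shapes(20, 48, [2]))     # (16, 8, 15360) -> ctx_output1
print(ssd_head_shapes(10, 24, [2, 3]))  # (24, 12, 5760) -> ctx_output2
print(ssd_head_shapes(3, 6, [2]))       # (16, 8, 288)   -> ctx_output4
```

All three tuples match the conv channel counts and 1 2 N priorbox shapes logged for the corresponding heads.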
13539 net.cpp:572] ctx_output3/relu_mbox_loc <- ctx_output3_ctx_output3/relu_0_split_0 I1106 16:38:05.336611 13539 net.cpp:542] ctx_output3/relu_mbox_loc -> ctx_output3/relu_mbox_loc I1106 16:38:05.336802 13539 net.cpp:260] Setting up ctx_output3/relu_mbox_loc I1106 16:38:05.336808 13539 net.cpp:267] TRAIN Top shape for layer 72 'ctx_output3/relu_mbox_loc' 4 24 5 12 (5760) I1106 16:38:05.336813 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:05.336817 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.336822 13539 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_perm (73) I1106 16:38:05.336824 13539 net.cpp:572] ctx_output3/relu_mbox_loc_perm <- ctx_output3/relu_mbox_loc I1106 16:38:05.336827 13539 net.cpp:542] ctx_output3/relu_mbox_loc_perm -> ctx_output3/relu_mbox_loc_perm I1106 16:38:05.336882 13539 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_perm I1106 16:38:05.336887 13539 net.cpp:267] TRAIN Top shape for layer 73 'ctx_output3/relu_mbox_loc_perm' 4 5 12 24 (5760) I1106 16:38:05.336890 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:05.336894 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.336897 13539 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_flat (74) I1106 16:38:05.336899 13539 net.cpp:572] ctx_output3/relu_mbox_loc_flat <- ctx_output3/relu_mbox_loc_perm I1106 16:38:05.336902 13539 net.cpp:542] ctx_output3/relu_mbox_loc_flat -> ctx_output3/relu_mbox_loc_flat I1106 16:38:05.336935 13539 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_flat I1106 16:38:05.336941 13539 net.cpp:267] TRAIN Top shape for layer 74 'ctx_output3/relu_mbox_loc_flat' 4 1440 (5760) I1106 16:38:05.336943 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf' of type 'Convolution' I1106 16:38:05.336946 13539 
layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.336954 13539 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf (75) I1106 16:38:05.336958 13539 net.cpp:572] ctx_output3/relu_mbox_conf <- ctx_output3_ctx_output3/relu_0_split_1 I1106 16:38:05.336961 13539 net.cpp:542] ctx_output3/relu_mbox_conf -> ctx_output3/relu_mbox_conf I1106 16:38:05.337118 13539 net.cpp:260] Setting up ctx_output3/relu_mbox_conf I1106 16:38:05.337131 13539 net.cpp:267] TRAIN Top shape for layer 75 'ctx_output3/relu_mbox_conf' 4 12 5 12 (2880) I1106 16:38:05.337136 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:05.337138 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.337143 13539 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_perm (76) I1106 16:38:05.337146 13539 net.cpp:572] ctx_output3/relu_mbox_conf_perm <- ctx_output3/relu_mbox_conf I1106 16:38:05.337150 13539 net.cpp:542] ctx_output3/relu_mbox_conf_perm -> ctx_output3/relu_mbox_conf_perm I1106 16:38:05.337208 13539 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_perm I1106 16:38:05.337211 13539 net.cpp:267] TRAIN Top shape for layer 76 'ctx_output3/relu_mbox_conf_perm' 4 5 12 12 (2880) I1106 16:38:05.337214 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:05.337218 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.337221 13539 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_flat (77) I1106 16:38:05.337224 13539 net.cpp:572] ctx_output3/relu_mbox_conf_flat <- ctx_output3/relu_mbox_conf_perm I1106 16:38:05.337227 13539 net.cpp:542] ctx_output3/relu_mbox_conf_flat -> ctx_output3/relu_mbox_conf_flat I1106 16:38:05.337265 13539 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_flat I1106 16:38:05.337270 13539 net.cpp:267] TRAIN Top 
shape for layer 77 'ctx_output3/relu_mbox_conf_flat' 4 720 (2880) I1106 16:38:05.337273 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:05.337275 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.337280 13539 net.cpp:200] Created Layer ctx_output3/relu_mbox_priorbox (78) I1106 16:38:05.337282 13539 net.cpp:572] ctx_output3/relu_mbox_priorbox <- ctx_output3_ctx_output3/relu_0_split_2 I1106 16:38:05.337285 13539 net.cpp:572] ctx_output3/relu_mbox_priorbox <- data_data_0_split_3 I1106 16:38:05.337289 13539 net.cpp:542] ctx_output3/relu_mbox_priorbox -> ctx_output3/relu_mbox_priorbox I1106 16:38:05.337304 13539 net.cpp:260] Setting up ctx_output3/relu_mbox_priorbox I1106 16:38:05.337308 13539 net.cpp:267] TRAIN Top shape for layer 78 'ctx_output3/relu_mbox_priorbox' 1 2 1440 (2880) I1106 16:38:05.337311 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc' of type 'Convolution' I1106 16:38:05.337313 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.337321 13539 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc (79) I1106 16:38:05.337324 13539 net.cpp:572] ctx_output4/relu_mbox_loc <- ctx_output4_ctx_output4/relu_0_split_0 I1106 16:38:05.337327 13539 net.cpp:542] ctx_output4/relu_mbox_loc -> ctx_output4/relu_mbox_loc I1106 16:38:05.337497 13539 net.cpp:260] Setting up ctx_output4/relu_mbox_loc I1106 16:38:05.337503 13539 net.cpp:267] TRAIN Top shape for layer 79 'ctx_output4/relu_mbox_loc' 4 16 3 6 (1152) I1106 16:38:05.337507 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:05.337510 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.337517 13539 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_perm (80) I1106 16:38:05.337520 13539 net.cpp:572] 
ctx_output4/relu_mbox_loc_perm <- ctx_output4/relu_mbox_loc I1106 16:38:05.337523 13539 net.cpp:542] ctx_output4/relu_mbox_loc_perm -> ctx_output4/relu_mbox_loc_perm I1106 16:38:05.337579 13539 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_perm I1106 16:38:05.337584 13539 net.cpp:267] TRAIN Top shape for layer 80 'ctx_output4/relu_mbox_loc_perm' 4 3 6 16 (1152) I1106 16:38:05.337586 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:05.337589 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.337599 13539 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_flat (81) I1106 16:38:05.337602 13539 net.cpp:572] ctx_output4/relu_mbox_loc_flat <- ctx_output4/relu_mbox_loc_perm I1106 16:38:05.337606 13539 net.cpp:542] ctx_output4/relu_mbox_loc_flat -> ctx_output4/relu_mbox_loc_flat I1106 16:38:05.337640 13539 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_flat I1106 16:38:05.337646 13539 net.cpp:267] TRAIN Top shape for layer 81 'ctx_output4/relu_mbox_loc_flat' 4 288 (1152) I1106 16:38:05.337648 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf' of type 'Convolution' I1106 16:38:05.337651 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.337659 13539 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf (82) I1106 16:38:05.337662 13539 net.cpp:572] ctx_output4/relu_mbox_conf <- ctx_output4_ctx_output4/relu_0_split_1 I1106 16:38:05.337666 13539 net.cpp:542] ctx_output4/relu_mbox_conf -> ctx_output4/relu_mbox_conf I1106 16:38:05.337816 13539 net.cpp:260] Setting up ctx_output4/relu_mbox_conf I1106 16:38:05.337822 13539 net.cpp:267] TRAIN Top shape for layer 82 'ctx_output4/relu_mbox_conf' 4 8 3 6 (576) I1106 16:38:05.337827 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:05.337831 13539 layer_factory.hpp:184] 
Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.337836 13539 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf_perm (83) I1106 16:38:05.337839 13539 net.cpp:572] ctx_output4/relu_mbox_conf_perm <- ctx_output4/relu_mbox_conf I1106 16:38:05.337841 13539 net.cpp:542] ctx_output4/relu_mbox_conf_perm -> ctx_output4/relu_mbox_conf_perm I1106 16:38:05.337899 13539 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_perm I1106 16:38:05.337904 13539 net.cpp:267] TRAIN Top shape for layer 83 'ctx_output4/relu_mbox_conf_perm' 4 3 6 8 (576) I1106 16:38:05.337908 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:05.337909 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.337913 13539 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf_flat (84) I1106 16:38:05.337916 13539 net.cpp:572] ctx_output4/relu_mbox_conf_flat <- ctx_output4/relu_mbox_conf_perm I1106 16:38:05.337918 13539 net.cpp:542] ctx_output4/relu_mbox_conf_flat -> ctx_output4/relu_mbox_conf_flat I1106 16:38:05.337957 13539 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_flat I1106 16:38:05.337962 13539 net.cpp:267] TRAIN Top shape for layer 84 'ctx_output4/relu_mbox_conf_flat' 4 144 (576) I1106 16:38:05.337965 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:05.337968 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.337975 13539 net.cpp:200] Created Layer ctx_output4/relu_mbox_priorbox (85) I1106 16:38:05.337977 13539 net.cpp:572] ctx_output4/relu_mbox_priorbox <- ctx_output4_ctx_output4/relu_0_split_2 I1106 16:38:05.337981 13539 net.cpp:572] ctx_output4/relu_mbox_priorbox <- data_data_0_split_4 I1106 16:38:05.337985 13539 net.cpp:542] ctx_output4/relu_mbox_priorbox -> ctx_output4/relu_mbox_priorbox I1106 16:38:05.337999 13539 net.cpp:260] 
Setting up ctx_output4/relu_mbox_priorbox I1106 16:38:05.338003 13539 net.cpp:267] TRAIN Top shape for layer 85 'ctx_output4/relu_mbox_priorbox' 1 2 288 (576) I1106 16:38:05.338006 13539 layer_factory.hpp:172] Creating layer 'mbox_loc' of type 'Concat' I1106 16:38:05.338008 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.338014 13539 net.cpp:200] Created Layer mbox_loc (86) I1106 16:38:05.338017 13539 net.cpp:572] mbox_loc <- ctx_output1/relu_mbox_loc_flat I1106 16:38:05.338021 13539 net.cpp:572] mbox_loc <- ctx_output2/relu_mbox_loc_flat I1106 16:38:05.338024 13539 net.cpp:572] mbox_loc <- ctx_output3/relu_mbox_loc_flat I1106 16:38:05.338032 13539 net.cpp:572] mbox_loc <- ctx_output4/relu_mbox_loc_flat I1106 16:38:05.338037 13539 net.cpp:542] mbox_loc -> mbox_loc I1106 16:38:05.338052 13539 net.cpp:260] Setting up mbox_loc I1106 16:38:05.338057 13539 net.cpp:267] TRAIN Top shape for layer 86 'mbox_loc' 4 22848 (91392) I1106 16:38:05.338059 13539 layer_factory.hpp:172] Creating layer 'mbox_conf' of type 'Concat' I1106 16:38:05.338062 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.338065 13539 net.cpp:200] Created Layer mbox_conf (87) I1106 16:38:05.338068 13539 net.cpp:572] mbox_conf <- ctx_output1/relu_mbox_conf_flat I1106 16:38:05.338071 13539 net.cpp:572] mbox_conf <- ctx_output2/relu_mbox_conf_flat I1106 16:38:05.338074 13539 net.cpp:572] mbox_conf <- ctx_output3/relu_mbox_conf_flat I1106 16:38:05.338078 13539 net.cpp:572] mbox_conf <- ctx_output4/relu_mbox_conf_flat I1106 16:38:05.338080 13539 net.cpp:542] mbox_conf -> mbox_conf I1106 16:38:05.338094 13539 net.cpp:260] Setting up mbox_conf I1106 16:38:05.338099 13539 net.cpp:267] TRAIN Top shape for layer 87 'mbox_conf' 4 11424 (45696) I1106 16:38:05.338101 13539 layer_factory.hpp:172] Creating layer 'mbox_priorbox' of type 'Concat' I1106 16:38:05.338104 13539 layer_factory.hpp:184] 
Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.338107 13539 net.cpp:200] Created Layer mbox_priorbox (88) I1106 16:38:05.338109 13539 net.cpp:572] mbox_priorbox <- ctx_output1/relu_mbox_priorbox I1106 16:38:05.338114 13539 net.cpp:572] mbox_priorbox <- ctx_output2/relu_mbox_priorbox I1106 16:38:05.338115 13539 net.cpp:572] mbox_priorbox <- ctx_output3/relu_mbox_priorbox I1106 16:38:05.338119 13539 net.cpp:572] mbox_priorbox <- ctx_output4/relu_mbox_priorbox I1106 16:38:05.338121 13539 net.cpp:542] mbox_priorbox -> mbox_priorbox I1106 16:38:05.338135 13539 net.cpp:260] Setting up mbox_priorbox I1106 16:38:05.338140 13539 net.cpp:267] TRAIN Top shape for layer 88 'mbox_priorbox' 1 2 22848 (45696) I1106 16:38:05.338143 13539 layer_factory.hpp:172] Creating layer 'mbox_loss' of type 'MultiBoxLoss' I1106 16:38:05.338146 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.338153 13539 net.cpp:200] Created Layer mbox_loss (89) I1106 16:38:05.338156 13539 net.cpp:572] mbox_loss <- mbox_loc I1106 16:38:05.338158 13539 net.cpp:572] mbox_loss <- mbox_conf I1106 16:38:05.338161 13539 net.cpp:572] mbox_loss <- mbox_priorbox I1106 16:38:05.338165 13539 net.cpp:572] mbox_loss <- label I1106 16:38:05.338167 13539 net.cpp:542] mbox_loss -> mbox_loss I1106 16:38:05.338210 13539 layer_factory.hpp:172] Creating layer 'mbox_loss_smooth_L1_loc' of type 'SmoothL1Loss' I1106 16:38:05.338213 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.338270 13539 layer_factory.hpp:172] Creating layer 'mbox_loss_softmax_conf' of type 'SoftmaxWithLoss' I1106 16:38:05.338274 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.338338 13539 net.cpp:260] Setting up mbox_loss I1106 16:38:05.338342 13539 net.cpp:267] TRAIN Top shape for layer 89 'mbox_loss' (1) I1106 16:38:05.338346 13539 net.cpp:271] 
with loss weight 1 I1106 16:38:05.338356 13539 net.cpp:336] mbox_loss needs backward computation. I1106 16:38:05.338361 13539 net.cpp:338] mbox_priorbox does not need backward computation. I1106 16:38:05.338364 13539 net.cpp:336] mbox_conf needs backward computation. I1106 16:38:05.338367 13539 net.cpp:336] mbox_loc needs backward computation. I1106 16:38:05.338371 13539 net.cpp:338] ctx_output4/relu_mbox_priorbox does not need backward computation. I1106 16:38:05.338374 13539 net.cpp:336] ctx_output4/relu_mbox_conf_flat needs backward computation. I1106 16:38:05.338377 13539 net.cpp:336] ctx_output4/relu_mbox_conf_perm needs backward computation. I1106 16:38:05.338380 13539 net.cpp:336] ctx_output4/relu_mbox_conf needs backward computation. I1106 16:38:05.338388 13539 net.cpp:336] ctx_output4/relu_mbox_loc_flat needs backward computation. I1106 16:38:05.338392 13539 net.cpp:336] ctx_output4/relu_mbox_loc_perm needs backward computation. I1106 16:38:05.338393 13539 net.cpp:336] ctx_output4/relu_mbox_loc needs backward computation. I1106 16:38:05.338397 13539 net.cpp:338] ctx_output3/relu_mbox_priorbox does not need backward computation. I1106 16:38:05.338398 13539 net.cpp:336] ctx_output3/relu_mbox_conf_flat needs backward computation. I1106 16:38:05.338402 13539 net.cpp:336] ctx_output3/relu_mbox_conf_perm needs backward computation. I1106 16:38:05.338403 13539 net.cpp:336] ctx_output3/relu_mbox_conf needs backward computation. I1106 16:38:05.338405 13539 net.cpp:336] ctx_output3/relu_mbox_loc_flat needs backward computation. I1106 16:38:05.338408 13539 net.cpp:336] ctx_output3/relu_mbox_loc_perm needs backward computation. I1106 16:38:05.338412 13539 net.cpp:336] ctx_output3/relu_mbox_loc needs backward computation. I1106 16:38:05.338413 13539 net.cpp:338] ctx_output2/relu_mbox_priorbox does not need backward computation. I1106 16:38:05.338416 13539 net.cpp:336] ctx_output2/relu_mbox_conf_flat needs backward computation. 
I1106 16:38:05.338419 13539 net.cpp:336] ctx_output2/relu_mbox_conf_perm needs backward computation. I1106 16:38:05.338423 13539 net.cpp:336] ctx_output2/relu_mbox_conf needs backward computation. I1106 16:38:05.338424 13539 net.cpp:336] ctx_output2/relu_mbox_loc_flat needs backward computation. I1106 16:38:05.338428 13539 net.cpp:336] ctx_output2/relu_mbox_loc_perm needs backward computation. I1106 16:38:05.338429 13539 net.cpp:336] ctx_output2/relu_mbox_loc needs backward computation. I1106 16:38:05.338431 13539 net.cpp:338] ctx_output1/relu_mbox_priorbox does not need backward computation. I1106 16:38:05.338434 13539 net.cpp:336] ctx_output1/relu_mbox_conf_flat needs backward computation. I1106 16:38:05.338438 13539 net.cpp:336] ctx_output1/relu_mbox_conf_perm needs backward computation. I1106 16:38:05.338439 13539 net.cpp:336] ctx_output1/relu_mbox_conf needs backward computation. I1106 16:38:05.338443 13539 net.cpp:336] ctx_output1/relu_mbox_loc_flat needs backward computation. I1106 16:38:05.338443 13539 net.cpp:336] ctx_output1/relu_mbox_loc_perm needs backward computation. I1106 16:38:05.338446 13539 net.cpp:336] ctx_output1/relu_mbox_loc needs backward computation. I1106 16:38:05.338449 13539 net.cpp:338] ctx_output5/relu does not need backward computation. I1106 16:38:05.338451 13539 net.cpp:338] ctx_output5 does not need backward computation. I1106 16:38:05.338454 13539 net.cpp:336] ctx_output4_ctx_output4/relu_0_split needs backward computation. I1106 16:38:05.338456 13539 net.cpp:336] ctx_output4/relu needs backward computation. I1106 16:38:05.338459 13539 net.cpp:336] ctx_output4 needs backward computation. I1106 16:38:05.338461 13539 net.cpp:336] ctx_output3_ctx_output3/relu_0_split needs backward computation. I1106 16:38:05.338464 13539 net.cpp:336] ctx_output3/relu needs backward computation. I1106 16:38:05.338466 13539 net.cpp:336] ctx_output3 needs backward computation. 
I1106 16:38:05.338469 13539 net.cpp:336] ctx_output2_ctx_output2/relu_0_split needs backward computation. I1106 16:38:05.338470 13539 net.cpp:336] ctx_output2/relu needs backward computation. I1106 16:38:05.338472 13539 net.cpp:336] ctx_output2 needs backward computation. I1106 16:38:05.338475 13539 net.cpp:336] ctx_output1_ctx_output1/relu_0_split needs backward computation. I1106 16:38:05.338477 13539 net.cpp:336] ctx_output1/relu needs backward computation. I1106 16:38:05.338479 13539 net.cpp:336] ctx_output1 needs backward computation. I1106 16:38:05.338482 13539 net.cpp:338] pool8 does not need backward computation. I1106 16:38:05.338485 13539 net.cpp:336] pool7_pool7_0_split needs backward computation. I1106 16:38:05.338488 13539 net.cpp:336] pool7 needs backward computation. I1106 16:38:05.338491 13539 net.cpp:336] pool6_pool6_0_split needs backward computation. I1106 16:38:05.338495 13539 net.cpp:336] pool6 needs backward computation. I1106 16:38:05.338501 13539 net.cpp:336] res5a_branch2b_res5a_branch2b/relu_0_split needs backward computation. I1106 16:38:05.338505 13539 net.cpp:336] res5a_branch2b/relu needs backward computation. I1106 16:38:05.338507 13539 net.cpp:336] res5a_branch2b/bn needs backward computation. I1106 16:38:05.338508 13539 net.cpp:336] res5a_branch2b needs backward computation. I1106 16:38:05.338511 13539 net.cpp:336] res5a_branch2a/relu needs backward computation. I1106 16:38:05.338513 13539 net.cpp:336] res5a_branch2a/bn needs backward computation. I1106 16:38:05.338515 13539 net.cpp:336] res5a_branch2a needs backward computation. I1106 16:38:05.338517 13539 net.cpp:336] pool4 needs backward computation. I1106 16:38:05.338521 13539 net.cpp:336] res4a_branch2b_res4a_branch2b/relu_0_split needs backward computation. I1106 16:38:05.338522 13539 net.cpp:336] res4a_branch2b/relu needs backward computation. I1106 16:38:05.338526 13539 net.cpp:336] res4a_branch2b/bn needs backward computation. 
I1106 16:38:05.338526 13539 net.cpp:336] res4a_branch2b needs backward computation. I1106 16:38:05.338529 13539 net.cpp:336] res4a_branch2a/relu needs backward computation. I1106 16:38:05.338531 13539 net.cpp:336] res4a_branch2a/bn needs backward computation. I1106 16:38:05.338534 13539 net.cpp:336] res4a_branch2a needs backward computation. I1106 16:38:05.338536 13539 net.cpp:336] pool3 needs backward computation. I1106 16:38:05.338538 13539 net.cpp:336] res3a_branch2b/relu needs backward computation. I1106 16:38:05.338541 13539 net.cpp:336] res3a_branch2b/bn needs backward computation. I1106 16:38:05.338542 13539 net.cpp:336] res3a_branch2b needs backward computation. I1106 16:38:05.338544 13539 net.cpp:336] res3a_branch2a/relu needs backward computation. I1106 16:38:05.338547 13539 net.cpp:336] res3a_branch2a/bn needs backward computation. I1106 16:38:05.338549 13539 net.cpp:336] res3a_branch2a needs backward computation. I1106 16:38:05.338552 13539 net.cpp:336] pool2 needs backward computation. I1106 16:38:05.338554 13539 net.cpp:336] res2a_branch2b/relu needs backward computation. I1106 16:38:05.338557 13539 net.cpp:336] res2a_branch2b/bn needs backward computation. I1106 16:38:05.338559 13539 net.cpp:336] res2a_branch2b needs backward computation. I1106 16:38:05.338562 13539 net.cpp:336] res2a_branch2a/relu needs backward computation. I1106 16:38:05.338564 13539 net.cpp:336] res2a_branch2a/bn needs backward computation. I1106 16:38:05.338567 13539 net.cpp:336] res2a_branch2a needs backward computation. I1106 16:38:05.338568 13539 net.cpp:336] pool1 needs backward computation. I1106 16:38:05.338572 13539 net.cpp:336] conv1b/relu needs backward computation. I1106 16:38:05.338572 13539 net.cpp:336] conv1b/bn needs backward computation. I1106 16:38:05.338575 13539 net.cpp:336] conv1b needs backward computation. I1106 16:38:05.338577 13539 net.cpp:336] conv1a/relu needs backward computation. 
I1106 16:38:05.338579 13539 net.cpp:336] conv1a/bn needs backward computation.
I1106 16:38:05.338582 13539 net.cpp:336] conv1a needs backward computation.
I1106 16:38:05.338584 13539 net.cpp:338] data/bias does not need backward computation.
I1106 16:38:05.338588 13539 net.cpp:338] data_data_0_split does not need backward computation.
I1106 16:38:05.338591 13539 net.cpp:338] data does not need backward computation.
I1106 16:38:05.338593 13539 net.cpp:380] This network produces output ctx_output5
I1106 16:38:05.338596 13539 net.cpp:380] This network produces output mbox_loss
I1106 16:38:05.338652 13539 net.cpp:403] Top memory (TRAIN) required for data: 505556136 diff: 505556136
I1106 16:38:05.338655 13539 net.cpp:406] Bottom memory (TRAIN) required for data: 505531552 diff: 505531552
I1106 16:38:05.338657 13539 net.cpp:409] Shared (in-place) memory (TRAIN) by data: 249053184 diff: 249053184
I1106 16:38:05.338660 13539 net.cpp:412] Parameters memory (TRAIN) required for data: 11946688 diff: 11946688
I1106 16:38:05.338661 13539 net.cpp:415] Parameters shared memory (TRAIN) by data: 0 diff: 0
I1106 16:38:05.338663 13539 net.cpp:421] Network initialization done.
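The concatenated head shapes in the TRAIN log above (mbox_loc `4 22848`, mbox_conf `4 11424`, mbox_priorbox `1 2 22848`) can be re-derived from the per-head grid sizes and priors-per-cell. A minimal sketch, not part of the log: the ctx_output3 (5x12) and ctx_output4 (3x6) grids are read directly off the "Top shape" lines, while the ctx_output1 (20x48) and ctx_output2 (10x24) grids are inferred from the 320x768 input and the stride-2 downsampling chain; priors per cell follow the usual SSD count of `2 + 2*len(aspect_ratios)` with `flip: true`.

```python
# Re-derive the concatenated SSD head sizes reported in this log.
# (height, width, priors_per_cell) for ctx_output1..ctx_output4;
# ctx_output5 is excluded because chop_num_heads=1 removed its mbox head.
heads = [
    (20, 48, 4),  # ctx_output1: aspect_ratio [2]     -> 2 + 2*1 = 4 priors
    (10, 24, 6),  # ctx_output2: aspect_ratios [2, 3] -> 2 + 2*2 = 6 priors
    (5, 12, 6),   # ctx_output3: aspect_ratios [2, 3] -> 6 priors
    (3, 6, 4),    # ctx_output4: aspect_ratio [2]     -> 4 priors
]
num_classes = 2  # from the config in this log

priors = sum(h * w * p for h, w, p in heads)            # total boxes per image
loc = 4 * priors                                        # 4 coords per box
conf = num_classes * priors                             # class scores per box
print(priors, loc, conf)  # 5712 22848 11424
```

The mbox_priorbox shape `1 2 22848` is the same count seen along axis 2: 5712 priors times 4 coordinates, stored once per batch with a second channel for the variances.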
I1106 16:38:05.339193 13539 solver.cpp:175] Creating test net (#0) specified by test_net file: training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/l1reg/test.prototxt I1106 16:38:05.339604 13539 net.cpp:80] Initializing net from parameters: name: "ssdJacintoNetV2_test" state { phase: TEST } layer { name: "data" type: "AnnotatedData" top: "data" top: "label" include { phase: TEST } transform_param { mean_value: 0 mean_value: 0 mean_value: 0 force_color: false resize_param { prob: 1 resize_mode: WARP height: 320 width: 768 interp_mode: LINEAR } crop_h: 320 crop_w: 768 } data_param { source: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb" batch_size: 8 backend: LMDB threads: 4 parser_threads: 4 } annotated_data_param { batch_sampler { } label_map_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt" } } layer { name: "data/bias" type: "Bias" bottom: "data" top: "data/bias" param { lr_mult: 0 decay_mult: 0 } bias_param { filler { type: "constant" value: -128 } } } layer { name: "conv1a" type: "Convolution" bottom: "data/bias" top: "conv1a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 2 kernel_size: 5 group: 1 stride: 2 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "conv1a/bn" type: "BatchNorm" bottom: "conv1a" top: "conv1a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "conv1a/relu" type: "ReLU" bottom: "conv1a" top: "conv1a" } layer { name: "conv1b" type: "Convolution" bottom: "conv1a" top: "conv1b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "conv1b/bn" type: "BatchNorm" bottom: "conv1b" 
top: "conv1b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "conv1b/relu" type: "ReLU" bottom: "conv1b" top: "conv1b" } layer { name: "pool1" type: "Pooling" bottom: "conv1b" top: "pool1" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res2a_branch2a" type: "Convolution" bottom: "pool1" top: "res2a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res2a_branch2a/bn" type: "BatchNorm" bottom: "res2a_branch2a" top: "res2a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res2a_branch2a/relu" type: "ReLU" bottom: "res2a_branch2a" top: "res2a_branch2a" } layer { name: "res2a_branch2b" type: "Convolution" bottom: "res2a_branch2a" top: "res2a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res2a_branch2b/bn" type: "BatchNorm" bottom: "res2a_branch2b" top: "res2a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res2a_branch2b/relu" type: "ReLU" bottom: "res2a_branch2b" top: "res2a_branch2b" } layer { name: "pool2" type: "Pooling" bottom: "res2a_branch2b" top: "pool2" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res3a_branch2a" type: "Convolution" bottom: "pool2" top: "res3a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } 
dilation: 1 } } layer { name: "res3a_branch2a/bn" type: "BatchNorm" bottom: "res3a_branch2a" top: "res3a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res3a_branch2a/relu" type: "ReLU" bottom: "res3a_branch2a" top: "res3a_branch2a" } layer { name: "res3a_branch2b" type: "Convolution" bottom: "res3a_branch2a" top: "res3a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res3a_branch2b/bn" type: "BatchNorm" bottom: "res3a_branch2b" top: "res3a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res3a_branch2b/relu" type: "ReLU" bottom: "res3a_branch2b" top: "res3a_branch2b" } layer { name: "pool3" type: "Pooling" bottom: "res3a_branch2b" top: "pool3" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res4a_branch2a" type: "Convolution" bottom: "pool3" top: "res4a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res4a_branch2a/bn" type: "BatchNorm" bottom: "res4a_branch2a" top: "res4a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res4a_branch2a/relu" type: "ReLU" bottom: "res4a_branch2a" top: "res4a_branch2a" } layer { name: "res4a_branch2b" type: "Convolution" bottom: "res4a_branch2a" top: "res4a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" 
value: 0 } dilation: 1 } } layer { name: "res4a_branch2b/bn" type: "BatchNorm" bottom: "res4a_branch2b" top: "res4a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res4a_branch2b/relu" type: "ReLU" bottom: "res4a_branch2b" top: "res4a_branch2b" } layer { name: "pool4" type: "Pooling" bottom: "res4a_branch2b" top: "pool4" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res5a_branch2a" type: "Convolution" bottom: "pool4" top: "res5a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res5a_branch2a/bn" type: "BatchNorm" bottom: "res5a_branch2a" top: "res5a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res5a_branch2a/relu" type: "ReLU" bottom: "res5a_branch2a" top: "res5a_branch2a" } layer { name: "res5a_branch2b" type: "Convolution" bottom: "res5a_branch2a" top: "res5a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res5a_branch2b/bn" type: "BatchNorm" bottom: "res5a_branch2b" top: "res5a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res5a_branch2b/relu" type: "ReLU" bottom: "res5a_branch2b" top: "res5a_branch2b" } layer { name: "pool6" type: "Pooling" bottom: "res5a_branch2b" top: "pool6" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "pool7" type: "Pooling" bottom: "pool6" top: "pool7" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "pool8" type: "Pooling" bottom: 
"pool7" top: "pool8" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "ctx_output1" type: "Convolution" bottom: "res4a_branch2b" top: "ctx_output1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu" type: "ReLU" bottom: "ctx_output1" top: "ctx_output1" } layer { name: "ctx_output2" type: "Convolution" bottom: "res5a_branch2b" top: "ctx_output2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu" type: "ReLU" bottom: "ctx_output2" top: "ctx_output2" } layer { name: "ctx_output3" type: "Convolution" bottom: "pool6" top: "ctx_output3" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu" type: "ReLU" bottom: "ctx_output3" top: "ctx_output3" } layer { name: "ctx_output4" type: "Convolution" bottom: "pool7" top: "ctx_output4" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu" type: "ReLU" bottom: "ctx_output4" top: "ctx_output4" } layer { name: "ctx_output5" type: "Convolution" bottom: "pool8" top: "ctx_output5" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: 
true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output5/relu" type: "ReLU" bottom: "ctx_output5" top: "ctx_output5" } layer { name: "ctx_output1/relu_mbox_loc" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_loc" top: "ctx_output1/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output1/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_loc_perm" top: "ctx_output1/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output1/relu_mbox_conf" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_conf" top: "ctx_output1/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output1/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_conf_perm" top: "ctx_output1/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output1/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output1" bottom: "data" top: "ctx_output1/relu_mbox_priorbox" prior_box_param { min_size: 14.72 max_size: 36.8 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { 
name: "ctx_output2/relu_mbox_loc" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_loc" top: "ctx_output2/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output2/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_loc_perm" top: "ctx_output2/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output2/relu_mbox_conf" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_conf" top: "ctx_output2/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output2/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_conf_perm" top: "ctx_output2/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output2/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output2" bottom: "data" top: "ctx_output2/relu_mbox_priorbox" prior_box_param { min_size: 36.8 max_size: 132.48 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output3/relu_mbox_loc" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 
bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output3/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_loc" top: "ctx_output3/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output3/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_loc_perm" top: "ctx_output3/relu_mbox_loc_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output3/relu_mbox_conf" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output3/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_conf" top: "ctx_output3/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output3/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_conf_perm" top: "ctx_output3/relu_mbox_conf_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output3/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output3" bottom: "data" top: "ctx_output3/relu_mbox_priorbox" prior_box_param { min_size: 132.48 max_size: 228.16 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } }
layer { name: "ctx_output4/relu_mbox_loc" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output4/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output4/relu_mbox_loc" top: "ctx_output4/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output4/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_loc_perm" top: "ctx_output4/relu_mbox_loc_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output4/relu_mbox_conf" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output4/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output4/relu_mbox_conf" top: "ctx_output4/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output4/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_conf_perm" top: "ctx_output4/relu_mbox_conf_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output4/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output4" bottom: "data" top: "ctx_output4/relu_mbox_priorbox" prior_box_param { min_size: 228.16 max_size: 323.84 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } }
layer { name: "mbox_loc" type: "Concat" bottom: "ctx_output1/relu_mbox_loc_flat" bottom: "ctx_output2/relu_mbox_loc_flat" bottom: "ctx_output3/relu_mbox_loc_flat" bottom: "ctx_output4/relu_mbox_loc_flat" top: "mbox_loc" concat_param { axis: 1 } }
layer { name: "mbox_conf" type: "Concat" bottom: "ctx_output1/relu_mbox_conf_flat" bottom: "ctx_output2/relu_mbox_conf_flat" bottom: "ctx_output3/relu_mbox_conf_flat" bottom: "ctx_output4/relu_mbox_conf_flat" top: "mbox_conf" concat_param { axis: 1 } }
layer { name: "mbox_priorbox" type: "Concat" bottom: "ctx_output1/relu_mbox_priorbox" bottom: "ctx_output2/relu_mbox_priorbox" bottom: "ctx_output3/relu_mbox_priorbox" bottom: "ctx_output4/relu_mbox_priorbox" top: "mbox_priorbox" concat_param { axis: 2 } }
layer { name: "mbox_conf_reshape" type: "Reshape" bottom: "mbox_conf" top: "mbox_conf_reshape" reshape_param { shape { dim: 0 dim: -1 dim: 2 } } }
layer { name: "mbox_conf_softmax" type: "Softmax" bottom: "mbox_conf_reshape" top: "mbox_conf_softmax" softmax_param { axis: 2 } }
layer { name: "mbox_conf_flatten" type: "Flatten" bottom: "mbox_conf_softmax" top: "mbox_conf_flatten" flatten_param { axis: 1 } }
layer { name: "detection_out" type: "DetectionOutput" bottom: "mbox_loc" bottom: "mbox_conf_flatten" bottom: "mbox_priorbox" top: "detection_out" include { phase: TEST } detection_output_param { num_classes: 2 share_location: true background_label_id: 0 nms_param { nms_threshold: 0.45 top_k: 400 } save_output_param { output_directory: "" output_name_prefix: "comp4_det_test_" output_format: "VOC" label_map_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt" name_size_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt" num_test_image: 24 } code_type: CENTER_SIZE keep_top_k: 200 confidence_threshold: 0.01 } }
layer { name: "detection_eval" type: "DetectionEvaluate" bottom: "detection_out" bottom: "label" top: "detection_eval" include { phase: TEST } detection_evaluate_param { num_classes: 2 background_label_id: 0 overlap_threshold: 0.5 evaluate_difficult_gt: false name_size_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt" } }
I1106 16:38:05.339851 13539 net.cpp:110] Using FLOAT as default forward math type
I1106 16:38:05.339857 13539 net.cpp:116] Using FLOAT as default backward math type
I1106 16:38:05.339859 13539 layer_factory.hpp:172] Creating layer 'data' of type 'AnnotatedData'
I1106 16:38:05.339862 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:05.339874 13539 internal_thread.cpp:19] Starting 1
internal thread(s) on device 0 I1106 16:38:05.340042 13539 net.cpp:200] Created Layer data (0) I1106 16:38:05.340046 13539 net.cpp:542] data -> data I1106 16:38:05.340050 13539 net.cpp:542] data -> label I1106 16:38:05.340054 13539 data_reader.cpp:58] Data Reader threads: 1, out queues: 1, depth: 8 I1106 16:38:05.340386 13539 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0 I1106 16:38:05.340939 13569 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb I1106 16:38:05.341349 13539 annotated_data_layer.cpp:105] output data size: 8,3,320,768 I1106 16:38:05.341392 13539 annotated_data_layer.cpp:150] (0) Output data size: 8, 3, 320, 768 I1106 16:38:05.341413 13539 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0 I1106 16:38:05.341439 13539 net.cpp:260] Setting up data I1106 16:38:05.341459 13539 net.cpp:267] TEST Top shape for layer 0 'data' 8 3 320 768 (5898240) I1106 16:38:05.341461 13539 net.cpp:267] TEST Top shape for layer 0 'data' 1 1 2 8 (16) I1106 16:38:05.341477 13539 layer_factory.hpp:172] Creating layer 'data_data_0_split' of type 'Split' I1106 16:38:05.341480 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.341483 13539 net.cpp:200] Created Layer data_data_0_split (1) I1106 16:38:05.341778 13570 data_layer.cpp:105] (0) Parser threads: 1 I1106 16:38:05.341784 13570 data_layer.cpp:107] (0) Transformer threads: 1 I1106 16:38:05.341801 13539 net.cpp:572] data_data_0_split <- data I1106 16:38:05.341825 13539 net.cpp:542] data_data_0_split -> data_data_0_split_0 I1106 16:38:05.341830 13539 net.cpp:542] data_data_0_split -> data_data_0_split_1 I1106 16:38:05.341833 13539 net.cpp:542] data_data_0_split -> data_data_0_split_2 I1106 16:38:05.341836 13539 net.cpp:542] data_data_0_split -> data_data_0_split_3 I1106 16:38:05.341840 13539 net.cpp:542] data_data_0_split -> data_data_0_split_4 I1106 16:38:05.341935 13539 
net.cpp:260] Setting up data_data_0_split I1106 16:38:05.341940 13539 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240) I1106 16:38:05.341956 13539 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240) I1106 16:38:05.341959 13539 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240) I1106 16:38:05.341962 13539 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240) I1106 16:38:05.341965 13539 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240) I1106 16:38:05.341969 13539 layer_factory.hpp:172] Creating layer 'data/bias' of type 'Bias' I1106 16:38:05.341971 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.341977 13539 net.cpp:200] Created Layer data/bias (2) I1106 16:38:05.341980 13539 net.cpp:572] data/bias <- data_data_0_split_0 I1106 16:38:05.341982 13539 net.cpp:542] data/bias -> data/bias I1106 16:38:05.342543 13539 net.cpp:260] Setting up data/bias I1106 16:38:05.342550 13539 net.cpp:267] TEST Top shape for layer 2 'data/bias' 8 3 320 768 (5898240) I1106 16:38:05.342556 13539 layer_factory.hpp:172] Creating layer 'conv1a' of type 'Convolution' I1106 16:38:05.342559 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.342566 13539 net.cpp:200] Created Layer conv1a (3) I1106 16:38:05.342569 13539 net.cpp:572] conv1a <- data/bias I1106 16:38:05.342573 13539 net.cpp:542] conv1a -> conv1a I1106 16:38:05.347399 13539 net.cpp:260] Setting up conv1a I1106 16:38:05.347477 13539 net.cpp:267] TEST Top shape for layer 3 'conv1a' 8 32 160 384 (15728640) I1106 16:38:05.347501 13539 layer_factory.hpp:172] Creating layer 'conv1a/bn' of type 'BatchNorm' I1106 16:38:05.347513 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.347540 13539 net.cpp:200] Created 
Layer conv1a/bn (4) I1106 16:38:05.347546 13539 net.cpp:572] conv1a/bn <- conv1a I1106 16:38:05.347553 13539 net.cpp:527] conv1a/bn -> conv1a (in-place) I1106 16:38:05.348034 13539 net.cpp:260] Setting up conv1a/bn I1106 16:38:05.348044 13539 net.cpp:267] TEST Top shape for layer 4 'conv1a/bn' 8 32 160 384 (15728640) I1106 16:38:05.348054 13539 layer_factory.hpp:172] Creating layer 'conv1a/relu' of type 'ReLU' I1106 16:38:05.348065 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.348075 13539 net.cpp:200] Created Layer conv1a/relu (5) I1106 16:38:05.348081 13539 net.cpp:572] conv1a/relu <- conv1a I1106 16:38:05.348086 13539 net.cpp:527] conv1a/relu -> conv1a (in-place) I1106 16:38:05.348093 13539 net.cpp:260] Setting up conv1a/relu I1106 16:38:05.348096 13539 net.cpp:267] TEST Top shape for layer 5 'conv1a/relu' 8 32 160 384 (15728640) I1106 16:38:05.348126 13539 layer_factory.hpp:172] Creating layer 'conv1b' of type 'Convolution' I1106 16:38:05.348132 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.348163 13539 net.cpp:200] Created Layer conv1b (6) I1106 16:38:05.348170 13539 net.cpp:572] conv1b <- conv1a I1106 16:38:05.348176 13539 net.cpp:542] conv1b -> conv1b I1106 16:38:05.348471 13539 net.cpp:260] Setting up conv1b I1106 16:38:05.348485 13539 net.cpp:267] TEST Top shape for layer 6 'conv1b' 8 32 160 384 (15728640) I1106 16:38:05.348493 13539 layer_factory.hpp:172] Creating layer 'conv1b/bn' of type 'BatchNorm' I1106 16:38:05.348497 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.348508 13539 net.cpp:200] Created Layer conv1b/bn (7) I1106 16:38:05.348515 13539 net.cpp:572] conv1b/bn <- conv1b I1106 16:38:05.348520 13539 net.cpp:527] conv1b/bn -> conv1b (in-place) I1106 16:38:05.348884 13539 net.cpp:260] Setting up conv1b/bn I1106 16:38:05.348892 13539 net.cpp:267] TEST Top shape 
for layer 7 'conv1b/bn' 8 32 160 384 (15728640) I1106 16:38:05.348898 13539 layer_factory.hpp:172] Creating layer 'conv1b/relu' of type 'ReLU' I1106 16:38:05.348903 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.348923 13539 net.cpp:200] Created Layer conv1b/relu (8) I1106 16:38:05.348932 13539 net.cpp:572] conv1b/relu <- conv1b I1106 16:38:05.348938 13539 net.cpp:527] conv1b/relu -> conv1b (in-place) I1106 16:38:05.348950 13539 net.cpp:260] Setting up conv1b/relu I1106 16:38:05.348956 13539 net.cpp:267] TEST Top shape for layer 8 'conv1b/relu' 8 32 160 384 (15728640) I1106 16:38:05.348963 13539 layer_factory.hpp:172] Creating layer 'pool1' of type 'Pooling' I1106 16:38:05.348969 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.348987 13539 net.cpp:200] Created Layer pool1 (9) I1106 16:38:05.348994 13539 net.cpp:572] pool1 <- conv1b I1106 16:38:05.349002 13539 net.cpp:542] pool1 -> pool1 I1106 16:38:05.349063 13539 net.cpp:260] Setting up pool1 I1106 16:38:05.349069 13539 net.cpp:267] TEST Top shape for layer 9 'pool1' 8 32 80 192 (3932160) I1106 16:38:05.349071 13539 layer_factory.hpp:172] Creating layer 'res2a_branch2a' of type 'Convolution' I1106 16:38:05.349074 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.349092 13539 net.cpp:200] Created Layer res2a_branch2a (10) I1106 16:38:05.349094 13539 net.cpp:572] res2a_branch2a <- pool1 I1106 16:38:05.349097 13539 net.cpp:542] res2a_branch2a -> res2a_branch2a I1106 16:38:05.349443 13539 net.cpp:260] Setting up res2a_branch2a I1106 16:38:05.349452 13539 net.cpp:267] TEST Top shape for layer 10 'res2a_branch2a' 8 64 80 192 (7864320) I1106 16:38:05.349458 13539 layer_factory.hpp:172] Creating layer 'res2a_branch2a/bn' of type 'BatchNorm' I1106 16:38:05.349462 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT 
Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.349467 13539 net.cpp:200] Created Layer res2a_branch2a/bn (11) I1106 16:38:05.349468 13539 net.cpp:572] res2a_branch2a/bn <- res2a_branch2a I1106 16:38:05.349472 13539 net.cpp:527] res2a_branch2a/bn -> res2a_branch2a (in-place) I1106 16:38:05.349715 13539 net.cpp:260] Setting up res2a_branch2a/bn I1106 16:38:05.349720 13539 net.cpp:267] TEST Top shape for layer 11 'res2a_branch2a/bn' 8 64 80 192 (7864320) I1106 16:38:05.349725 13539 layer_factory.hpp:172] Creating layer 'res2a_branch2a/relu' of type 'ReLU' I1106 16:38:05.349730 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.349733 13539 net.cpp:200] Created Layer res2a_branch2a/relu (12) I1106 16:38:05.349736 13539 net.cpp:572] res2a_branch2a/relu <- res2a_branch2a I1106 16:38:05.349738 13539 net.cpp:527] res2a_branch2a/relu -> res2a_branch2a (in-place) I1106 16:38:05.349742 13539 net.cpp:260] Setting up res2a_branch2a/relu I1106 16:38:05.349747 13539 net.cpp:267] TEST Top shape for layer 12 'res2a_branch2a/relu' 8 64 80 192 (7864320) I1106 16:38:05.349763 13539 layer_factory.hpp:172] Creating layer 'res2a_branch2b' of type 'Convolution' I1106 16:38:05.349766 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.349774 13539 net.cpp:200] Created Layer res2a_branch2b (13) I1106 16:38:05.349776 13539 net.cpp:572] res2a_branch2b <- res2a_branch2a I1106 16:38:05.349778 13539 net.cpp:542] res2a_branch2b -> res2a_branch2b I1106 16:38:05.350630 13539 net.cpp:260] Setting up res2a_branch2b I1106 16:38:05.350638 13539 net.cpp:267] TEST Top shape for layer 13 'res2a_branch2b' 8 64 80 192 (7864320) I1106 16:38:05.350643 13539 layer_factory.hpp:172] Creating layer 'res2a_branch2b/bn' of type 'BatchNorm' I1106 16:38:05.350648 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.350653 13539 net.cpp:200] Created Layer 
res2a_branch2b/bn (14) I1106 16:38:05.350656 13539 net.cpp:572] res2a_branch2b/bn <- res2a_branch2b I1106 16:38:05.350661 13539 net.cpp:527] res2a_branch2b/bn -> res2a_branch2b (in-place) I1106 16:38:05.352646 13539 net.cpp:260] Setting up res2a_branch2b/bn I1106 16:38:05.352669 13539 net.cpp:267] TEST Top shape for layer 14 'res2a_branch2b/bn' 8 64 80 192 (7864320) I1106 16:38:05.352679 13539 layer_factory.hpp:172] Creating layer 'res2a_branch2b/relu' of type 'ReLU' I1106 16:38:05.352694 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.352707 13539 net.cpp:200] Created Layer res2a_branch2b/relu (15) I1106 16:38:05.352713 13539 net.cpp:572] res2a_branch2b/relu <- res2a_branch2b I1106 16:38:05.352721 13539 net.cpp:527] res2a_branch2b/relu -> res2a_branch2b (in-place) I1106 16:38:05.352732 13539 net.cpp:260] Setting up res2a_branch2b/relu I1106 16:38:05.352741 13539 net.cpp:267] TEST Top shape for layer 15 'res2a_branch2b/relu' 8 64 80 192 (7864320) I1106 16:38:05.352746 13539 layer_factory.hpp:172] Creating layer 'pool2' of type 'Pooling' I1106 16:38:05.352751 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.352763 13539 net.cpp:200] Created Layer pool2 (16) I1106 16:38:05.352771 13539 net.cpp:572] pool2 <- res2a_branch2b I1106 16:38:05.352775 13539 net.cpp:542] pool2 -> pool2 I1106 16:38:05.352821 13539 net.cpp:260] Setting up pool2 I1106 16:38:05.352830 13539 net.cpp:267] TEST Top shape for layer 16 'pool2' 8 64 40 96 (1966080) I1106 16:38:05.352838 13539 layer_factory.hpp:172] Creating layer 'res3a_branch2a' of type 'Convolution' I1106 16:38:05.352843 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.352856 13539 net.cpp:200] Created Layer res3a_branch2a (17) I1106 16:38:05.352864 13539 net.cpp:572] res3a_branch2a <- pool2 I1106 16:38:05.352869 13539 net.cpp:542] res3a_branch2a -> 
res3a_branch2a I1106 16:38:05.353541 13539 net.cpp:260] Setting up res3a_branch2a I1106 16:38:05.353554 13539 net.cpp:267] TEST Top shape for layer 17 'res3a_branch2a' 8 128 40 96 (3932160) I1106 16:38:05.353561 13539 layer_factory.hpp:172] Creating layer 'res3a_branch2a/bn' of type 'BatchNorm' I1106 16:38:05.353567 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.353576 13539 net.cpp:200] Created Layer res3a_branch2a/bn (18) I1106 16:38:05.353581 13539 net.cpp:572] res3a_branch2a/bn <- res3a_branch2a I1106 16:38:05.353587 13539 net.cpp:527] res3a_branch2a/bn -> res3a_branch2a (in-place) I1106 16:38:05.353794 13539 net.cpp:260] Setting up res3a_branch2a/bn I1106 16:38:05.353890 13539 net.cpp:267] TEST Top shape for layer 18 'res3a_branch2a/bn' 8 128 40 96 (3932160) I1106 16:38:05.353910 13539 layer_factory.hpp:172] Creating layer 'res3a_branch2a/relu' of type 'ReLU' I1106 16:38:05.353914 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.353917 13539 net.cpp:200] Created Layer res3a_branch2a/relu (19) I1106 16:38:05.353929 13539 net.cpp:572] res3a_branch2a/relu <- res3a_branch2a I1106 16:38:05.353933 13539 net.cpp:527] res3a_branch2a/relu -> res3a_branch2a (in-place) I1106 16:38:05.353937 13539 net.cpp:260] Setting up res3a_branch2a/relu I1106 16:38:05.353940 13539 net.cpp:267] TEST Top shape for layer 19 'res3a_branch2a/relu' 8 128 40 96 (3932160) I1106 16:38:05.353943 13539 layer_factory.hpp:172] Creating layer 'res3a_branch2b' of type 'Convolution' I1106 16:38:05.353946 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.353956 13539 net.cpp:200] Created Layer res3a_branch2b (20) I1106 16:38:05.353960 13539 net.cpp:572] res3a_branch2b <- res3a_branch2a I1106 16:38:05.353961 13539 net.cpp:542] res3a_branch2b -> res3a_branch2b I1106 16:38:05.356884 13539 net.cpp:260] Setting up 
res3a_branch2b I1106 16:38:05.356894 13539 net.cpp:267] TEST Top shape for layer 20 'res3a_branch2b' 8 128 40 96 (3932160) I1106 16:38:05.356900 13539 layer_factory.hpp:172] Creating layer 'res3a_branch2b/bn' of type 'BatchNorm' I1106 16:38:05.356911 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.356923 13539 net.cpp:200] Created Layer res3a_branch2b/bn (21) I1106 16:38:05.356930 13539 net.cpp:572] res3a_branch2b/bn <- res3a_branch2b I1106 16:38:05.356936 13539 net.cpp:527] res3a_branch2b/bn -> res3a_branch2b (in-place) I1106 16:38:05.357144 13539 net.cpp:260] Setting up res3a_branch2b/bn I1106 16:38:05.357151 13539 net.cpp:267] TEST Top shape for layer 21 'res3a_branch2b/bn' 8 128 40 96 (3932160) I1106 16:38:05.357157 13539 layer_factory.hpp:172] Creating layer 'res3a_branch2b/relu' of type 'ReLU' I1106 16:38:05.357165 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.357172 13539 net.cpp:200] Created Layer res3a_branch2b/relu (22) I1106 16:38:05.357178 13539 net.cpp:572] res3a_branch2b/relu <- res3a_branch2b I1106 16:38:05.357182 13539 net.cpp:527] res3a_branch2b/relu -> res3a_branch2b (in-place) I1106 16:38:05.357187 13539 net.cpp:260] Setting up res3a_branch2b/relu I1106 16:38:05.357190 13539 net.cpp:267] TEST Top shape for layer 22 'res3a_branch2b/relu' 8 128 40 96 (3932160) I1106 16:38:05.357193 13539 layer_factory.hpp:172] Creating layer 'pool3' of type 'Pooling' I1106 16:38:05.357195 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.357199 13539 net.cpp:200] Created Layer pool3 (23) I1106 16:38:05.357201 13539 net.cpp:572] pool3 <- res3a_branch2b I1106 16:38:05.357203 13539 net.cpp:542] pool3 -> pool3 I1106 16:38:05.357236 13539 net.cpp:260] Setting up pool3 I1106 16:38:05.357241 13539 net.cpp:267] TEST Top shape for layer 23 'pool3' 8 128 20 48 (983040) I1106 16:38:05.357244 
13539 layer_factory.hpp:172] Creating layer 'res4a_branch2a' of type 'Convolution' I1106 16:38:05.357251 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.357261 13539 net.cpp:200] Created Layer res4a_branch2a (24) I1106 16:38:05.357264 13539 net.cpp:572] res4a_branch2a <- pool3 I1106 16:38:05.357265 13539 net.cpp:542] res4a_branch2a -> res4a_branch2a I1106 16:38:05.359443 13539 net.cpp:260] Setting up res4a_branch2a I1106 16:38:05.359452 13539 net.cpp:267] TEST Top shape for layer 24 'res4a_branch2a' 8 256 20 48 (1966080) I1106 16:38:05.359457 13539 layer_factory.hpp:172] Creating layer 'res4a_branch2a/bn' of type 'BatchNorm' I1106 16:38:05.359465 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.359477 13539 net.cpp:200] Created Layer res4a_branch2a/bn (25) I1106 16:38:05.359483 13539 net.cpp:572] res4a_branch2a/bn <- res4a_branch2a I1106 16:38:05.359489 13539 net.cpp:527] res4a_branch2a/bn -> res4a_branch2a (in-place) I1106 16:38:05.359711 13539 net.cpp:260] Setting up res4a_branch2a/bn I1106 16:38:05.359717 13539 net.cpp:267] TEST Top shape for layer 25 'res4a_branch2a/bn' 8 256 20 48 (1966080) I1106 16:38:05.359737 13539 layer_factory.hpp:172] Creating layer 'res4a_branch2a/relu' of type 'ReLU' I1106 16:38:05.359740 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.359745 13539 net.cpp:200] Created Layer res4a_branch2a/relu (26) I1106 16:38:05.359752 13539 net.cpp:572] res4a_branch2a/relu <- res4a_branch2a I1106 16:38:05.359756 13539 net.cpp:527] res4a_branch2a/relu -> res4a_branch2a (in-place) I1106 16:38:05.359760 13539 net.cpp:260] Setting up res4a_branch2a/relu I1106 16:38:05.359763 13539 net.cpp:267] TEST Top shape for layer 26 'res4a_branch2a/relu' 8 256 20 48 (1966080) I1106 16:38:05.359766 13539 layer_factory.hpp:172] Creating layer 'res4a_branch2b' of type 'Convolution' 
I1106 16:38:05.359768 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.359776 13539 net.cpp:200] Created Layer res4a_branch2b (27) I1106 16:38:05.359783 13539 net.cpp:572] res4a_branch2b <- res4a_branch2a I1106 16:38:05.359788 13539 net.cpp:542] res4a_branch2b -> res4a_branch2b I1106 16:38:05.361479 13539 net.cpp:260] Setting up res4a_branch2b I1106 16:38:05.361495 13539 net.cpp:267] TEST Top shape for layer 27 'res4a_branch2b' 8 256 20 48 (1966080) I1106 16:38:05.361503 13539 layer_factory.hpp:172] Creating layer 'res4a_branch2b/bn' of type 'BatchNorm' I1106 16:38:05.361511 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.361521 13539 net.cpp:200] Created Layer res4a_branch2b/bn (28) I1106 16:38:05.361527 13539 net.cpp:572] res4a_branch2b/bn <- res4a_branch2b I1106 16:38:05.361533 13539 net.cpp:527] res4a_branch2b/bn -> res4a_branch2b (in-place) I1106 16:38:05.361743 13539 net.cpp:260] Setting up res4a_branch2b/bn I1106 16:38:05.361754 13539 net.cpp:267] TEST Top shape for layer 28 'res4a_branch2b/bn' 8 256 20 48 (1966080) I1106 16:38:05.361765 13539 layer_factory.hpp:172] Creating layer 'res4a_branch2b/relu' of type 'ReLU' I1106 16:38:05.361773 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.361779 13539 net.cpp:200] Created Layer res4a_branch2b/relu (29) I1106 16:38:05.361786 13539 net.cpp:572] res4a_branch2b/relu <- res4a_branch2b I1106 16:38:05.361793 13539 net.cpp:527] res4a_branch2b/relu -> res4a_branch2b (in-place) I1106 16:38:05.361801 13539 net.cpp:260] Setting up res4a_branch2b/relu I1106 16:38:05.361809 13539 net.cpp:267] TEST Top shape for layer 29 'res4a_branch2b/relu' 8 256 20 48 (1966080) I1106 16:38:05.361814 13539 layer_factory.hpp:172] Creating layer 'res4a_branch2b_res4a_branch2b/relu_0_split' of type 'Split' I1106 16:38:05.361821 13539 layer_factory.hpp:184] 
Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.361829 13539 net.cpp:200] Created Layer res4a_branch2b_res4a_branch2b/relu_0_split (30) I1106 16:38:05.361836 13539 net.cpp:572] res4a_branch2b_res4a_branch2b/relu_0_split <- res4a_branch2b I1106 16:38:05.361842 13539 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_0 I1106 16:38:05.361850 13539 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_1 I1106 16:38:05.361878 13539 net.cpp:260] Setting up res4a_branch2b_res4a_branch2b/relu_0_split I1106 16:38:05.361887 13539 net.cpp:267] TEST Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:05.361894 13539 net.cpp:267] TEST Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:05.361901 13539 layer_factory.hpp:172] Creating layer 'pool4' of type 'Pooling' I1106 16:38:05.361908 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.361917 13539 net.cpp:200] Created Layer pool4 (31) I1106 16:38:05.361924 13539 net.cpp:572] pool4 <- res4a_branch2b_res4a_branch2b/relu_0_split_0 I1106 16:38:05.361934 13539 net.cpp:542] pool4 -> pool4 I1106 16:38:05.361976 13539 net.cpp:260] Setting up pool4 I1106 16:38:05.361987 13539 net.cpp:267] TEST Top shape for layer 31 'pool4' 8 256 10 24 (491520) I1106 16:38:05.361994 13539 layer_factory.hpp:172] Creating layer 'res5a_branch2a' of type 'Convolution' I1106 16:38:05.362000 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.362015 13539 net.cpp:200] Created Layer res5a_branch2a (32) I1106 16:38:05.362021 13539 net.cpp:572] res5a_branch2a <- pool4 I1106 16:38:05.362027 13539 net.cpp:542] res5a_branch2a -> res5a_branch2a I1106 16:38:05.371847 13539 net.cpp:260] Setting up res5a_branch2a I1106 
16:38:05.371888 13539 net.cpp:267] TEST Top shape for layer 32 'res5a_branch2a' 8 512 10 24 (983040) I1106 16:38:05.371901 13539 layer_factory.hpp:172] Creating layer 'res5a_branch2a/bn' of type 'BatchNorm' I1106 16:38:05.371909 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.371922 13539 net.cpp:200] Created Layer res5a_branch2a/bn (33) I1106 16:38:05.371928 13539 net.cpp:572] res5a_branch2a/bn <- res5a_branch2a I1106 16:38:05.371937 13539 net.cpp:527] res5a_branch2a/bn -> res5a_branch2a (in-place) I1106 16:38:05.372159 13539 net.cpp:260] Setting up res5a_branch2a/bn I1106 16:38:05.372165 13539 net.cpp:267] TEST Top shape for layer 33 'res5a_branch2a/bn' 8 512 10 24 (983040) I1106 16:38:05.372171 13539 layer_factory.hpp:172] Creating layer 'res5a_branch2a/relu' of type 'ReLU' I1106 16:38:05.372174 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.372177 13539 net.cpp:200] Created Layer res5a_branch2a/relu (34) I1106 16:38:05.372179 13539 net.cpp:572] res5a_branch2a/relu <- res5a_branch2a I1106 16:38:05.372182 13539 net.cpp:527] res5a_branch2a/relu -> res5a_branch2a (in-place) I1106 16:38:05.372186 13539 net.cpp:260] Setting up res5a_branch2a/relu I1106 16:38:05.372189 13539 net.cpp:267] TEST Top shape for layer 34 'res5a_branch2a/relu' 8 512 10 24 (983040) I1106 16:38:05.372191 13539 layer_factory.hpp:172] Creating layer 'res5a_branch2b' of type 'Convolution' I1106 16:38:05.372193 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.372202 13539 net.cpp:200] Created Layer res5a_branch2b (35) I1106 16:38:05.372205 13539 net.cpp:572] res5a_branch2b <- res5a_branch2a I1106 16:38:05.372208 13539 net.cpp:542] res5a_branch2b -> res5a_branch2b I1106 16:38:05.377389 13539 net.cpp:260] Setting up res5a_branch2b I1106 16:38:05.377485 13539 net.cpp:267] TEST Top shape for layer 35 'res5a_branch2b' 8 
512 10 24 (983040) I1106 16:38:05.377522 13539 layer_factory.hpp:172] Creating layer 'res5a_branch2b/bn' of type 'BatchNorm' I1106 16:38:05.377533 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.377554 13539 net.cpp:200] Created Layer res5a_branch2b/bn (36) I1106 16:38:05.377563 13539 net.cpp:572] res5a_branch2b/bn <- res5a_branch2b I1106 16:38:05.377573 13539 net.cpp:527] res5a_branch2b/bn -> res5a_branch2b (in-place) I1106 16:38:05.377930 13539 net.cpp:260] Setting up res5a_branch2b/bn I1106 16:38:05.377938 13539 net.cpp:267] TEST Top shape for layer 36 'res5a_branch2b/bn' 8 512 10 24 (983040) I1106 16:38:05.377943 13539 layer_factory.hpp:172] Creating layer 'res5a_branch2b/relu' of type 'ReLU' I1106 16:38:05.377948 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.377952 13539 net.cpp:200] Created Layer res5a_branch2b/relu (37) I1106 16:38:05.377955 13539 net.cpp:572] res5a_branch2b/relu <- res5a_branch2b I1106 16:38:05.377957 13539 net.cpp:527] res5a_branch2b/relu -> res5a_branch2b (in-place) I1106 16:38:05.377965 13539 net.cpp:260] Setting up res5a_branch2b/relu I1106 16:38:05.377970 13539 net.cpp:267] TEST Top shape for layer 37 'res5a_branch2b/relu' 8 512 10 24 (983040) I1106 16:38:05.377974 13539 layer_factory.hpp:172] Creating layer 'res5a_branch2b_res5a_branch2b/relu_0_split' of type 'Split' I1106 16:38:05.378013 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.378023 13539 net.cpp:200] Created Layer res5a_branch2b_res5a_branch2b/relu_0_split (38) I1106 16:38:05.378031 13539 net.cpp:572] res5a_branch2b_res5a_branch2b/relu_0_split <- res5a_branch2b I1106 16:38:05.378036 13539 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> res5a_branch2b_res5a_branch2b/relu_0_split_0 I1106 16:38:05.378046 13539 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> 
res5a_branch2b_res5a_branch2b/relu_0_split_1 I1106 16:38:05.378078 13539 net.cpp:260] Setting up res5a_branch2b_res5a_branch2b/relu_0_split I1106 16:38:05.378082 13539 net.cpp:267] TEST Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 8 512 10 24 (983040) I1106 16:38:05.378085 13539 net.cpp:267] TEST Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 8 512 10 24 (983040) I1106 16:38:05.378087 13539 layer_factory.hpp:172] Creating layer 'pool6' of type 'Pooling' I1106 16:38:05.378091 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.378101 13539 net.cpp:200] Created Layer pool6 (39) I1106 16:38:05.378103 13539 net.cpp:572] pool6 <- res5a_branch2b_res5a_branch2b/relu_0_split_0 I1106 16:38:05.378108 13539 net.cpp:542] pool6 -> pool6 I1106 16:38:05.378157 13539 net.cpp:260] Setting up pool6 I1106 16:38:05.378161 13539 net.cpp:267] TEST Top shape for layer 39 'pool6' 8 512 5 12 (245760) I1106 16:38:05.378165 13539 layer_factory.hpp:172] Creating layer 'pool6_pool6_0_split' of type 'Split' I1106 16:38:05.378168 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.378172 13539 net.cpp:200] Created Layer pool6_pool6_0_split (40) I1106 16:38:05.378176 13539 net.cpp:572] pool6_pool6_0_split <- pool6 I1106 16:38:05.378178 13539 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_0 I1106 16:38:05.378181 13539 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_1 I1106 16:38:05.378203 13539 net.cpp:260] Setting up pool6_pool6_0_split I1106 16:38:05.378212 13539 net.cpp:267] TEST Top shape for layer 40 'pool6_pool6_0_split' 8 512 5 12 (245760) I1106 16:38:05.378218 13539 net.cpp:267] TEST Top shape for layer 40 'pool6_pool6_0_split' 8 512 5 12 (245760) I1106 16:38:05.378223 13539 layer_factory.hpp:172] Creating layer 'pool7' of type 'Pooling' I1106 16:38:05.378227 13539 layer_factory.hpp:184] Layer's types are 
Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.378232 13539 net.cpp:200] Created Layer pool7 (41) I1106 16:38:05.378239 13539 net.cpp:572] pool7 <- pool6_pool6_0_split_0 I1106 16:38:05.378245 13539 net.cpp:542] pool7 -> pool7 I1106 16:38:05.378275 13539 net.cpp:260] Setting up pool7 I1106 16:38:05.378279 13539 net.cpp:267] TEST Top shape for layer 41 'pool7' 8 512 3 6 (73728) I1106 16:38:05.378288 13539 layer_factory.hpp:172] Creating layer 'pool7_pool7_0_split' of type 'Split' I1106 16:38:05.378291 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.378295 13539 net.cpp:200] Created Layer pool7_pool7_0_split (42) I1106 16:38:05.378301 13539 net.cpp:572] pool7_pool7_0_split <- pool7 I1106 16:38:05.378307 13539 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_0 I1106 16:38:05.378314 13539 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_1 I1106 16:38:05.378340 13539 net.cpp:260] Setting up pool7_pool7_0_split I1106 16:38:05.378345 13539 net.cpp:267] TEST Top shape for layer 42 'pool7_pool7_0_split' 8 512 3 6 (73728) I1106 16:38:05.378346 13539 net.cpp:267] TEST Top shape for layer 42 'pool7_pool7_0_split' 8 512 3 6 (73728) I1106 16:38:05.378355 13539 layer_factory.hpp:172] Creating layer 'pool8' of type 'Pooling' I1106 16:38:05.378360 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.378374 13539 net.cpp:200] Created Layer pool8 (43) I1106 16:38:05.378377 13539 net.cpp:572] pool8 <- pool7_pool7_0_split_0 I1106 16:38:05.378387 13539 net.cpp:542] pool8 -> pool8 I1106 16:38:05.378424 13539 net.cpp:260] Setting up pool8 I1106 16:38:05.378429 13539 net.cpp:267] TEST Top shape for layer 43 'pool8' 8 512 2 3 (24576) I1106 16:38:05.378438 13539 layer_factory.hpp:172] Creating layer 'ctx_output1' of type 'Convolution' I1106 16:38:05.378441 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT 
Bmath:FLOAT I1106 16:38:05.378464 13539 net.cpp:200] Created Layer ctx_output1 (44) I1106 16:38:05.378468 13539 net.cpp:572] ctx_output1 <- res4a_branch2b_res4a_branch2b/relu_0_split_1 I1106 16:38:05.378473 13539 net.cpp:542] ctx_output1 -> ctx_output1 I1106 16:38:05.379216 13539 net.cpp:260] Setting up ctx_output1 I1106 16:38:05.379222 13539 net.cpp:267] TEST Top shape for layer 44 'ctx_output1' 8 256 20 48 (1966080) I1106 16:38:05.379237 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu' of type 'ReLU' I1106 16:38:05.379240 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.379245 13539 net.cpp:200] Created Layer ctx_output1/relu (45) I1106 16:38:05.379247 13539 net.cpp:572] ctx_output1/relu <- ctx_output1 I1106 16:38:05.379251 13539 net.cpp:527] ctx_output1/relu -> ctx_output1 (in-place) I1106 16:38:05.379256 13539 net.cpp:260] Setting up ctx_output1/relu I1106 16:38:05.379259 13539 net.cpp:267] TEST Top shape for layer 45 'ctx_output1/relu' 8 256 20 48 (1966080) I1106 16:38:05.379262 13539 layer_factory.hpp:172] Creating layer 'ctx_output1_ctx_output1/relu_0_split' of type 'Split' I1106 16:38:05.379272 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.379276 13539 net.cpp:200] Created Layer ctx_output1_ctx_output1/relu_0_split (46) I1106 16:38:05.379277 13539 net.cpp:572] ctx_output1_ctx_output1/relu_0_split <- ctx_output1 I1106 16:38:05.379281 13539 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_0 I1106 16:38:05.379285 13539 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_1 I1106 16:38:05.379288 13539 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_2 I1106 16:38:05.379338 13539 net.cpp:260] Setting up ctx_output1_ctx_output1/relu_0_split I1106 16:38:05.379343 13539 net.cpp:267] TEST Top shape for layer 46 
'ctx_output1_ctx_output1/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:05.379346 13539 net.cpp:267] TEST Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:05.379348 13539 net.cpp:267] TEST Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:05.379357 13539 layer_factory.hpp:172] Creating layer 'ctx_output2' of type 'Convolution' I1106 16:38:05.379360 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.379371 13539 net.cpp:200] Created Layer ctx_output2 (47) I1106 16:38:05.379374 13539 net.cpp:572] ctx_output2 <- res5a_branch2b_res5a_branch2b/relu_0_split_1 I1106 16:38:05.379377 13539 net.cpp:542] ctx_output2 -> ctx_output2 I1106 16:38:05.380548 13539 net.cpp:260] Setting up ctx_output2 I1106 16:38:05.380558 13539 net.cpp:267] TEST Top shape for layer 47 'ctx_output2' 8 256 10 24 (491520) I1106 16:38:05.380564 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu' of type 'ReLU' I1106 16:38:05.380574 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.380580 13539 net.cpp:200] Created Layer ctx_output2/relu (48) I1106 16:38:05.380587 13539 net.cpp:572] ctx_output2/relu <- ctx_output2 I1106 16:38:05.380591 13539 net.cpp:527] ctx_output2/relu -> ctx_output2 (in-place) I1106 16:38:05.380596 13539 net.cpp:260] Setting up ctx_output2/relu I1106 16:38:05.380599 13539 net.cpp:267] TEST Top shape for layer 48 'ctx_output2/relu' 8 256 10 24 (491520) I1106 16:38:05.380602 13539 layer_factory.hpp:172] Creating layer 'ctx_output2_ctx_output2/relu_0_split' of type 'Split' I1106 16:38:05.380617 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.380621 13539 net.cpp:200] Created Layer ctx_output2_ctx_output2/relu_0_split (49) I1106 16:38:05.380623 13539 net.cpp:572] ctx_output2_ctx_output2/relu_0_split <- 
ctx_output2 I1106 16:38:05.380626 13539 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_0 I1106 16:38:05.380632 13539 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_1 I1106 16:38:05.380635 13539 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_2 I1106 16:38:05.380673 13539 net.cpp:260] Setting up ctx_output2_ctx_output2/relu_0_split I1106 16:38:05.380678 13539 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 8 256 10 24 (491520) I1106 16:38:05.380681 13539 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 8 256 10 24 (491520) I1106 16:38:05.380689 13539 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 8 256 10 24 (491520) I1106 16:38:05.380693 13539 layer_factory.hpp:172] Creating layer 'ctx_output3' of type 'Convolution' I1106 16:38:05.380697 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.380707 13539 net.cpp:200] Created Layer ctx_output3 (50) I1106 16:38:05.380712 13539 net.cpp:572] ctx_output3 <- pool6_pool6_0_split_1 I1106 16:38:05.380714 13539 net.cpp:542] ctx_output3 -> ctx_output3 I1106 16:38:05.381839 13539 net.cpp:260] Setting up ctx_output3 I1106 16:38:05.381846 13539 net.cpp:267] TEST Top shape for layer 50 'ctx_output3' 8 256 5 12 (122880) I1106 16:38:05.381850 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu' of type 'ReLU' I1106 16:38:05.381852 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.381856 13539 net.cpp:200] Created Layer ctx_output3/relu (51) I1106 16:38:05.381860 13539 net.cpp:572] ctx_output3/relu <- ctx_output3 I1106 16:38:05.381862 13539 net.cpp:527] ctx_output3/relu -> ctx_output3 (in-place) I1106 16:38:05.381866 13539 net.cpp:260] Setting up ctx_output3/relu I1106 16:38:05.381871 13539 
net.cpp:267] TEST Top shape for layer 51 'ctx_output3/relu' 8 256 5 12 (122880) I1106 16:38:05.381873 13539 layer_factory.hpp:172] Creating layer 'ctx_output3_ctx_output3/relu_0_split' of type 'Split' I1106 16:38:05.381876 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.381880 13539 net.cpp:200] Created Layer ctx_output3_ctx_output3/relu_0_split (52) I1106 16:38:05.381881 13539 net.cpp:572] ctx_output3_ctx_output3/relu_0_split <- ctx_output3 I1106 16:38:05.381884 13539 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_0 I1106 16:38:05.381888 13539 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_1 I1106 16:38:05.381891 13539 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_2 I1106 16:38:05.381925 13539 net.cpp:260] Setting up ctx_output3_ctx_output3/relu_0_split I1106 16:38:05.381932 13539 net.cpp:267] TEST Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 8 256 5 12 (122880) I1106 16:38:05.381934 13539 net.cpp:267] TEST Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 8 256 5 12 (122880) I1106 16:38:05.381937 13539 net.cpp:267] TEST Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 8 256 5 12 (122880) I1106 16:38:05.381939 13539 layer_factory.hpp:172] Creating layer 'ctx_output4' of type 'Convolution' I1106 16:38:05.381942 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.381949 13539 net.cpp:200] Created Layer ctx_output4 (53) I1106 16:38:05.381953 13539 net.cpp:572] ctx_output4 <- pool7_pool7_0_split_1 I1106 16:38:05.381956 13539 net.cpp:542] ctx_output4 -> ctx_output4 I1106 16:38:05.383710 13539 net.cpp:260] Setting up ctx_output4 I1106 16:38:05.383719 13539 net.cpp:267] TEST Top shape for layer 53 'ctx_output4' 8 256 3 6 (36864) I1106 16:38:05.383724 13539 layer_factory.hpp:172] 
Creating layer 'ctx_output4/relu' of type 'ReLU' I1106 16:38:05.383728 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.383733 13539 net.cpp:200] Created Layer ctx_output4/relu (54) I1106 16:38:05.383735 13539 net.cpp:572] ctx_output4/relu <- ctx_output4 I1106 16:38:05.383738 13539 net.cpp:527] ctx_output4/relu -> ctx_output4 (in-place) I1106 16:38:05.383744 13539 net.cpp:260] Setting up ctx_output4/relu I1106 16:38:05.383746 13539 net.cpp:267] TEST Top shape for layer 54 'ctx_output4/relu' 8 256 3 6 (36864) I1106 16:38:05.383749 13539 layer_factory.hpp:172] Creating layer 'ctx_output4_ctx_output4/relu_0_split' of type 'Split' I1106 16:38:05.383751 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.383756 13539 net.cpp:200] Created Layer ctx_output4_ctx_output4/relu_0_split (55) I1106 16:38:05.383759 13539 net.cpp:572] ctx_output4_ctx_output4/relu_0_split <- ctx_output4 I1106 16:38:05.383761 13539 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_0 I1106 16:38:05.383766 13539 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_1 I1106 16:38:05.383770 13539 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_2 I1106 16:38:05.383805 13539 net.cpp:260] Setting up ctx_output4_ctx_output4/relu_0_split I1106 16:38:05.383808 13539 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 8 256 3 6 (36864) I1106 16:38:05.383810 13539 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 8 256 3 6 (36864) I1106 16:38:05.383813 13539 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 8 256 3 6 (36864) I1106 16:38:05.383816 13539 layer_factory.hpp:172] Creating layer 'ctx_output5' of type 'Convolution' I1106 16:38:05.383818 13539 layer_factory.hpp:184] Layer's types 
are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.383827 13539 net.cpp:200] Created Layer ctx_output5 (56) I1106 16:38:05.383831 13539 net.cpp:572] ctx_output5 <- pool8 I1106 16:38:05.383833 13539 net.cpp:542] ctx_output5 -> ctx_output5 I1106 16:38:05.384887 13539 net.cpp:260] Setting up ctx_output5 I1106 16:38:05.384896 13539 net.cpp:267] TEST Top shape for layer 56 'ctx_output5' 8 256 2 3 (12288) I1106 16:38:05.384902 13539 layer_factory.hpp:172] Creating layer 'ctx_output5/relu' of type 'ReLU' I1106 16:38:05.384904 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.384909 13539 net.cpp:200] Created Layer ctx_output5/relu (57) I1106 16:38:05.384912 13539 net.cpp:572] ctx_output5/relu <- ctx_output5 I1106 16:38:05.384915 13539 net.cpp:527] ctx_output5/relu -> ctx_output5 (in-place) I1106 16:38:05.384922 13539 net.cpp:260] Setting up ctx_output5/relu I1106 16:38:05.384924 13539 net.cpp:267] TEST Top shape for layer 57 'ctx_output5/relu' 8 256 2 3 (12288) I1106 16:38:05.384938 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc' of type 'Convolution' I1106 16:38:05.384941 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.384955 13539 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc (58) I1106 16:38:05.384958 13539 net.cpp:572] ctx_output1/relu_mbox_loc <- ctx_output1_ctx_output1/relu_0_split_0 I1106 16:38:05.384963 13539 net.cpp:542] ctx_output1/relu_mbox_loc -> ctx_output1/relu_mbox_loc I1106 16:38:05.385165 13539 net.cpp:260] Setting up ctx_output1/relu_mbox_loc I1106 16:38:05.385171 13539 net.cpp:267] TEST Top shape for layer 58 'ctx_output1/relu_mbox_loc' 8 16 20 48 (122880) I1106 16:38:05.385175 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:05.385188 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT 
I1106 16:38:05.385195 13539 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_perm (59) I1106 16:38:05.385198 13539 net.cpp:572] ctx_output1/relu_mbox_loc_perm <- ctx_output1/relu_mbox_loc I1106 16:38:05.385201 13539 net.cpp:542] ctx_output1/relu_mbox_loc_perm -> ctx_output1/relu_mbox_loc_perm I1106 16:38:05.385265 13539 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_perm I1106 16:38:05.385270 13539 net.cpp:267] TEST Top shape for layer 59 'ctx_output1/relu_mbox_loc_perm' 8 20 48 16 (122880) I1106 16:38:05.385272 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:05.385274 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.385278 13539 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_flat (60) I1106 16:38:05.385282 13539 net.cpp:572] ctx_output1/relu_mbox_loc_flat <- ctx_output1/relu_mbox_loc_perm I1106 16:38:05.385284 13539 net.cpp:542] ctx_output1/relu_mbox_loc_flat -> ctx_output1/relu_mbox_loc_flat I1106 16:38:05.385358 13539 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_flat I1106 16:38:05.385363 13539 net.cpp:267] TEST Top shape for layer 60 'ctx_output1/relu_mbox_loc_flat' 8 15360 (122880) I1106 16:38:05.385367 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf' of type 'Convolution' I1106 16:38:05.385370 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.385377 13539 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf (61) I1106 16:38:05.385380 13539 net.cpp:572] ctx_output1/relu_mbox_conf <- ctx_output1_ctx_output1/relu_0_split_1 I1106 16:38:05.385383 13539 net.cpp:542] ctx_output1/relu_mbox_conf -> ctx_output1/relu_mbox_conf I1106 16:38:05.385545 13539 net.cpp:260] Setting up ctx_output1/relu_mbox_conf I1106 16:38:05.385550 13539 net.cpp:267] TEST Top shape for layer 61 'ctx_output1/relu_mbox_conf' 8 8 20 48 (61440) I1106 16:38:05.385555 13539 
layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:05.385558 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.385563 13539 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_perm (62) I1106 16:38:05.385566 13539 net.cpp:572] ctx_output1/relu_mbox_conf_perm <- ctx_output1/relu_mbox_conf I1106 16:38:05.385569 13539 net.cpp:542] ctx_output1/relu_mbox_conf_perm -> ctx_output1/relu_mbox_conf_perm I1106 16:38:05.385627 13539 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_perm I1106 16:38:05.385630 13539 net.cpp:267] TEST Top shape for layer 62 'ctx_output1/relu_mbox_conf_perm' 8 20 48 8 (61440) I1106 16:38:05.385633 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:05.385635 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.385641 13539 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_flat (63) I1106 16:38:05.385643 13539 net.cpp:572] ctx_output1/relu_mbox_conf_flat <- ctx_output1/relu_mbox_conf_perm I1106 16:38:05.385646 13539 net.cpp:542] ctx_output1/relu_mbox_conf_flat -> ctx_output1/relu_mbox_conf_flat I1106 16:38:05.385687 13539 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_flat I1106 16:38:05.385692 13539 net.cpp:267] TEST Top shape for layer 63 'ctx_output1/relu_mbox_conf_flat' 8 7680 (61440) I1106 16:38:05.385695 13539 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:05.385697 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.385707 13539 net.cpp:200] Created Layer ctx_output1/relu_mbox_priorbox (64) I1106 16:38:05.385710 13539 net.cpp:572] ctx_output1/relu_mbox_priorbox <- ctx_output1_ctx_output1/relu_0_split_2 I1106 16:38:05.385713 13539 net.cpp:572] ctx_output1/relu_mbox_priorbox <- data_data_0_split_1 
I1106 16:38:05.385725 13539 net.cpp:542] ctx_output1/relu_mbox_priorbox -> ctx_output1/relu_mbox_priorbox I1106 16:38:05.385741 13539 net.cpp:260] Setting up ctx_output1/relu_mbox_priorbox I1106 16:38:05.385746 13539 net.cpp:267] TEST Top shape for layer 64 'ctx_output1/relu_mbox_priorbox' 1 2 15360 (30720) I1106 16:38:05.385748 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc' of type 'Convolution' I1106 16:38:05.385751 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.385761 13539 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc (65) I1106 16:38:05.385763 13539 net.cpp:572] ctx_output2/relu_mbox_loc <- ctx_output2_ctx_output2/relu_0_split_0 I1106 16:38:05.385766 13539 net.cpp:542] ctx_output2/relu_mbox_loc -> ctx_output2/relu_mbox_loc I1106 16:38:05.385955 13539 net.cpp:260] Setting up ctx_output2/relu_mbox_loc I1106 16:38:05.385962 13539 net.cpp:267] TEST Top shape for layer 65 'ctx_output2/relu_mbox_loc' 8 24 10 24 (46080) I1106 16:38:05.385967 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:05.385969 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.385974 13539 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_perm (66) I1106 16:38:05.385977 13539 net.cpp:572] ctx_output2/relu_mbox_loc_perm <- ctx_output2/relu_mbox_loc I1106 16:38:05.385982 13539 net.cpp:542] ctx_output2/relu_mbox_loc_perm -> ctx_output2/relu_mbox_loc_perm I1106 16:38:05.386037 13539 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_perm I1106 16:38:05.386042 13539 net.cpp:267] TEST Top shape for layer 66 'ctx_output2/relu_mbox_loc_perm' 8 10 24 24 (46080) I1106 16:38:05.386044 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:05.386047 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT 
I1106 16:38:05.386050 13539 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_flat (67) I1106 16:38:05.386054 13539 net.cpp:572] ctx_output2/relu_mbox_loc_flat <- ctx_output2/relu_mbox_loc_perm I1106 16:38:05.386055 13539 net.cpp:542] ctx_output2/relu_mbox_loc_flat -> ctx_output2/relu_mbox_loc_flat I1106 16:38:05.386827 13539 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_flat I1106 16:38:05.386835 13539 net.cpp:267] TEST Top shape for layer 67 'ctx_output2/relu_mbox_loc_flat' 8 5760 (46080) I1106 16:38:05.386839 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf' of type 'Convolution' I1106 16:38:05.386842 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.386853 13539 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf (68) I1106 16:38:05.386857 13539 net.cpp:572] ctx_output2/relu_mbox_conf <- ctx_output2_ctx_output2/relu_0_split_1 I1106 16:38:05.386862 13539 net.cpp:542] ctx_output2/relu_mbox_conf -> ctx_output2/relu_mbox_conf I1106 16:38:05.387046 13539 net.cpp:260] Setting up ctx_output2/relu_mbox_conf I1106 16:38:05.387053 13539 net.cpp:267] TEST Top shape for layer 68 'ctx_output2/relu_mbox_conf' 8 12 10 24 (23040) I1106 16:38:05.387058 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:05.387060 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.387066 13539 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_perm (69) I1106 16:38:05.387069 13539 net.cpp:572] ctx_output2/relu_mbox_conf_perm <- ctx_output2/relu_mbox_conf I1106 16:38:05.387073 13539 net.cpp:542] ctx_output2/relu_mbox_conf_perm -> ctx_output2/relu_mbox_conf_perm I1106 16:38:05.387131 13539 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_perm I1106 16:38:05.387135 13539 net.cpp:267] TEST Top shape for layer 69 'ctx_output2/relu_mbox_conf_perm' 8 10 24 12 (23040) I1106 16:38:05.387138 13539 
layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:05.387149 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.387153 13539 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_flat (70) I1106 16:38:05.387156 13539 net.cpp:572] ctx_output2/relu_mbox_conf_flat <- ctx_output2/relu_mbox_conf_perm I1106 16:38:05.387161 13539 net.cpp:542] ctx_output2/relu_mbox_conf_flat -> ctx_output2/relu_mbox_conf_flat I1106 16:38:05.387835 13539 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_flat I1106 16:38:05.387842 13539 net.cpp:267] TEST Top shape for layer 70 'ctx_output2/relu_mbox_conf_flat' 8 2880 (23040) I1106 16:38:05.387845 13539 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:05.387850 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.387854 13539 net.cpp:200] Created Layer ctx_output2/relu_mbox_priorbox (71) I1106 16:38:05.387857 13539 net.cpp:572] ctx_output2/relu_mbox_priorbox <- ctx_output2_ctx_output2/relu_0_split_2 I1106 16:38:05.387861 13539 net.cpp:572] ctx_output2/relu_mbox_priorbox <- data_data_0_split_2 I1106 16:38:05.387866 13539 net.cpp:542] ctx_output2/relu_mbox_priorbox -> ctx_output2/relu_mbox_priorbox I1106 16:38:05.387884 13539 net.cpp:260] Setting up ctx_output2/relu_mbox_priorbox I1106 16:38:05.387887 13539 net.cpp:267] TEST Top shape for layer 71 'ctx_output2/relu_mbox_priorbox' 1 2 5760 (11520) I1106 16:38:05.387890 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc' of type 'Convolution' I1106 16:38:05.387893 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.387903 13539 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc (72) I1106 16:38:05.387907 13539 net.cpp:572] ctx_output3/relu_mbox_loc <- ctx_output3_ctx_output3/relu_0_split_0 I1106 
16:38:05.387909 13539 net.cpp:542] ctx_output3/relu_mbox_loc -> ctx_output3/relu_mbox_loc I1106 16:38:05.388132 13539 net.cpp:260] Setting up ctx_output3/relu_mbox_loc I1106 16:38:05.388149 13539 net.cpp:267] TEST Top shape for layer 72 'ctx_output3/relu_mbox_loc' 8 24 5 12 (11520) I1106 16:38:05.388157 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:05.388164 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.388173 13539 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_perm (73) I1106 16:38:05.388180 13539 net.cpp:572] ctx_output3/relu_mbox_loc_perm <- ctx_output3/relu_mbox_loc I1106 16:38:05.388185 13539 net.cpp:542] ctx_output3/relu_mbox_loc_perm -> ctx_output3/relu_mbox_loc_perm I1106 16:38:05.388258 13539 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_perm I1106 16:38:05.388267 13539 net.cpp:267] TEST Top shape for layer 73 'ctx_output3/relu_mbox_loc_perm' 8 5 12 24 (11520) I1106 16:38:05.388273 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:05.388279 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.388285 13539 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_flat (74) I1106 16:38:05.388291 13539 net.cpp:572] ctx_output3/relu_mbox_loc_flat <- ctx_output3/relu_mbox_loc_perm I1106 16:38:05.388298 13539 net.cpp:542] ctx_output3/relu_mbox_loc_flat -> ctx_output3/relu_mbox_loc_flat I1106 16:38:05.388739 13539 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_flat I1106 16:38:05.388747 13539 net.cpp:267] TEST Top shape for layer 74 'ctx_output3/relu_mbox_loc_flat' 8 1440 (11520) I1106 16:38:05.388751 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf' of type 'Convolution' I1106 16:38:05.388752 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 
16:38:05.388762 13539 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf (75) I1106 16:38:05.388764 13539 net.cpp:572] ctx_output3/relu_mbox_conf <- ctx_output3_ctx_output3/relu_0_split_1 I1106 16:38:05.388777 13539 net.cpp:542] ctx_output3/relu_mbox_conf -> ctx_output3/relu_mbox_conf I1106 16:38:05.388957 13539 net.cpp:260] Setting up ctx_output3/relu_mbox_conf I1106 16:38:05.388962 13539 net.cpp:267] TEST Top shape for layer 75 'ctx_output3/relu_mbox_conf' 8 12 5 12 (5760) I1106 16:38:05.388967 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:05.388969 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.388974 13539 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_perm (76) I1106 16:38:05.388976 13539 net.cpp:572] ctx_output3/relu_mbox_conf_perm <- ctx_output3/relu_mbox_conf I1106 16:38:05.388979 13539 net.cpp:542] ctx_output3/relu_mbox_conf_perm -> ctx_output3/relu_mbox_conf_perm I1106 16:38:05.389036 13539 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_perm I1106 16:38:05.389041 13539 net.cpp:267] TEST Top shape for layer 76 'ctx_output3/relu_mbox_conf_perm' 8 5 12 12 (5760) I1106 16:38:05.389045 13539 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:05.389046 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.389050 13539 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_flat (77) I1106 16:38:05.389051 13539 net.cpp:572] ctx_output3/relu_mbox_conf_flat <- ctx_output3/relu_mbox_conf_perm I1106 16:38:05.389055 13539 net.cpp:542] ctx_output3/relu_mbox_conf_flat -> ctx_output3/relu_mbox_conf_flat I1106 16:38:05.389086 13539 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_flat I1106 16:38:05.389091 13539 net.cpp:267] TEST Top shape for layer 77 'ctx_output3/relu_mbox_conf_flat' 8 720 (5760) I1106 16:38:05.389093 13539 
layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:05.389104 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.389111 13539 net.cpp:200] Created Layer ctx_output3/relu_mbox_priorbox (78) I1106 16:38:05.389113 13539 net.cpp:572] ctx_output3/relu_mbox_priorbox <- ctx_output3_ctx_output3/relu_0_split_2 I1106 16:38:05.389118 13539 net.cpp:572] ctx_output3/relu_mbox_priorbox <- data_data_0_split_3 I1106 16:38:05.389122 13539 net.cpp:542] ctx_output3/relu_mbox_priorbox -> ctx_output3/relu_mbox_priorbox I1106 16:38:05.389138 13539 net.cpp:260] Setting up ctx_output3/relu_mbox_priorbox I1106 16:38:05.389142 13539 net.cpp:267] TEST Top shape for layer 78 'ctx_output3/relu_mbox_priorbox' 1 2 1440 (2880) I1106 16:38:05.389145 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc' of type 'Convolution' I1106 16:38:05.389147 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.389156 13539 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc (79) I1106 16:38:05.389160 13539 net.cpp:572] ctx_output4/relu_mbox_loc <- ctx_output4_ctx_output4/relu_0_split_0 I1106 16:38:05.389163 13539 net.cpp:542] ctx_output4/relu_mbox_loc -> ctx_output4/relu_mbox_loc I1106 16:38:05.389340 13539 net.cpp:260] Setting up ctx_output4/relu_mbox_loc I1106 16:38:05.389346 13539 net.cpp:267] TEST Top shape for layer 79 'ctx_output4/relu_mbox_loc' 8 16 3 6 (2304) I1106 16:38:05.389351 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:05.389354 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.389360 13539 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_perm (80) I1106 16:38:05.389364 13539 net.cpp:572] ctx_output4/relu_mbox_loc_perm <- ctx_output4/relu_mbox_loc I1106 16:38:05.389366 13539 net.cpp:542] 
ctx_output4/relu_mbox_loc_perm -> ctx_output4/relu_mbox_loc_perm I1106 16:38:05.389425 13539 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_perm I1106 16:38:05.389430 13539 net.cpp:267] TEST Top shape for layer 80 'ctx_output4/relu_mbox_loc_perm' 8 3 6 16 (2304) I1106 16:38:05.389432 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:05.389441 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.389446 13539 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_flat (81) I1106 16:38:05.389449 13539 net.cpp:572] ctx_output4/relu_mbox_loc_flat <- ctx_output4/relu_mbox_loc_perm I1106 16:38:05.389451 13539 net.cpp:542] ctx_output4/relu_mbox_loc_flat -> ctx_output4/relu_mbox_loc_flat I1106 16:38:05.389485 13539 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_flat I1106 16:38:05.389489 13539 net.cpp:267] TEST Top shape for layer 81 'ctx_output4/relu_mbox_loc_flat' 8 288 (2304) I1106 16:38:05.389492 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf' of type 'Convolution' I1106 16:38:05.389494 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.389503 13539 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf (82) I1106 16:38:05.389506 13539 net.cpp:572] ctx_output4/relu_mbox_conf <- ctx_output4_ctx_output4/relu_0_split_1 I1106 16:38:05.389509 13539 net.cpp:542] ctx_output4/relu_mbox_conf -> ctx_output4/relu_mbox_conf I1106 16:38:05.389686 13539 net.cpp:260] Setting up ctx_output4/relu_mbox_conf I1106 16:38:05.389693 13539 net.cpp:267] TEST Top shape for layer 82 'ctx_output4/relu_mbox_conf' 8 8 3 6 (1152) I1106 16:38:05.389696 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:05.389699 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.389705 13539 net.cpp:200] 
Created Layer ctx_output4/relu_mbox_conf_perm (83) I1106 16:38:05.389708 13539 net.cpp:572] ctx_output4/relu_mbox_conf_perm <- ctx_output4/relu_mbox_conf I1106 16:38:05.389711 13539 net.cpp:542] ctx_output4/relu_mbox_conf_perm -> ctx_output4/relu_mbox_conf_perm I1106 16:38:05.389767 13539 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_perm I1106 16:38:05.389772 13539 net.cpp:267] TEST Top shape for layer 83 'ctx_output4/relu_mbox_conf_perm' 8 3 6 8 (1152) I1106 16:38:05.389775 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:05.389778 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.389781 13539 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf_flat (84) I1106 16:38:05.389784 13539 net.cpp:572] ctx_output4/relu_mbox_conf_flat <- ctx_output4/relu_mbox_conf_perm I1106 16:38:05.389788 13539 net.cpp:542] ctx_output4/relu_mbox_conf_flat -> ctx_output4/relu_mbox_conf_flat I1106 16:38:05.389822 13539 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_flat I1106 16:38:05.389827 13539 net.cpp:267] TEST Top shape for layer 84 'ctx_output4/relu_mbox_conf_flat' 8 144 (1152) I1106 16:38:05.389829 13539 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:05.389832 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.389837 13539 net.cpp:200] Created Layer ctx_output4/relu_mbox_priorbox (85) I1106 16:38:05.389839 13539 net.cpp:572] ctx_output4/relu_mbox_priorbox <- ctx_output4_ctx_output4/relu_0_split_2 I1106 16:38:05.389842 13539 net.cpp:572] ctx_output4/relu_mbox_priorbox <- data_data_0_split_4 I1106 16:38:05.389847 13539 net.cpp:542] ctx_output4/relu_mbox_priorbox -> ctx_output4/relu_mbox_priorbox I1106 16:38:05.389858 13539 net.cpp:260] Setting up ctx_output4/relu_mbox_priorbox I1106 16:38:05.389863 13539 net.cpp:267] TEST Top shape for layer 85 
'ctx_output4/relu_mbox_priorbox' 1 2 288 (576) I1106 16:38:05.389864 13539 layer_factory.hpp:172] Creating layer 'mbox_loc' of type 'Concat' I1106 16:38:05.389868 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.389871 13539 net.cpp:200] Created Layer mbox_loc (86) I1106 16:38:05.389874 13539 net.cpp:572] mbox_loc <- ctx_output1/relu_mbox_loc_flat I1106 16:38:05.389883 13539 net.cpp:572] mbox_loc <- ctx_output2/relu_mbox_loc_flat I1106 16:38:05.389887 13539 net.cpp:572] mbox_loc <- ctx_output3/relu_mbox_loc_flat I1106 16:38:05.389890 13539 net.cpp:572] mbox_loc <- ctx_output4/relu_mbox_loc_flat I1106 16:38:05.389894 13539 net.cpp:542] mbox_loc -> mbox_loc I1106 16:38:05.389909 13539 net.cpp:260] Setting up mbox_loc I1106 16:38:05.389914 13539 net.cpp:267] TEST Top shape for layer 86 'mbox_loc' 8 22848 (182784) I1106 16:38:05.389916 13539 layer_factory.hpp:172] Creating layer 'mbox_conf' of type 'Concat' I1106 16:38:05.389919 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.389922 13539 net.cpp:200] Created Layer mbox_conf (87) I1106 16:38:05.389925 13539 net.cpp:572] mbox_conf <- ctx_output1/relu_mbox_conf_flat I1106 16:38:05.389928 13539 net.cpp:572] mbox_conf <- ctx_output2/relu_mbox_conf_flat I1106 16:38:05.389930 13539 net.cpp:572] mbox_conf <- ctx_output3/relu_mbox_conf_flat I1106 16:38:05.389933 13539 net.cpp:572] mbox_conf <- ctx_output4/relu_mbox_conf_flat I1106 16:38:05.389937 13539 net.cpp:542] mbox_conf -> mbox_conf I1106 16:38:05.389950 13539 net.cpp:260] Setting up mbox_conf I1106 16:38:05.389953 13539 net.cpp:267] TEST Top shape for layer 87 'mbox_conf' 8 11424 (91392) I1106 16:38:05.389955 13539 layer_factory.hpp:172] Creating layer 'mbox_priorbox' of type 'Concat' I1106 16:38:05.389958 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.389961 13539 net.cpp:200] Created 
Layer mbox_priorbox (88) I1106 16:38:05.389964 13539 net.cpp:572] mbox_priorbox <- ctx_output1/relu_mbox_priorbox I1106 16:38:05.389966 13539 net.cpp:572] mbox_priorbox <- ctx_output2/relu_mbox_priorbox I1106 16:38:05.389969 13539 net.cpp:572] mbox_priorbox <- ctx_output3/relu_mbox_priorbox I1106 16:38:05.389972 13539 net.cpp:572] mbox_priorbox <- ctx_output4/relu_mbox_priorbox I1106 16:38:05.389974 13539 net.cpp:542] mbox_priorbox -> mbox_priorbox I1106 16:38:05.389987 13539 net.cpp:260] Setting up mbox_priorbox I1106 16:38:05.389991 13539 net.cpp:267] TEST Top shape for layer 88 'mbox_priorbox' 1 2 22848 (45696) I1106 16:38:05.389993 13539 layer_factory.hpp:172] Creating layer 'mbox_conf_reshape' of type 'Reshape' I1106 16:38:05.389997 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.390004 13539 net.cpp:200] Created Layer mbox_conf_reshape (89) I1106 16:38:05.390007 13539 net.cpp:572] mbox_conf_reshape <- mbox_conf I1106 16:38:05.390010 13539 net.cpp:542] mbox_conf_reshape -> mbox_conf_reshape I1106 16:38:05.390027 13539 net.cpp:260] Setting up mbox_conf_reshape I1106 16:38:05.390030 13539 net.cpp:267] TEST Top shape for layer 89 'mbox_conf_reshape' 8 5712 2 (91392) I1106 16:38:05.390033 13539 layer_factory.hpp:172] Creating layer 'mbox_conf_softmax' of type 'Softmax' I1106 16:38:05.390036 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.390044 13539 net.cpp:200] Created Layer mbox_conf_softmax (90) I1106 16:38:05.390048 13539 net.cpp:572] mbox_conf_softmax <- mbox_conf_reshape I1106 16:38:05.390050 13539 net.cpp:542] mbox_conf_softmax -> mbox_conf_softmax I1106 16:38:05.390084 13539 net.cpp:260] Setting up mbox_conf_softmax I1106 16:38:05.390087 13539 net.cpp:267] TEST Top shape for layer 90 'mbox_conf_softmax' 8 5712 2 (91392) I1106 16:38:05.390091 13539 layer_factory.hpp:172] Creating layer 'mbox_conf_flatten' of type 'Flatten' I1106 
16:38:05.390094 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.390096 13539 net.cpp:200] Created Layer mbox_conf_flatten (91) I1106 16:38:05.390100 13539 net.cpp:572] mbox_conf_flatten <- mbox_conf_softmax I1106 16:38:05.390101 13539 net.cpp:542] mbox_conf_flatten -> mbox_conf_flatten I1106 16:38:05.390151 13539 net.cpp:260] Setting up mbox_conf_flatten I1106 16:38:05.390156 13539 net.cpp:267] TEST Top shape for layer 91 'mbox_conf_flatten' 8 11424 (91392) I1106 16:38:05.390163 13539 layer_factory.hpp:172] Creating layer 'detection_out' of type 'DetectionOutput' I1106 16:38:05.390166 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.390182 13539 net.cpp:200] Created Layer detection_out (92) I1106 16:38:05.390185 13539 net.cpp:572] detection_out <- mbox_loc I1106 16:38:05.390188 13539 net.cpp:572] detection_out <- mbox_conf_flatten I1106 16:38:05.390192 13539 net.cpp:572] detection_out <- mbox_priorbox I1106 16:38:05.390194 13539 net.cpp:542] detection_out -> detection_out I1106 16:38:05.390288 13539 net.cpp:260] Setting up detection_out I1106 16:38:05.390293 13539 net.cpp:267] TEST Top shape for layer 92 'detection_out' 1 1 1 7 (7) I1106 16:38:05.390296 13539 layer_factory.hpp:172] Creating layer 'detection_eval' of type 'DetectionEvaluate' I1106 16:38:05.390300 13539 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:05.390305 13539 net.cpp:200] Created Layer detection_eval (93) I1106 16:38:05.390308 13539 net.cpp:572] detection_eval <- detection_out I1106 16:38:05.390311 13539 net.cpp:572] detection_eval <- label I1106 16:38:05.390313 13539 net.cpp:542] detection_eval -> detection_eval I1106 16:38:05.390348 13539 net.cpp:260] Setting up detection_eval I1106 16:38:05.390355 13539 net.cpp:267] TEST Top shape for layer 93 'detection_eval' 1 1 2 5 (10) I1106 16:38:05.390357 13539 net.cpp:338] 
detection_eval does not need backward computation. I1106 16:38:05.390360 13539 net.cpp:338] detection_out does not need backward computation. I1106 16:38:05.390363 13539 net.cpp:338] mbox_conf_flatten does not need backward computation. I1106 16:38:05.390365 13539 net.cpp:338] mbox_conf_softmax does not need backward computation. I1106 16:38:05.390367 13539 net.cpp:338] mbox_conf_reshape does not need backward computation. I1106 16:38:05.390370 13539 net.cpp:338] mbox_priorbox does not need backward computation. I1106 16:38:05.390372 13539 net.cpp:338] mbox_conf does not need backward computation. I1106 16:38:05.390375 13539 net.cpp:338] mbox_loc does not need backward computation. I1106 16:38:05.390378 13539 net.cpp:338] ctx_output4/relu_mbox_priorbox does not need backward computation. I1106 16:38:05.390383 13539 net.cpp:338] ctx_output4/relu_mbox_conf_flat does not need backward computation. I1106 16:38:05.390384 13539 net.cpp:338] ctx_output4/relu_mbox_conf_perm does not need backward computation. I1106 16:38:05.390386 13539 net.cpp:338] ctx_output4/relu_mbox_conf does not need backward computation. I1106 16:38:05.390389 13539 net.cpp:338] ctx_output4/relu_mbox_loc_flat does not need backward computation. I1106 16:38:05.390393 13539 net.cpp:338] ctx_output4/relu_mbox_loc_perm does not need backward computation. I1106 16:38:05.390394 13539 net.cpp:338] ctx_output4/relu_mbox_loc does not need backward computation. I1106 16:38:05.390398 13539 net.cpp:338] ctx_output3/relu_mbox_priorbox does not need backward computation. I1106 16:38:05.390399 13539 net.cpp:338] ctx_output3/relu_mbox_conf_flat does not need backward computation. I1106 16:38:05.390403 13539 net.cpp:338] ctx_output3/relu_mbox_conf_perm does not need backward computation. I1106 16:38:05.390405 13539 net.cpp:338] ctx_output3/relu_mbox_conf does not need backward computation. I1106 16:38:05.390408 13539 net.cpp:338] ctx_output3/relu_mbox_loc_flat does not need backward computation. 
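The `detection_out` top reported above has shape 1 1 1 7: in the standard SSD `DetectionOutput` convention, each length-7 row is `[image_id, label, score, xmin, ymin, xmax, ymax]`, with corner coordinates normalized to [0, 1] of the input size (here 768x320). A minimal decoding sketch — the sample detection values below are invented purely for illustration:

```python
# Decode rows of an SSD DetectionOutput blob (shape 1 x 1 x N x 7).
# Row layout: [image_id, label, score, xmin, ymin, xmax, ymax],
# corners normalized to [0, 1]. Sample values are made up.

def decode_detections(rows, img_w, img_h, conf_thresh=0.5):
    """Return (label, score, pixel box) tuples above conf_thresh."""
    results = []
    for image_id, label, score, xmin, ymin, xmax, ymax in rows:
        if score < conf_thresh:
            continue
        box = (int(xmin * img_w), int(ymin * img_h),
               int(xmax * img_w), int(ymax * img_h))
        results.append((int(label), score, box))
    return results

# One fabricated detection on a 768x320 input:
sample = [[0, 1, 0.92, 0.10, 0.25, 0.30, 0.75]]
print(decode_detections(sample, 768, 320))
# [(1, 0.92, (76, 80, 230, 240))]
```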
I1106 16:38:05.390409 13539 net.cpp:338] ctx_output3/relu_mbox_loc_perm does not need backward computation. I1106 16:38:05.390413 13539 net.cpp:338] ctx_output3/relu_mbox_loc does not need backward computation. I1106 16:38:05.390414 13539 net.cpp:338] ctx_output2/relu_mbox_priorbox does not need backward computation. I1106 16:38:05.390417 13539 net.cpp:338] ctx_output2/relu_mbox_conf_flat does not need backward computation. I1106 16:38:05.390420 13539 net.cpp:338] ctx_output2/relu_mbox_conf_perm does not need backward computation. I1106 16:38:05.390424 13539 net.cpp:338] ctx_output2/relu_mbox_conf does not need backward computation. I1106 16:38:05.390431 13539 net.cpp:338] ctx_output2/relu_mbox_loc_flat does not need backward computation. I1106 16:38:05.390434 13539 net.cpp:338] ctx_output2/relu_mbox_loc_perm does not need backward computation. I1106 16:38:05.390436 13539 net.cpp:338] ctx_output2/relu_mbox_loc does not need backward computation. I1106 16:38:05.390439 13539 net.cpp:338] ctx_output1/relu_mbox_priorbox does not need backward computation. I1106 16:38:05.390442 13539 net.cpp:338] ctx_output1/relu_mbox_conf_flat does not need backward computation. I1106 16:38:05.390445 13539 net.cpp:338] ctx_output1/relu_mbox_conf_perm does not need backward computation. I1106 16:38:05.390447 13539 net.cpp:338] ctx_output1/relu_mbox_conf does not need backward computation. I1106 16:38:05.390450 13539 net.cpp:338] ctx_output1/relu_mbox_loc_flat does not need backward computation. I1106 16:38:05.390452 13539 net.cpp:338] ctx_output1/relu_mbox_loc_perm does not need backward computation. I1106 16:38:05.390455 13539 net.cpp:338] ctx_output1/relu_mbox_loc does not need backward computation. I1106 16:38:05.390458 13539 net.cpp:338] ctx_output5/relu does not need backward computation. I1106 16:38:05.390461 13539 net.cpp:338] ctx_output5 does not need backward computation. 
I1106 16:38:05.390465 13539 net.cpp:338] ctx_output4_ctx_output4/relu_0_split does not need backward computation. I1106 16:38:05.390467 13539 net.cpp:338] ctx_output4/relu does not need backward computation. I1106 16:38:05.390470 13539 net.cpp:338] ctx_output4 does not need backward computation. I1106 16:38:05.390472 13539 net.cpp:338] ctx_output3_ctx_output3/relu_0_split does not need backward computation. I1106 16:38:05.390475 13539 net.cpp:338] ctx_output3/relu does not need backward computation. I1106 16:38:05.390478 13539 net.cpp:338] ctx_output3 does not need backward computation. I1106 16:38:05.390480 13539 net.cpp:338] ctx_output2_ctx_output2/relu_0_split does not need backward computation. I1106 16:38:05.390483 13539 net.cpp:338] ctx_output2/relu does not need backward computation. I1106 16:38:05.390486 13539 net.cpp:338] ctx_output2 does not need backward computation. I1106 16:38:05.390488 13539 net.cpp:338] ctx_output1_ctx_output1/relu_0_split does not need backward computation. I1106 16:38:05.390491 13539 net.cpp:338] ctx_output1/relu does not need backward computation. I1106 16:38:05.390493 13539 net.cpp:338] ctx_output1 does not need backward computation. I1106 16:38:05.390496 13539 net.cpp:338] pool8 does not need backward computation. I1106 16:38:05.390499 13539 net.cpp:338] pool7_pool7_0_split does not need backward computation. I1106 16:38:05.390501 13539 net.cpp:338] pool7 does not need backward computation. I1106 16:38:05.390504 13539 net.cpp:338] pool6_pool6_0_split does not need backward computation. I1106 16:38:05.390507 13539 net.cpp:338] pool6 does not need backward computation. I1106 16:38:05.390509 13539 net.cpp:338] res5a_branch2b_res5a_branch2b/relu_0_split does not need backward computation. I1106 16:38:05.390513 13539 net.cpp:338] res5a_branch2b/relu does not need backward computation. I1106 16:38:05.390516 13539 net.cpp:338] res5a_branch2b/bn does not need backward computation. 
I1106 16:38:05.390518 13539 net.cpp:338] res5a_branch2b does not need backward computation. I1106 16:38:05.390520 13539 net.cpp:338] res5a_branch2a/relu does not need backward computation. I1106 16:38:05.390523 13539 net.cpp:338] res5a_branch2a/bn does not need backward computation. I1106 16:38:05.390527 13539 net.cpp:338] res5a_branch2a does not need backward computation. I1106 16:38:05.390528 13539 net.cpp:338] pool4 does not need backward computation. I1106 16:38:05.390530 13539 net.cpp:338] res4a_branch2b_res4a_branch2b/relu_0_split does not need backward computation. I1106 16:38:05.390533 13539 net.cpp:338] res4a_branch2b/relu does not need backward computation. I1106 16:38:05.390537 13539 net.cpp:338] res4a_branch2b/bn does not need backward computation. I1106 16:38:05.390538 13539 net.cpp:338] res4a_branch2b does not need backward computation. I1106 16:38:05.390544 13539 net.cpp:338] res4a_branch2a/relu does not need backward computation. I1106 16:38:05.390547 13539 net.cpp:338] res4a_branch2a/bn does not need backward computation. I1106 16:38:05.390550 13539 net.cpp:338] res4a_branch2a does not need backward computation. I1106 16:38:05.390552 13539 net.cpp:338] pool3 does not need backward computation. I1106 16:38:05.390555 13539 net.cpp:338] res3a_branch2b/relu does not need backward computation. I1106 16:38:05.390558 13539 net.cpp:338] res3a_branch2b/bn does not need backward computation. I1106 16:38:05.390560 13539 net.cpp:338] res3a_branch2b does not need backward computation. I1106 16:38:05.390563 13539 net.cpp:338] res3a_branch2a/relu does not need backward computation. I1106 16:38:05.390565 13539 net.cpp:338] res3a_branch2a/bn does not need backward computation. I1106 16:38:05.390568 13539 net.cpp:338] res3a_branch2a does not need backward computation. I1106 16:38:05.390570 13539 net.cpp:338] pool2 does not need backward computation. I1106 16:38:05.390573 13539 net.cpp:338] res2a_branch2b/relu does not need backward computation. 
I1106 16:38:05.390574 13539 net.cpp:338] res2a_branch2b/bn does not need backward computation. I1106 16:38:05.390576 13539 net.cpp:338] res2a_branch2b does not need backward computation. I1106 16:38:05.390579 13539 net.cpp:338] res2a_branch2a/relu does not need backward computation. I1106 16:38:05.390581 13539 net.cpp:338] res2a_branch2a/bn does not need backward computation. I1106 16:38:05.390583 13539 net.cpp:338] res2a_branch2a does not need backward computation. I1106 16:38:05.390585 13539 net.cpp:338] pool1 does not need backward computation. I1106 16:38:05.390588 13539 net.cpp:338] conv1b/relu does not need backward computation. I1106 16:38:05.390590 13539 net.cpp:338] conv1b/bn does not need backward computation. I1106 16:38:05.390592 13539 net.cpp:338] conv1b does not need backward computation. I1106 16:38:05.390594 13539 net.cpp:338] conv1a/relu does not need backward computation. I1106 16:38:05.390596 13539 net.cpp:338] conv1a/bn does not need backward computation. I1106 16:38:05.390599 13539 net.cpp:338] conv1a does not need backward computation. I1106 16:38:05.390601 13539 net.cpp:338] data/bias does not need backward computation. I1106 16:38:05.390604 13539 net.cpp:338] data_data_0_split does not need backward computation. I1106 16:38:05.390607 13539 net.cpp:338] data does not need backward computation. 
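The blob sizes printed during setup (`mbox_loc` 22848, `mbox_conf` 11424, `mbox_priorbox` 2 x 22848) can be reproduced from the head configuration. A sketch of that bookkeeping, where the per-head strides (ctx_output1..4 tapping the backbone at /16, /32, /64, /128) are inferred from the pooling topology — an assumption, not stated explicitly in the log — and the priors-per-cell counts come from the loc-conv channel counts 16/24/24/16 (= n priors x 4 coords):

```python
import math

# Reconstruct the prior-box counts behind the logged concat shapes.
W, H = 768, 320                     # resize_width x resize_height
strides = [16, 32, 64, 128]         # assumed downsampling of the 4 heads
priors_per_cell = [4, 6, 6, 4]      # from loc channels 16/24/24/16 (n*4)
num_classes = 2

total_priors = 0
for s, n in zip(strides, priors_per_cell):
    fw, fh = math.ceil(W / s), math.ceil(H / s)  # max-pooling rounds up
    total_priors += fw * fh * n

print(total_priors)                 # 5712 prior boxes
print(total_priors * 4)             # mbox_loc length: 22848
print(total_priors * num_classes)   # mbox_conf length: 11424
```

The 4/6/6/4 split matches the aspect-ratio lists `[[2], [2, 3], [2, 3], [2]]` printed at startup: with `flip: true`, each head gets 2 + 2 x len(ARs) priors per cell.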
I1106 16:38:05.390609 13539 net.cpp:380] This network produces output ctx_output5
I1106 16:38:05.390611 13539 net.cpp:380] This network produces output detection_eval
I1106 16:38:05.390671 13539 net.cpp:403] Top memory (TEST) required for data: 1011843208 diff: 1011843208
I1106 16:38:05.390676 13539 net.cpp:406] Bottom memory (TEST) required for data: 1011794016 diff: 1011794016
I1106 16:38:05.390676 13539 net.cpp:409] Shared (in-place) memory (TEST) by data: 498106368 diff: 498106368
I1106 16:38:05.390679 13539 net.cpp:412] Parameters memory (TEST) required for data: 11946688 diff: 11946688
I1106 16:38:05.390681 13539 net.cpp:415] Parameters shared memory (TEST) by data: 0 diff: 0
I1106 16:38:05.390683 13539 net.cpp:421] Network initialization done.
I1106 16:38:05.390846 13539 solver.cpp:55] Solver scaffolding done.
I1106 16:38:05.393054 13539 caffe.cpp:158] Finetuning from training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/initial/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel
F1106 16:38:05.393105 13539 io.cpp:55] Check failed: fd != -1 (-1 vs. -1) File not found: training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/initial/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel
*** Check failure stack trace: ***
    @     0x7ff182c9e5cd  google::LogMessage::Fail()
    @     0x7ff182ca0433  google::LogMessage::SendToLog()
    @     0x7ff182c9e15b  google::LogMessage::Flush()
    @     0x7ff182ca0e1e  google::LogMessageFatal::~LogMessageFatal()
    @     0x7ff183cac6dc  caffe::ReadProtoFromBinaryFile()
    @     0x7ff183d24f56  caffe::ReadNetParamsFromBinaryFileOrDie()
    @     0x7ff18385b88a  caffe::Net::CopyTrainedLayersFromBinaryProto()
    @     0x7ff18385b92e  caffe::Net::CopyTrainedLayersFrom()
    @     0x40f889  CopyLayers()
    @     0x410616  train()
    @     0x40d1f0  main
    @     0x7ff181420830  __libc_start_main
    @     0x40de89  _start
    @              (nil)  (unknown)
I1106 16:38:05.933686 13573 caffe.cpp:902] This is NVCaffe 0.17.0 started at Wed Nov 6 16:38:05 2019
I1106 16:38:05.933825 13573 caffe.cpp:904] CuDNN version: 7601
I1106 16:38:05.933828 13573 caffe.cpp:905] CuBLAS version: 10201
I1106 16:38:05.933830 13573 caffe.cpp:906] CUDA version: 10010
I1106 16:38:05.933831 13573 caffe.cpp:907] CUDA driver version: 10010
I1106 16:38:05.933835 13573 caffe.cpp:908] Arguments: [0]: /home/liuyuyuan/caffe-jacinto/build/tools/caffe.bin [1]: train [2]: --solver=training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse/solver.prototxt [3]: --weights=training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/l1reg/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel [4]: --gpu [5]: 0
I1106 16:38:05.987710 13573 gpu_memory.cpp:105] GPUMemory::Manager initialized
I1106 16:38:05.988121 13573 gpu_memory.cpp:107] Total memory: 6193479680, Free: 3152805888, dev_info[0]: total=6193479680 free=3152805888
I1106 16:38:05.988126 13573 caffe.cpp:226] Using GPUs 0
I1106 16:38:05.988363 13573 caffe.cpp:230] GPU 0: GeForce GTX 1660 Ti
I1106 16:38:05.988421 13573 solver.cpp:41] Solver data type: FLOAT
I1106 16:38:05.995620 13573 solver.cpp:44] Initializing solver from parameters: train_net:
"training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse/train.prototxt" test_net: "training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse/test.prototxt" test_iter: 3 test_interval: 2000 base_lr: 0.001 display: 100 max_iter: 120000 lr_policy: "poly" gamma: 0.1 power: 4 momentum: 0.9 weight_decay: 1e-05 snapshot: 2000 snapshot_prefix: "training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse/ti-custom-cfg1_ssdJacintoNetV2" solver_mode: GPU device_id: 0 random_seed: 33 debug_info: false train_state { level: 0 stage: "" } snapshot_after_train: true regularization_type: "L1" test_initialization: true average_loss: 10 stepvalue: 30000 stepvalue: 45000 stepvalue: 300000 iter_size: 32 type: "SGD" display_sparsity: 2000 sparse_mode: SPARSE_UPDATE sparsity_target: 0.7 sparsity_step_factor: 0.05 sparsity_step_iter: 2000 sparsity_start_iter: 0 sparsity_start_factor: 0.5 sparsity_threshold_maxratio: 0.2 sparsity_itr_increment_bfr_applying: true sparsity_threshold_value_max: 0.2 eval_type: "detection" ap_version: "11point" show_per_class_result: true I1106 16:38:05.995790 13573 solver.cpp:76] Creating training net from train_net file: training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse/train.prototxt I1106 16:38:05.996903 13573 net.cpp:80] Initializing net from parameters: name: "ssdJacintoNetV2" state { phase: TRAIN level: 0 stage: "" } layer { name: "data" type: "AnnotatedData" top: "data" top: "label" include { phase: TRAIN } transform_param { mirror: true mean_value: 0 mean_value: 0 mean_value: 0 force_color: false resize_param { prob: 1 resize_mode: WARP height: 320 width: 768 interp_mode: LINEAR interp_mode: AREA interp_mode: NEAREST interp_mode: CUBIC interp_mode: LANCZOS4 } emit_constraint { emit_type: CENTER } crop_h: 320 crop_w: 768 distort_param { brightness_prob: 0.5 brightness_delta: 32 contrast_prob: 0.5 contrast_lower: 0.5 contrast_upper: 1.5 hue_prob: 0.5 hue_delta: 18 
saturation_prob: 0.5 saturation_lower: 0.5 saturation_upper: 1.5 random_order_prob: 0 } expand_param { prob: 0.5 max_expand_ratio: 4 } } data_param { source: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb" batch_size: 1 backend: LMDB threads: 4 parser_threads: 4 } annotated_data_param { batch_sampler { max_sample: 1 max_trials: 1 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 0.1 } max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 0.3 } max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 0.5 } max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 0.7 } max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { min_jaccard_overlap: 0.9 } max_sample: 1 max_trials: 50 } batch_sampler { sampler { min_scale: 0.3 max_scale: 1 min_aspect_ratio: 0.5 max_aspect_ratio: 2 } sample_constraint { max_jaccard_overlap: 1 } max_sample: 1 max_trials: 50 } label_map_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt" } } layer { name: "data/bias" type: "Bias" bottom: "data" top: "data/bias" param { lr_mult: 0 decay_mult: 0 } bias_param { filler { type: "constant" value: -128 } } } layer { name: "conv1a" type: "Convolution" bottom: "data/bias" top: "conv1a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 2 kernel_size: 5 group: 1 stride: 2 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } 
dilation: 1 } } layer { name: "conv1a/bn" type: "BatchNorm" bottom: "conv1a" top: "conv1a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "conv1a/relu" type: "ReLU" bottom: "conv1a" top: "conv1a" } layer { name: "conv1b" type: "Convolution" bottom: "conv1a" top: "conv1b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "conv1b/bn" type: "BatchNorm" bottom: "conv1b" top: "conv1b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "conv1b/relu" type: "ReLU" bottom: "conv1b" top: "conv1b" } layer { name: "pool1" type: "Pooling" bottom: "conv1b" top: "pool1" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res2a_branch2a" type: "Convolution" bottom: "pool1" top: "res2a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res2a_branch2a/bn" type: "BatchNorm" bottom: "res2a_branch2a" top: "res2a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res2a_branch2a/relu" type: "ReLU" bottom: "res2a_branch2a" top: "res2a_branch2a" } layer { name: "res2a_branch2b" type: "Convolution" bottom: "res2a_branch2a" top: "res2a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res2a_branch2b/bn" type: "BatchNorm" bottom: "res2a_branch2b" top: "res2a_branch2b" 
batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res2a_branch2b/relu" type: "ReLU" bottom: "res2a_branch2b" top: "res2a_branch2b" } layer { name: "pool2" type: "Pooling" bottom: "res2a_branch2b" top: "pool2" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res3a_branch2a" type: "Convolution" bottom: "pool2" top: "res3a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res3a_branch2a/bn" type: "BatchNorm" bottom: "res3a_branch2a" top: "res3a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res3a_branch2a/relu" type: "ReLU" bottom: "res3a_branch2a" top: "res3a_branch2a" } layer { name: "res3a_branch2b" type: "Convolution" bottom: "res3a_branch2a" top: "res3a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res3a_branch2b/bn" type: "BatchNorm" bottom: "res3a_branch2b" top: "res3a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res3a_branch2b/relu" type: "ReLU" bottom: "res3a_branch2b" top: "res3a_branch2b" } layer { name: "pool3" type: "Pooling" bottom: "res3a_branch2b" top: "pool3" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res4a_branch2a" type: "Convolution" bottom: "pool3" top: "res4a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: 
"constant" value: 0 } dilation: 1 } } layer { name: "res4a_branch2a/bn" type: "BatchNorm" bottom: "res4a_branch2a" top: "res4a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res4a_branch2a/relu" type: "ReLU" bottom: "res4a_branch2a" top: "res4a_branch2a" } layer { name: "res4a_branch2b" type: "Convolution" bottom: "res4a_branch2a" top: "res4a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res4a_branch2b/bn" type: "BatchNorm" bottom: "res4a_branch2b" top: "res4a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res4a_branch2b/relu" type: "ReLU" bottom: "res4a_branch2b" top: "res4a_branch2b" } layer { name: "pool4" type: "Pooling" bottom: "res4a_branch2b" top: "pool4" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res5a_branch2a" type: "Convolution" bottom: "pool4" top: "res5a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res5a_branch2a/bn" type: "BatchNorm" bottom: "res5a_branch2a" top: "res5a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res5a_branch2a/relu" type: "ReLU" bottom: "res5a_branch2a" top: "res5a_branch2a" } layer { name: "res5a_branch2b" type: "Convolution" bottom: "res5a_branch2a" top: "res5a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } 
bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res5a_branch2b/bn" type: "BatchNorm" bottom: "res5a_branch2b" top: "res5a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res5a_branch2b/relu" type: "ReLU" bottom: "res5a_branch2b" top: "res5a_branch2b" } layer { name: "pool6" type: "Pooling" bottom: "res5a_branch2b" top: "pool6" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "pool7" type: "Pooling" bottom: "pool6" top: "pool7" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "pool8" type: "Pooling" bottom: "pool7" top: "pool8" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "ctx_output1" type: "Convolution" bottom: "res4a_branch2b" top: "ctx_output1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu" type: "ReLU" bottom: "ctx_output1" top: "ctx_output1" } layer { name: "ctx_output2" type: "Convolution" bottom: "res5a_branch2b" top: "ctx_output2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu" type: "ReLU" bottom: "ctx_output2" top: "ctx_output2" } layer { name: "ctx_output3" type: "Convolution" bottom: "pool6" top: "ctx_output3" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu" type: "ReLU" bottom: 
"ctx_output3" top: "ctx_output3" } layer { name: "ctx_output4" type: "Convolution" bottom: "pool7" top: "ctx_output4" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu" type: "ReLU" bottom: "ctx_output4" top: "ctx_output4" } layer { name: "ctx_output5" type: "Convolution" bottom: "pool8" top: "ctx_output5" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output5/relu" type: "ReLU" bottom: "ctx_output5" top: "ctx_output5" } layer { name: "ctx_output1/relu_mbox_loc" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_loc" top: "ctx_output1/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output1/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_loc_perm" top: "ctx_output1/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output1/relu_mbox_conf" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: 
"ctx_output1/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_conf" top: "ctx_output1/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output1/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_conf_perm" top: "ctx_output1/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output1/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output1" bottom: "data" top: "ctx_output1/relu_mbox_priorbox" prior_box_param { min_size: 14.72 max_size: 36.8 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output2/relu_mbox_loc" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_loc" top: "ctx_output2/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output2/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_loc_perm" top: "ctx_output2/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output2/relu_mbox_conf" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_conf" top: "ctx_output2/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output2/relu_mbox_conf_flat" 
type: "Flatten" bottom: "ctx_output2/relu_mbox_conf_perm" top: "ctx_output2/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output2/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output2" bottom: "data" top: "ctx_output2/relu_mbox_priorbox" prior_box_param { min_size: 36.8 max_size: 132.48 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output3/relu_mbox_loc" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_loc" top: "ctx_output3/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output3/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_loc_perm" top: "ctx_output3/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output3/relu_mbox_conf" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_conf" top: "ctx_output3/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output3/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_conf_perm" top: "ctx_output3/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output3/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output3" 
bottom: "data" top: "ctx_output3/relu_mbox_priorbox" prior_box_param { min_size: 132.48 max_size: 228.16 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output4/relu_mbox_loc" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output4/relu_mbox_loc" top: "ctx_output4/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output4/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_loc_perm" top: "ctx_output4/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output4/relu_mbox_conf" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output4/relu_mbox_conf" top: "ctx_output4/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output4/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_conf_perm" top: "ctx_output4/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output4/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output4" bottom: "data" top: "ctx_output4/relu_mbox_priorbox" prior_box_param { min_size: 228.16 max_size: 323.84 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 
}
}
layer {
  name: "mbox_loc"
  type: "Concat"
  bottom: "ctx_output1/relu_mbox_loc_flat"
  bottom: "ctx_output2/relu_mbox_loc_flat"
  bottom: "ctx_output3/relu_mbox_loc_flat"
  bottom: "ctx_output4/relu_mbox_loc_flat"
  top: "mbox_loc"
  concat_param { axis: 1 }
}
layer {
  name: "mbox_conf"
  type: "Concat"
  bottom: "ctx_output1/relu_mbox_conf_flat"
  bottom: "ctx_output2/relu_mbox_conf_flat"
  bottom: "ctx_output3/relu_mbox_conf_flat"
  bottom: "ctx_output4/relu_mbox_conf_flat"
  top: "mbox_conf"
  concat_param { axis: 1 }
}
layer {
  name: "mbox_priorbox"
  type: "Concat"
  bottom: "ctx_output1/relu_mbox_priorbox"
  bottom: "ctx_output2/relu_mbox_priorbox"
  bottom: "ctx_output3/relu_mbox_priorbox"
  bottom: "ctx_output4/relu_mbox_priorbox"
  top: "mbox_priorbox"
  concat_param { axis: 2 }
}
layer {
  name: "mbox_loss"
  type: "MultiBoxLoss"
  bottom: "mbox_loc"
  bottom: "mbox_conf"
  bottom: "mbox_priorbox"
  bottom: "label"
  top: "mbox_loss"
  include { phase: TRAIN }
  propagate_down: true
  propagate_down: true
  propagate_down: false
  propagate_down: false
  loss_param { normalization: VALID }
  multibox_loss_param {
    loc_loss_type: SMOOTH_L1
    conf_loss_type: SOFTMAX
    loc_weight: 1
    num_classes: 2
    share_location: true
    match_type: PER_PREDICTION
    overlap_threshold: 0.5
    use_prior_for_matching: true
    background_label_id: 0
    use_difficult_gt: true
    neg_pos_ratio: 3
    neg_overlap: 0.5
    code_type: CENTER_SIZE
    ignore_cross_boundary_bbox: false
    mining_type: MAX_NEGATIVE
    ignore_difficult_gt: false
  }
}
I1106 16:38:05.997175 13573 net.cpp:110] Using FLOAT as default forward math type
I1106 16:38:05.997182 13573 net.cpp:116] Using FLOAT as default backward math type
I1106 16:38:05.997186 13573 layer_factory.hpp:172] Creating layer 'data' of type 'AnnotatedData'
I1106 16:38:05.997197 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:05.997267 13573 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0
I1106 16:38:05.997740 13586 blocking_queue.cpp:40] Data layer prefetch queue empty
I1106 16:38:05.998077 13573 net.cpp:200] Created Layer data (0)
I1106 16:38:05.998090 13573 net.cpp:542] data -> data
I1106 16:38:05.998112 13573 net.cpp:542] data -> label
I1106 16:38:05.998145 13573 data_reader.cpp:58] Data Reader threads: 4, out queues: 16, depth: 1
I1106 16:38:05.998183 13573 internal_thread.cpp:19] Starting 4 internal thread(s) on device 0
I1106 16:38:05.998701 13588 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb
I1106 16:38:06.001612 13590 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb
I1106 16:38:06.001592 13573 annotated_data_layer.cpp:105] output data size: 1,3,320,768
I1106 16:38:06.001932 13573 annotated_data_layer.cpp:150] [0] Output data size: 1, 3, 320, 768
I1106 16:38:06.001979 13587 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb
I1106 16:38:06.002282 13589 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_trainval_lmdb
I1106 16:38:06.002298 13573 internal_thread.cpp:19] Starting 4 internal thread(s) on device 0
I1106 16:38:06.003851 13573 net.cpp:260] Setting up data
I1106 16:38:06.003892 13573 net.cpp:267] TRAIN Top shape for layer 0 'data' 1 3 320 768 (737280)
I1106 16:38:06.003906 13573 net.cpp:267] TRAIN Top shape for layer 0 'data' 1 1 5 8 (40)
I1106 16:38:06.003916 13573 layer_factory.hpp:172] Creating layer 'data_data_0_split' of type 'Split'
I1106 16:38:06.003927 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:06.003952 13573 net.cpp:200] Created Layer data_data_0_split (1)
I1106 16:38:06.003959 13573 net.cpp:572] data_data_0_split <- data
I1106 16:38:06.003983 13573 net.cpp:542] data_data_0_split -> data_data_0_split_0
I1106 16:38:06.003989 13573 net.cpp:542] data_data_0_split -> data_data_0_split_1 I1106
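A side note on the head dimensions in the prototxt above: the `num_output` of each `*_mbox_loc`/`*_mbox_conf` 1x1 convolution follows from the per-cell prior count of the matching PriorBox layer. This is a minimal sketch of that arithmetic, assuming the standard SSD convention (one aspect-ratio-1 box at `min_size`, one more at sqrt(min*max) when `max_size` is set, and two boxes per listed `aspect_ratio` when `flip: true`); the helper name `priors_per_cell` is mine:

```python
def priors_per_cell(aspect_ratios, has_max_size=True, flip=True):
    """Priors generated per feature-map location by one PriorBox layer."""
    n = 1 + (1 if has_max_size else 0)            # aspect-ratio-1 boxes
    n += len(aspect_ratios) * (2 if flip else 1)  # r and 1/r per listed ratio
    return n

NUM_CLASSES = 2  # from multibox_loss_param above

# (aspect_ratio list, loc num_output, conf num_output) for ctx_output2..4
for ars, loc, conf in [([2, 3], 24, 12), ([2, 3], 24, 12), ([2], 16, 8)]:
    k = priors_per_cell(ars)
    assert loc == 4 * k             # 4 box coordinates per prior
    assert conf == NUM_CLASSES * k  # one score per class per prior
```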
16:38:06.003993 13573 net.cpp:542] data_data_0_split -> data_data_0_split_2 I1106 16:38:06.003996 13573 net.cpp:542] data_data_0_split -> data_data_0_split_3 I1106 16:38:06.003999 13573 net.cpp:542] data_data_0_split -> data_data_0_split_4 I1106 16:38:06.004122 13573 net.cpp:260] Setting up data_data_0_split I1106 16:38:06.004125 13573 net.cpp:267] TRAIN Top shape for layer 1 'data_data_0_split' 1 3 320 768 (737280) I1106 16:38:06.004128 13573 net.cpp:267] TRAIN Top shape for layer 1 'data_data_0_split' 1 3 320 768 (737280) I1106 16:38:06.004132 13573 net.cpp:267] TRAIN Top shape for layer 1 'data_data_0_split' 1 3 320 768 (737280) I1106 16:38:06.004133 13573 net.cpp:267] TRAIN Top shape for layer 1 'data_data_0_split' 1 3 320 768 (737280) I1106 16:38:06.004135 13573 net.cpp:267] TRAIN Top shape for layer 1 'data_data_0_split' 1 3 320 768 (737280) I1106 16:38:06.004138 13573 layer_factory.hpp:172] Creating layer 'data/bias' of type 'Bias' I1106 16:38:06.004142 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:06.004158 13573 net.cpp:200] Created Layer data/bias (2) I1106 16:38:06.004159 13573 net.cpp:572] data/bias <- data_data_0_split_0 I1106 16:38:06.004163 13573 net.cpp:542] data/bias -> data/bias I1106 16:38:06.004429 13573 net.cpp:260] Setting up data/bias I1106 16:38:06.004436 13573 net.cpp:267] TRAIN Top shape for layer 2 'data/bias' 1 3 320 768 (737280) I1106 16:38:06.004460 13573 layer_factory.hpp:172] Creating layer 'conv1a' of type 'Convolution' I1106 16:38:06.004464 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:06.004494 13573 net.cpp:200] Created Layer conv1a (3) I1106 16:38:06.004496 13573 net.cpp:572] conv1a <- data/bias I1106 16:38:06.004498 13573 net.cpp:542] conv1a -> conv1a I1106 16:38:06.005695 13592 data_layer.cpp:105] [0] Parser threads: 4 I1106 16:38:06.005702 13592 data_layer.cpp:107] [0] Transformer threads: 4 I1106 
16:38:07.198096 13573 net.cpp:260] Setting up conv1a I1106 16:38:07.198149 13573 net.cpp:267] TRAIN Top shape for layer 3 'conv1a' 1 32 160 384 (1966080) I1106 16:38:07.198161 13573 layer_factory.hpp:172] Creating layer 'conv1a/bn' of type 'BatchNorm' I1106 16:38:07.198165 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.198180 13573 net.cpp:200] Created Layer conv1a/bn (4) I1106 16:38:07.198184 13573 net.cpp:572] conv1a/bn <- conv1a I1106 16:38:07.198189 13573 net.cpp:527] conv1a/bn -> conv1a (in-place) I1106 16:38:07.198463 13573 net.cpp:260] Setting up conv1a/bn I1106 16:38:07.198469 13573 net.cpp:267] TRAIN Top shape for layer 4 'conv1a/bn' 1 32 160 384 (1966080) I1106 16:38:07.198477 13573 layer_factory.hpp:172] Creating layer 'conv1a/relu' of type 'ReLU' I1106 16:38:07.198479 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.198484 13573 net.cpp:200] Created Layer conv1a/relu (5) I1106 16:38:07.198487 13573 net.cpp:572] conv1a/relu <- conv1a I1106 16:38:07.198488 13573 net.cpp:527] conv1a/relu -> conv1a (in-place) I1106 16:38:07.198498 13573 net.cpp:260] Setting up conv1a/relu I1106 16:38:07.198500 13573 net.cpp:267] TRAIN Top shape for layer 5 'conv1a/relu' 1 32 160 384 (1966080) I1106 16:38:07.198535 13573 layer_factory.hpp:172] Creating layer 'conv1b' of type 'Convolution' I1106 16:38:07.198539 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.198547 13573 net.cpp:200] Created Layer conv1b (6) I1106 16:38:07.198550 13573 net.cpp:572] conv1b <- conv1a I1106 16:38:07.198552 13573 net.cpp:542] conv1b -> conv1b I1106 16:38:07.199275 13573 net.cpp:260] Setting up conv1b I1106 16:38:07.199282 13573 net.cpp:267] TRAIN Top shape for layer 6 'conv1b' 1 32 160 384 (1966080) I1106 16:38:07.199288 13573 layer_factory.hpp:172] Creating layer 'conv1b/bn' of type 'BatchNorm' I1106 
16:38:07.199291 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.199295 13573 net.cpp:200] Created Layer conv1b/bn (7) I1106 16:38:07.199297 13573 net.cpp:572] conv1b/bn <- conv1b I1106 16:38:07.199301 13573 net.cpp:527] conv1b/bn -> conv1b (in-place) I1106 16:38:07.199580 13573 net.cpp:260] Setting up conv1b/bn I1106 16:38:07.199584 13573 net.cpp:267] TRAIN Top shape for layer 7 'conv1b/bn' 1 32 160 384 (1966080) I1106 16:38:07.199590 13573 layer_factory.hpp:172] Creating layer 'conv1b/relu' of type 'ReLU' I1106 16:38:07.199592 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.199596 13573 net.cpp:200] Created Layer conv1b/relu (8) I1106 16:38:07.199599 13573 net.cpp:572] conv1b/relu <- conv1b I1106 16:38:07.199600 13573 net.cpp:527] conv1b/relu -> conv1b (in-place) I1106 16:38:07.199604 13573 net.cpp:260] Setting up conv1b/relu I1106 16:38:07.199606 13573 net.cpp:267] TRAIN Top shape for layer 8 'conv1b/relu' 1 32 160 384 (1966080) I1106 16:38:07.199609 13573 layer_factory.hpp:172] Creating layer 'pool1' of type 'Pooling' I1106 16:38:07.199610 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.199616 13573 net.cpp:200] Created Layer pool1 (9) I1106 16:38:07.199635 13573 net.cpp:572] pool1 <- conv1b I1106 16:38:07.199638 13573 net.cpp:542] pool1 -> pool1 I1106 16:38:07.199702 13573 net.cpp:260] Setting up pool1 I1106 16:38:07.199723 13573 net.cpp:267] TRAIN Top shape for layer 9 'pool1' 1 32 80 192 (491520) I1106 16:38:07.199726 13573 layer_factory.hpp:172] Creating layer 'res2a_branch2a' of type 'Convolution' I1106 16:38:07.199728 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.199748 13573 net.cpp:200] Created Layer res2a_branch2a (10) I1106 16:38:07.199750 13573 net.cpp:572] res2a_branch2a <- pool1 I1106 16:38:07.199769 
13573 net.cpp:542] res2a_branch2a -> res2a_branch2a I1106 16:38:07.200603 13573 net.cpp:260] Setting up res2a_branch2a I1106 16:38:07.200611 13573 net.cpp:267] TRAIN Top shape for layer 10 'res2a_branch2a' 1 64 80 192 (983040) I1106 16:38:07.200618 13573 layer_factory.hpp:172] Creating layer 'res2a_branch2a/bn' of type 'BatchNorm' I1106 16:38:07.200620 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.200625 13573 net.cpp:200] Created Layer res2a_branch2a/bn (11) I1106 16:38:07.200628 13573 net.cpp:572] res2a_branch2a/bn <- res2a_branch2a I1106 16:38:07.200630 13573 net.cpp:527] res2a_branch2a/bn -> res2a_branch2a (in-place) I1106 16:38:07.200882 13573 net.cpp:260] Setting up res2a_branch2a/bn I1106 16:38:07.200887 13573 net.cpp:267] TRAIN Top shape for layer 11 'res2a_branch2a/bn' 1 64 80 192 (983040) I1106 16:38:07.200893 13573 layer_factory.hpp:172] Creating layer 'res2a_branch2a/relu' of type 'ReLU' I1106 16:38:07.200896 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.200898 13573 net.cpp:200] Created Layer res2a_branch2a/relu (12) I1106 16:38:07.200901 13573 net.cpp:572] res2a_branch2a/relu <- res2a_branch2a I1106 16:38:07.200902 13573 net.cpp:527] res2a_branch2a/relu -> res2a_branch2a (in-place) I1106 16:38:07.200906 13573 net.cpp:260] Setting up res2a_branch2a/relu I1106 16:38:07.200908 13573 net.cpp:267] TRAIN Top shape for layer 12 'res2a_branch2a/relu' 1 64 80 192 (983040) I1106 16:38:07.200919 13573 layer_factory.hpp:172] Creating layer 'res2a_branch2b' of type 'Convolution' I1106 16:38:07.200938 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.200944 13573 net.cpp:200] Created Layer res2a_branch2b (13) I1106 16:38:07.200947 13573 net.cpp:572] res2a_branch2b <- res2a_branch2a I1106 16:38:07.200950 13573 net.cpp:542] res2a_branch2b -> res2a_branch2b I1106 16:38:07.201198 
13573 net.cpp:260] Setting up res2a_branch2b I1106 16:38:07.201203 13573 net.cpp:267] TRAIN Top shape for layer 13 'res2a_branch2b' 1 64 80 192 (983040) I1106 16:38:07.201207 13573 layer_factory.hpp:172] Creating layer 'res2a_branch2b/bn' of type 'BatchNorm' I1106 16:38:07.201210 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.201215 13573 net.cpp:200] Created Layer res2a_branch2b/bn (14) I1106 16:38:07.201216 13573 net.cpp:572] res2a_branch2b/bn <- res2a_branch2b I1106 16:38:07.201218 13573 net.cpp:527] res2a_branch2b/bn -> res2a_branch2b (in-place) I1106 16:38:07.201506 13573 net.cpp:260] Setting up res2a_branch2b/bn I1106 16:38:07.201526 13573 net.cpp:267] TRAIN Top shape for layer 14 'res2a_branch2b/bn' 1 64 80 192 (983040) I1106 16:38:07.201531 13573 layer_factory.hpp:172] Creating layer 'res2a_branch2b/relu' of type 'ReLU' I1106 16:38:07.201534 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.201555 13573 net.cpp:200] Created Layer res2a_branch2b/relu (15) I1106 16:38:07.201556 13573 net.cpp:572] res2a_branch2b/relu <- res2a_branch2b I1106 16:38:07.201558 13573 net.cpp:527] res2a_branch2b/relu -> res2a_branch2b (in-place) I1106 16:38:07.201561 13573 net.cpp:260] Setting up res2a_branch2b/relu I1106 16:38:07.201566 13573 net.cpp:267] TRAIN Top shape for layer 15 'res2a_branch2b/relu' 1 64 80 192 (983040) I1106 16:38:07.201570 13573 layer_factory.hpp:172] Creating layer 'pool2' of type 'Pooling' I1106 16:38:07.201572 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.201576 13573 net.cpp:200] Created Layer pool2 (16) I1106 16:38:07.201580 13573 net.cpp:572] pool2 <- res2a_branch2b I1106 16:38:07.201581 13573 net.cpp:542] pool2 -> pool2 I1106 16:38:07.201609 13573 net.cpp:260] Setting up pool2 I1106 16:38:07.201612 13573 net.cpp:267] TRAIN Top shape for layer 16 'pool2' 1 64 40 96 
(245760) I1106 16:38:07.201615 13573 layer_factory.hpp:172] Creating layer 'res3a_branch2a' of type 'Convolution' I1106 16:38:07.201617 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.201629 13573 net.cpp:200] Created Layer res3a_branch2a (17) I1106 16:38:07.201633 13573 net.cpp:572] res3a_branch2a <- pool2 I1106 16:38:07.201635 13573 net.cpp:542] res3a_branch2a -> res3a_branch2a I1106 16:38:07.202277 13573 net.cpp:260] Setting up res3a_branch2a I1106 16:38:07.202283 13573 net.cpp:267] TRAIN Top shape for layer 17 'res3a_branch2a' 1 128 40 96 (491520) I1106 16:38:07.202287 13573 layer_factory.hpp:172] Creating layer 'res3a_branch2a/bn' of type 'BatchNorm' I1106 16:38:07.202306 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.202311 13573 net.cpp:200] Created Layer res3a_branch2a/bn (18) I1106 16:38:07.202313 13573 net.cpp:572] res3a_branch2a/bn <- res3a_branch2a I1106 16:38:07.202316 13573 net.cpp:527] res3a_branch2a/bn -> res3a_branch2a (in-place) I1106 16:38:07.202502 13573 net.cpp:260] Setting up res3a_branch2a/bn I1106 16:38:07.202507 13573 net.cpp:267] TRAIN Top shape for layer 18 'res3a_branch2a/bn' 1 128 40 96 (491520) I1106 16:38:07.202530 13573 layer_factory.hpp:172] Creating layer 'res3a_branch2a/relu' of type 'ReLU' I1106 16:38:07.202533 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.202538 13573 net.cpp:200] Created Layer res3a_branch2a/relu (19) I1106 16:38:07.202548 13573 net.cpp:572] res3a_branch2a/relu <- res3a_branch2a I1106 16:38:07.202550 13573 net.cpp:527] res3a_branch2a/relu -> res3a_branch2a (in-place) I1106 16:38:07.202554 13573 net.cpp:260] Setting up res3a_branch2a/relu I1106 16:38:07.202558 13573 net.cpp:267] TRAIN Top shape for layer 19 'res3a_branch2a/relu' 1 128 40 96 (491520) I1106 16:38:07.202561 13573 layer_factory.hpp:172] Creating layer 
'res3a_branch2b' of type 'Convolution' I1106 16:38:07.202564 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.202572 13573 net.cpp:200] Created Layer res3a_branch2b (20) I1106 16:38:07.202574 13573 net.cpp:572] res3a_branch2b <- res3a_branch2a I1106 16:38:07.202577 13573 net.cpp:542] res3a_branch2b -> res3a_branch2b I1106 16:38:07.202973 13573 net.cpp:260] Setting up res3a_branch2b I1106 16:38:07.202980 13573 net.cpp:267] TRAIN Top shape for layer 20 'res3a_branch2b' 1 128 40 96 (491520) I1106 16:38:07.202984 13573 layer_factory.hpp:172] Creating layer 'res3a_branch2b/bn' of type 'BatchNorm' I1106 16:38:07.203002 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.203007 13573 net.cpp:200] Created Layer res3a_branch2b/bn (21) I1106 16:38:07.203011 13573 net.cpp:572] res3a_branch2b/bn <- res3a_branch2b I1106 16:38:07.203013 13573 net.cpp:527] res3a_branch2b/bn -> res3a_branch2b (in-place) I1106 16:38:07.203199 13573 net.cpp:260] Setting up res3a_branch2b/bn I1106 16:38:07.203204 13573 net.cpp:267] TRAIN Top shape for layer 21 'res3a_branch2b/bn' 1 128 40 96 (491520) I1106 16:38:07.203225 13573 layer_factory.hpp:172] Creating layer 'res3a_branch2b/relu' of type 'ReLU' I1106 16:38:07.203228 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.203233 13573 net.cpp:200] Created Layer res3a_branch2b/relu (22) I1106 16:38:07.203235 13573 net.cpp:572] res3a_branch2b/relu <- res3a_branch2b I1106 16:38:07.203238 13573 net.cpp:527] res3a_branch2b/relu -> res3a_branch2b (in-place) I1106 16:38:07.203241 13573 net.cpp:260] Setting up res3a_branch2b/relu I1106 16:38:07.203246 13573 net.cpp:267] TRAIN Top shape for layer 22 'res3a_branch2b/relu' 1 128 40 96 (491520) I1106 16:38:07.203248 13573 layer_factory.hpp:172] Creating layer 'pool3' of type 'Pooling' I1106 16:38:07.203250 13573 layer_factory.hpp:184] 
Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.203254 13573 net.cpp:200] Created Layer pool3 (23) I1106 16:38:07.203258 13573 net.cpp:572] pool3 <- res3a_branch2b I1106 16:38:07.203259 13573 net.cpp:542] pool3 -> pool3 I1106 16:38:07.203289 13573 net.cpp:260] Setting up pool3 I1106 16:38:07.203292 13573 net.cpp:267] TRAIN Top shape for layer 23 'pool3' 1 128 20 48 (122880) I1106 16:38:07.203296 13573 layer_factory.hpp:172] Creating layer 'res4a_branch2a' of type 'Convolution' I1106 16:38:07.203299 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.203308 13573 net.cpp:200] Created Layer res4a_branch2a (24) I1106 16:38:07.203311 13573 net.cpp:572] res4a_branch2a <- pool3 I1106 16:38:07.203315 13573 net.cpp:542] res4a_branch2a -> res4a_branch2a I1106 16:38:07.205920 13573 net.cpp:260] Setting up res4a_branch2a I1106 16:38:07.205932 13573 net.cpp:267] TRAIN Top shape for layer 24 'res4a_branch2a' 1 256 20 48 (245760) I1106 16:38:07.205953 13573 layer_factory.hpp:172] Creating layer 'res4a_branch2a/bn' of type 'BatchNorm' I1106 16:38:07.205957 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.205965 13573 net.cpp:200] Created Layer res4a_branch2a/bn (25) I1106 16:38:07.205968 13573 net.cpp:572] res4a_branch2a/bn <- res4a_branch2a I1106 16:38:07.205971 13573 net.cpp:527] res4a_branch2a/bn -> res4a_branch2a (in-place) I1106 16:38:07.206190 13573 net.cpp:260] Setting up res4a_branch2a/bn I1106 16:38:07.206197 13573 net.cpp:267] TRAIN Top shape for layer 25 'res4a_branch2a/bn' 1 256 20 48 (245760) I1106 16:38:07.206230 13573 layer_factory.hpp:172] Creating layer 'res4a_branch2a/relu' of type 'ReLU' I1106 16:38:07.206234 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.206239 13573 net.cpp:200] Created Layer res4a_branch2a/relu (26) I1106 16:38:07.206241 13573 
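The `TRAIN Top shape` sizes in this trace follow directly from the 768x320 input and the stride-2 stages listed in `config_param.stride_list`. A small sketch (the `downsample` helper and its ceil-rounding assumption are mine; Caffe pooling rounds output sizes up by default):

```python
def downsample(hw, times):
    """Apply `times` stride-2 stages with ceil rounding to an (H, W) size."""
    h, w = hw
    for _ in range(times):
        h, w = (h + 1) // 2, (w + 1) // 2
    return h, w

assert downsample((320, 768), 1) == (160, 384)  # conv1a
assert downsample((320, 768), 2) == (80, 192)   # pool1
assert downsample((320, 768), 4) == (20, 48)    # pool3 -> res4a_branch2a
# element counts printed in parentheses, e.g. "1 64 80 192 (983040)":
assert 1 * 64 * 80 * 192 == 983040
```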
net.cpp:572] res4a_branch2a/relu <- res4a_branch2a I1106 16:38:07.206244 13573 net.cpp:527] res4a_branch2a/relu -> res4a_branch2a (in-place) I1106 16:38:07.206248 13573 net.cpp:260] Setting up res4a_branch2a/relu I1106 16:38:07.206254 13573 net.cpp:267] TRAIN Top shape for layer 26 'res4a_branch2a/relu' 1 256 20 48 (245760) I1106 16:38:07.206256 13573 layer_factory.hpp:172] Creating layer 'res4a_branch2b' of type 'Convolution' I1106 16:38:07.206257 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.206267 13573 net.cpp:200] Created Layer res4a_branch2b (27) I1106 16:38:07.206270 13573 net.cpp:572] res4a_branch2b <- res4a_branch2a I1106 16:38:07.206274 13573 net.cpp:542] res4a_branch2b -> res4a_branch2b I1106 16:38:07.207459 13573 net.cpp:260] Setting up res4a_branch2b I1106 16:38:07.207478 13573 net.cpp:267] TRAIN Top shape for layer 27 'res4a_branch2b' 1 256 20 48 (245760) I1106 16:38:07.207489 13573 layer_factory.hpp:172] Creating layer 'res4a_branch2b/bn' of type 'BatchNorm' I1106 16:38:07.207495 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.207504 13573 net.cpp:200] Created Layer res4a_branch2b/bn (28) I1106 16:38:07.207511 13573 net.cpp:572] res4a_branch2b/bn <- res4a_branch2b I1106 16:38:07.207518 13573 net.cpp:527] res4a_branch2b/bn -> res4a_branch2b (in-place) I1106 16:38:07.207718 13573 net.cpp:260] Setting up res4a_branch2b/bn I1106 16:38:07.207731 13573 net.cpp:267] TRAIN Top shape for layer 28 'res4a_branch2b/bn' 1 256 20 48 (245760) I1106 16:38:07.207741 13573 layer_factory.hpp:172] Creating layer 'res4a_branch2b/relu' of type 'ReLU' I1106 16:38:07.207748 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.207757 13573 net.cpp:200] Created Layer res4a_branch2b/relu (29) I1106 16:38:07.207763 13573 net.cpp:572] res4a_branch2b/relu <- res4a_branch2b I1106 16:38:07.207770 13573 
net.cpp:527] res4a_branch2b/relu -> res4a_branch2b (in-place) I1106 16:38:07.207778 13573 net.cpp:260] Setting up res4a_branch2b/relu I1106 16:38:07.207785 13573 net.cpp:267] TRAIN Top shape for layer 29 'res4a_branch2b/relu' 1 256 20 48 (245760) I1106 16:38:07.207792 13573 layer_factory.hpp:172] Creating layer 'res4a_branch2b_res4a_branch2b/relu_0_split' of type 'Split' I1106 16:38:07.207798 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.207806 13573 net.cpp:200] Created Layer res4a_branch2b_res4a_branch2b/relu_0_split (30) I1106 16:38:07.207813 13573 net.cpp:572] res4a_branch2b_res4a_branch2b/relu_0_split <- res4a_branch2b I1106 16:38:07.207819 13573 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_0 I1106 16:38:07.207828 13573 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_1 I1106 16:38:07.207855 13573 net.cpp:260] Setting up res4a_branch2b_res4a_branch2b/relu_0_split I1106 16:38:07.207865 13573 net.cpp:267] TRAIN Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 1 256 20 48 (245760) I1106 16:38:07.207872 13573 net.cpp:267] TRAIN Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 1 256 20 48 (245760) I1106 16:38:07.207878 13573 layer_factory.hpp:172] Creating layer 'pool4' of type 'Pooling' I1106 16:38:07.207885 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.207895 13573 net.cpp:200] Created Layer pool4 (31) I1106 16:38:07.207902 13573 net.cpp:572] pool4 <- res4a_branch2b_res4a_branch2b/relu_0_split_0 I1106 16:38:07.207912 13573 net.cpp:542] pool4 -> pool4 I1106 16:38:07.207952 13573 net.cpp:260] Setting up pool4 I1106 16:38:07.207962 13573 net.cpp:267] TRAIN Top shape for layer 31 'pool4' 1 256 10 24 (61440) I1106 16:38:07.207969 13573 layer_factory.hpp:172] Creating layer 'res5a_branch2a' of type 
'Convolution' I1106 16:38:07.207975 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.207988 13573 net.cpp:200] Created Layer res5a_branch2a (32) I1106 16:38:07.207996 13573 net.cpp:572] res5a_branch2a <- pool4 I1106 16:38:07.208003 13573 net.cpp:542] res5a_branch2a -> res5a_branch2a I1106 16:38:07.218127 13573 net.cpp:260] Setting up res5a_branch2a I1106 16:38:07.218340 13573 net.cpp:267] TRAIN Top shape for layer 32 'res5a_branch2a' 1 512 10 24 (122880) I1106 16:38:07.218380 13573 layer_factory.hpp:172] Creating layer 'res5a_branch2a/bn' of type 'BatchNorm' I1106 16:38:07.218400 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.218444 13573 net.cpp:200] Created Layer res5a_branch2a/bn (33) I1106 16:38:07.218454 13573 net.cpp:572] res5a_branch2a/bn <- res5a_branch2a I1106 16:38:07.218466 13573 net.cpp:527] res5a_branch2a/bn -> res5a_branch2a (in-place) I1106 16:38:07.218773 13573 net.cpp:260] Setting up res5a_branch2a/bn I1106 16:38:07.218787 13573 net.cpp:267] TRAIN Top shape for layer 33 'res5a_branch2a/bn' 1 512 10 24 (122880) I1106 16:38:07.218799 13573 layer_factory.hpp:172] Creating layer 'res5a_branch2a/relu' of type 'ReLU' I1106 16:38:07.218809 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.218825 13573 net.cpp:200] Created Layer res5a_branch2a/relu (34) I1106 16:38:07.218832 13573 net.cpp:572] res5a_branch2a/relu <- res5a_branch2a I1106 16:38:07.218839 13573 net.cpp:527] res5a_branch2a/relu -> res5a_branch2a (in-place) I1106 16:38:07.218852 13573 net.cpp:260] Setting up res5a_branch2a/relu I1106 16:38:07.218859 13573 net.cpp:267] TRAIN Top shape for layer 34 'res5a_branch2a/relu' 1 512 10 24 (122880) I1106 16:38:07.218866 13573 layer_factory.hpp:172] Creating layer 'res5a_branch2b' of type 'Convolution' I1106 16:38:07.218875 13573 layer_factory.hpp:184] Layer's types are 
Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.218930 13573 net.cpp:200] Created Layer res5a_branch2b (35) I1106 16:38:07.218938 13573 net.cpp:572] res5a_branch2b <- res5a_branch2a I1106 16:38:07.218945 13573 net.cpp:542] res5a_branch2b -> res5a_branch2b I1106 16:38:07.224325 13573 net.cpp:260] Setting up res5a_branch2b I1106 16:38:07.224345 13573 net.cpp:267] TRAIN Top shape for layer 35 'res5a_branch2b' 1 512 10 24 (122880) I1106 16:38:07.224356 13573 layer_factory.hpp:172] Creating layer 'res5a_branch2b/bn' of type 'BatchNorm' I1106 16:38:07.224360 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.224370 13573 net.cpp:200] Created Layer res5a_branch2b/bn (36) I1106 16:38:07.224375 13573 net.cpp:572] res5a_branch2b/bn <- res5a_branch2b I1106 16:38:07.224378 13573 net.cpp:527] res5a_branch2b/bn -> res5a_branch2b (in-place) I1106 16:38:07.224583 13573 net.cpp:260] Setting up res5a_branch2b/bn I1106 16:38:07.224597 13573 net.cpp:267] TRAIN Top shape for layer 36 'res5a_branch2b/bn' 1 512 10 24 (122880) I1106 16:38:07.224607 13573 layer_factory.hpp:172] Creating layer 'res5a_branch2b/relu' of type 'ReLU' I1106 16:38:07.224615 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.224623 13573 net.cpp:200] Created Layer res5a_branch2b/relu (37) I1106 16:38:07.224628 13573 net.cpp:572] res5a_branch2b/relu <- res5a_branch2b I1106 16:38:07.224633 13573 net.cpp:527] res5a_branch2b/relu -> res5a_branch2b (in-place) I1106 16:38:07.224642 13573 net.cpp:260] Setting up res5a_branch2b/relu I1106 16:38:07.224649 13573 net.cpp:267] TRAIN Top shape for layer 37 'res5a_branch2b/relu' 1 512 10 24 (122880) I1106 16:38:07.224668 13573 layer_factory.hpp:172] Creating layer 'res5a_branch2b_res5a_branch2b/relu_0_split' of type 'Split' I1106 16:38:07.224689 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 
16:38:07.224696 13573 net.cpp:200] Created Layer res5a_branch2b_res5a_branch2b/relu_0_split (38)
I1106 16:38:07.224704 13573 net.cpp:572] res5a_branch2b_res5a_branch2b/relu_0_split <- res5a_branch2b
I1106 16:38:07.224711 13573 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> res5a_branch2b_res5a_branch2b/relu_0_split_0
I1106 16:38:07.224720 13573 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> res5a_branch2b_res5a_branch2b/relu_0_split_1
I1106 16:38:07.224745 13573 net.cpp:260] Setting up res5a_branch2b_res5a_branch2b/relu_0_split
I1106 16:38:07.224750 13573 net.cpp:267] TRAIN Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 1 512 10 24 (122880)
I1106 16:38:07.224753 13573 net.cpp:267] TRAIN Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 1 512 10 24 (122880)
I1106 16:38:07.224756 13573 layer_factory.hpp:172] Creating layer 'pool6' of type 'Pooling'
I1106 16:38:07.224758 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.224768 13573 net.cpp:200] Created Layer pool6 (39)
I1106 16:38:07.224771 13573 net.cpp:572] pool6 <- res5a_branch2b_res5a_branch2b/relu_0_split_0
I1106 16:38:07.224773 13573 net.cpp:542] pool6 -> pool6
I1106 16:38:07.224807 13573 net.cpp:260] Setting up pool6
I1106 16:38:07.224817 13573 net.cpp:267] TRAIN Top shape for layer 39 'pool6' 1 512 5 12 (30720)
I1106 16:38:07.224823 13573 layer_factory.hpp:172] Creating layer 'pool6_pool6_0_split' of type 'Split'
I1106 16:38:07.224828 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.224834 13573 net.cpp:200] Created Layer pool6_pool6_0_split (40)
I1106 16:38:07.224840 13573 net.cpp:572] pool6_pool6_0_split <- pool6
I1106 16:38:07.224846 13573 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_0
I1106 16:38:07.224853 13573 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_1
I1106 16:38:07.224879 13573 net.cpp:260] Setting up pool6_pool6_0_split
I1106 16:38:07.224882 13573 net.cpp:267] TRAIN Top shape for layer 40 'pool6_pool6_0_split' 1 512 5 12 (30720)
I1106 16:38:07.224885 13573 net.cpp:267] TRAIN Top shape for layer 40 'pool6_pool6_0_split' 1 512 5 12 (30720)
I1106 16:38:07.224887 13573 layer_factory.hpp:172] Creating layer 'pool7' of type 'Pooling'
I1106 16:38:07.224889 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.224894 13573 net.cpp:200] Created Layer pool7 (41)
I1106 16:38:07.224896 13573 net.cpp:572] pool7 <- pool6_pool6_0_split_0
I1106 16:38:07.224898 13573 net.cpp:542] pool7 -> pool7
I1106 16:38:07.224928 13573 net.cpp:260] Setting up pool7
I1106 16:38:07.224938 13573 net.cpp:267] TRAIN Top shape for layer 41 'pool7' 1 512 3 6 (9216)
I1106 16:38:07.224942 13573 layer_factory.hpp:172] Creating layer 'pool7_pool7_0_split' of type 'Split'
I1106 16:38:07.224944 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.224946 13573 net.cpp:200] Created Layer pool7_pool7_0_split (42)
I1106 16:38:07.224948 13573 net.cpp:572] pool7_pool7_0_split <- pool7
I1106 16:38:07.224951 13573 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_0
I1106 16:38:07.224959 13573 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_1
I1106 16:38:07.224979 13573 net.cpp:260] Setting up pool7_pool7_0_split
I1106 16:38:07.224983 13573 net.cpp:267] TRAIN Top shape for layer 42 'pool7_pool7_0_split' 1 512 3 6 (9216)
I1106 16:38:07.224992 13573 net.cpp:267] TRAIN Top shape for layer 42 'pool7_pool7_0_split' 1 512 3 6 (9216)
I1106 16:38:07.224995 13573 layer_factory.hpp:172] Creating layer 'pool8' of type 'Pooling'
I1106 16:38:07.224997 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.225003 13573 net.cpp:200] Created Layer pool8 (43)
I1106 16:38:07.225009 13573 net.cpp:572] pool8 <- pool7_pool7_0_split_0
I1106 16:38:07.225018 13573 net.cpp:542] pool8 -> pool8
I1106 16:38:07.225047 13573 net.cpp:260] Setting up pool8
I1106 16:38:07.225051 13573 net.cpp:267] TRAIN Top shape for layer 43 'pool8' 1 512 2 3 (3072)
I1106 16:38:07.225060 13573 layer_factory.hpp:172] Creating layer 'ctx_output1' of type 'Convolution'
I1106 16:38:07.225064 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.225072 13573 net.cpp:200] Created Layer ctx_output1 (44)
I1106 16:38:07.225076 13573 net.cpp:572] ctx_output1 <- res4a_branch2b_res4a_branch2b/relu_0_split_1
I1106 16:38:07.225085 13573 net.cpp:542] ctx_output1 -> ctx_output1
I1106 16:38:07.225709 13573 net.cpp:260] Setting up ctx_output1
I1106 16:38:07.225716 13573 net.cpp:267] TRAIN Top shape for layer 44 'ctx_output1' 1 256 20 48 (245760)
I1106 16:38:07.225720 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu' of type 'ReLU'
I1106 16:38:07.225723 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.225733 13573 net.cpp:200] Created Layer ctx_output1/relu (45)
I1106 16:38:07.225740 13573 net.cpp:572] ctx_output1/relu <- ctx_output1
I1106 16:38:07.225744 13573 net.cpp:527] ctx_output1/relu -> ctx_output1 (in-place)
I1106 16:38:07.225749 13573 net.cpp:260] Setting up ctx_output1/relu
I1106 16:38:07.225756 13573 net.cpp:267] TRAIN Top shape for layer 45 'ctx_output1/relu' 1 256 20 48 (245760)
I1106 16:38:07.225760 13573 layer_factory.hpp:172] Creating layer 'ctx_output1_ctx_output1/relu_0_split' of type 'Split'
I1106 16:38:07.225762 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.225766 13573 net.cpp:200] Created Layer ctx_output1_ctx_output1/relu_0_split (46)
I1106 16:38:07.225769 13573 net.cpp:572] ctx_output1_ctx_output1/relu_0_split <- ctx_output1
I1106 16:38:07.225770 13573 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_0
I1106 16:38:07.225778 13573 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_1
I1106 16:38:07.225782 13573 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_2
I1106 16:38:07.225814 13573 net.cpp:260] Setting up ctx_output1_ctx_output1/relu_0_split
I1106 16:38:07.225818 13573 net.cpp:267] TRAIN Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 1 256 20 48 (245760)
I1106 16:38:07.225821 13573 net.cpp:267] TRAIN Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 1 256 20 48 (245760)
I1106 16:38:07.225824 13573 net.cpp:267] TRAIN Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 1 256 20 48 (245760)
I1106 16:38:07.225826 13573 layer_factory.hpp:172] Creating layer 'ctx_output2' of type 'Convolution'
I1106 16:38:07.225828 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.225838 13573 net.cpp:200] Created Layer ctx_output2 (47)
I1106 16:38:07.225841 13573 net.cpp:572] ctx_output2 <- res5a_branch2b_res5a_branch2b/relu_0_split_1
I1106 16:38:07.225844 13573 net.cpp:542] ctx_output2 -> ctx_output2
I1106 16:38:07.226903 13573 net.cpp:260] Setting up ctx_output2
I1106 16:38:07.226910 13573 net.cpp:267] TRAIN Top shape for layer 47 'ctx_output2' 1 256 10 24 (61440)
I1106 16:38:07.226914 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu' of type 'ReLU'
I1106 16:38:07.226918 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.226922 13573 net.cpp:200] Created Layer ctx_output2/relu (48)
I1106 16:38:07.226925 13573 net.cpp:572] ctx_output2/relu <- ctx_output2
I1106 16:38:07.226927 13573 net.cpp:527] ctx_output2/relu -> ctx_output2 (in-place)
I1106 16:38:07.226933 13573 net.cpp:260] Setting up ctx_output2/relu
I1106 16:38:07.226935 13573 net.cpp:267] TRAIN Top shape for layer 48 'ctx_output2/relu' 1 256 10 24 (61440)
I1106 16:38:07.226938 13573 layer_factory.hpp:172] Creating layer 'ctx_output2_ctx_output2/relu_0_split' of type 'Split'
I1106 16:38:07.226953 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.226963 13573 net.cpp:200] Created Layer ctx_output2_ctx_output2/relu_0_split (49)
I1106 16:38:07.226970 13573 net.cpp:572] ctx_output2_ctx_output2/relu_0_split <- ctx_output2
I1106 16:38:07.226974 13573 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_0
I1106 16:38:07.226977 13573 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_1
I1106 16:38:07.226986 13573 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_2
I1106 16:38:07.227018 13573 net.cpp:260] Setting up ctx_output2_ctx_output2/relu_0_split
I1106 16:38:07.227022 13573 net.cpp:267] TRAIN Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 1 256 10 24 (61440)
I1106 16:38:07.227025 13573 net.cpp:267] TRAIN Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 1 256 10 24 (61440)
I1106 16:38:07.227027 13573 net.cpp:267] TRAIN Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 1 256 10 24 (61440)
I1106 16:38:07.227030 13573 layer_factory.hpp:172] Creating layer 'ctx_output3' of type 'Convolution'
I1106 16:38:07.227031 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.227041 13573 net.cpp:200] Created Layer ctx_output3 (50)
I1106 16:38:07.227044 13573 net.cpp:572] ctx_output3 <- pool6_pool6_0_split_1
I1106 16:38:07.227046 13573 net.cpp:542] ctx_output3 -> ctx_output3
I1106 16:38:07.228790 13573 net.cpp:260] Setting up ctx_output3
I1106 16:38:07.228806 13573 net.cpp:267] TRAIN Top shape for layer 50 'ctx_output3' 1 256 5 12 (15360)
I1106 16:38:07.228814 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu' of type 'ReLU'
I1106 16:38:07.228826 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.228830 13573 net.cpp:200] Created Layer ctx_output3/relu (51)
I1106 16:38:07.228833 13573 net.cpp:572] ctx_output3/relu <- ctx_output3
I1106 16:38:07.228837 13573 net.cpp:527] ctx_output3/relu -> ctx_output3 (in-place)
I1106 16:38:07.228847 13573 net.cpp:260] Setting up ctx_output3/relu
I1106 16:38:07.228854 13573 net.cpp:267] TRAIN Top shape for layer 51 'ctx_output3/relu' 1 256 5 12 (15360)
I1106 16:38:07.228860 13573 layer_factory.hpp:172] Creating layer 'ctx_output3_ctx_output3/relu_0_split' of type 'Split'
I1106 16:38:07.228865 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.228873 13573 net.cpp:200] Created Layer ctx_output3_ctx_output3/relu_0_split (52)
I1106 16:38:07.228879 13573 net.cpp:572] ctx_output3_ctx_output3/relu_0_split <- ctx_output3
I1106 16:38:07.228885 13573 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_0
I1106 16:38:07.228893 13573 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_1
I1106 16:38:07.228900 13573 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_2
I1106 16:38:07.228943 13573 net.cpp:260] Setting up ctx_output3_ctx_output3/relu_0_split
I1106 16:38:07.228948 13573 net.cpp:267] TRAIN Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 1 256 5 12 (15360)
I1106 16:38:07.228951 13573 net.cpp:267] TRAIN Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 1 256 5 12 (15360)
I1106 16:38:07.228953 13573 net.cpp:267] TRAIN Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 1 256 5 12 (15360)
I1106 16:38:07.228955 13573 layer_factory.hpp:172] Creating layer 'ctx_output4' of type 'Convolution'
I1106 16:38:07.228958 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.228967 13573 net.cpp:200] Created Layer ctx_output4 (53)
I1106 16:38:07.228969 13573 net.cpp:572] ctx_output4 <- pool7_pool7_0_split_1
I1106 16:38:07.228973 13573 net.cpp:542] ctx_output4 -> ctx_output4
I1106 16:38:07.230093 13573 net.cpp:260] Setting up ctx_output4
I1106 16:38:07.230108 13573 net.cpp:267] TRAIN Top shape for layer 53 'ctx_output4' 1 256 3 6 (4608)
I1106 16:38:07.230115 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu' of type 'ReLU'
I1106 16:38:07.230119 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.230125 13573 net.cpp:200] Created Layer ctx_output4/relu (54)
I1106 16:38:07.230129 13573 net.cpp:572] ctx_output4/relu <- ctx_output4
I1106 16:38:07.230134 13573 net.cpp:527] ctx_output4/relu -> ctx_output4 (in-place)
I1106 16:38:07.230139 13573 net.cpp:260] Setting up ctx_output4/relu
I1106 16:38:07.230144 13573 net.cpp:267] TRAIN Top shape for layer 54 'ctx_output4/relu' 1 256 3 6 (4608)
I1106 16:38:07.230146 13573 layer_factory.hpp:172] Creating layer 'ctx_output4_ctx_output4/relu_0_split' of type 'Split'
I1106 16:38:07.230149 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.230154 13573 net.cpp:200] Created Layer ctx_output4_ctx_output4/relu_0_split (55)
I1106 16:38:07.230156 13573 net.cpp:572] ctx_output4_ctx_output4/relu_0_split <- ctx_output4
I1106 16:38:07.230159 13573 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_0
I1106 16:38:07.230165 13573 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_1
I1106 16:38:07.230167 13573 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_2
I1106 16:38:07.230211 13573 net.cpp:260] Setting up ctx_output4_ctx_output4/relu_0_split
I1106 16:38:07.230216 13573 net.cpp:267] TRAIN Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 1 256 3 6 (4608)
I1106 16:38:07.230218 13573 net.cpp:267] TRAIN Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 1 256 3 6 (4608)
I1106 16:38:07.230221 13573 net.cpp:267] TRAIN Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 1 256 3 6 (4608)
I1106 16:38:07.230224 13573 layer_factory.hpp:172] Creating layer 'ctx_output5' of type 'Convolution'
I1106 16:38:07.230228 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.230240 13573 net.cpp:200] Created Layer ctx_output5 (56)
I1106 16:38:07.230244 13573 net.cpp:572] ctx_output5 <- pool8
I1106 16:38:07.230248 13573 net.cpp:542] ctx_output5 -> ctx_output5
I1106 16:38:07.231377 13573 net.cpp:260] Setting up ctx_output5
I1106 16:38:07.231385 13573 net.cpp:267] TRAIN Top shape for layer 56 'ctx_output5' 1 256 2 3 (1536)
I1106 16:38:07.231390 13573 layer_factory.hpp:172] Creating layer 'ctx_output5/relu' of type 'ReLU'
I1106 16:38:07.231393 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.231397 13573 net.cpp:200] Created Layer ctx_output5/relu (57)
I1106 16:38:07.231400 13573 net.cpp:572] ctx_output5/relu <- ctx_output5
I1106 16:38:07.231403 13573 net.cpp:527] ctx_output5/relu -> ctx_output5 (in-place)
I1106 16:38:07.231420 13573 net.cpp:260] Setting up ctx_output5/relu
I1106 16:38:07.231425 13573 net.cpp:267] TRAIN Top shape for layer 57 'ctx_output5/relu' 1 256 2 3 (1536)
I1106 16:38:07.231427 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc' of type 'Convolution'
I1106 16:38:07.231431 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.231441 13573 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc (58)
I1106 16:38:07.231443 13573 net.cpp:572] ctx_output1/relu_mbox_loc <- ctx_output1_ctx_output1/relu_0_split_0
I1106 16:38:07.231447 13573 net.cpp:542] ctx_output1/relu_mbox_loc -> ctx_output1/relu_mbox_loc
I1106 16:38:07.231636 13573 net.cpp:260] Setting up ctx_output1/relu_mbox_loc
I1106 16:38:07.231642 13573 net.cpp:267] TRAIN Top shape for layer 58 'ctx_output1/relu_mbox_loc' 1 16 20 48 (15360)
I1106 16:38:07.231647 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_perm' of type 'Permute'
I1106 16:38:07.231663 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.231674 13573 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_perm (59)
I1106 16:38:07.231678 13573 net.cpp:572] ctx_output1/relu_mbox_loc_perm <- ctx_output1/relu_mbox_loc
I1106 16:38:07.231693 13573 net.cpp:542] ctx_output1/relu_mbox_loc_perm -> ctx_output1/relu_mbox_loc_perm
I1106 16:38:07.231765 13573 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_perm
I1106 16:38:07.231770 13573 net.cpp:267] TRAIN Top shape for layer 59 'ctx_output1/relu_mbox_loc_perm' 1 20 48 16 (15360)
I1106 16:38:07.231773 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_flat' of type 'Flatten'
I1106 16:38:07.231777 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.231783 13573 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_flat (60)
I1106 16:38:07.231786 13573 net.cpp:572] ctx_output1/relu_mbox_loc_flat <- ctx_output1/relu_mbox_loc_perm
I1106 16:38:07.231789 13573 net.cpp:542] ctx_output1/relu_mbox_loc_flat -> ctx_output1/relu_mbox_loc_flat
I1106 16:38:07.231848 13573 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_flat
I1106 16:38:07.231854 13573 net.cpp:267] TRAIN Top shape for layer 60 'ctx_output1/relu_mbox_loc_flat' 1 15360 (15360)
I1106 16:38:07.231858 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf' of type 'Convolution'
I1106 16:38:07.231860 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.231870 13573 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf (61)
I1106 16:38:07.231874 13573 net.cpp:572] ctx_output1/relu_mbox_conf <- ctx_output1_ctx_output1/relu_0_split_1
I1106 16:38:07.231878 13573 net.cpp:542] ctx_output1/relu_mbox_conf -> ctx_output1/relu_mbox_conf
I1106 16:38:07.232039 13573 net.cpp:260] Setting up ctx_output1/relu_mbox_conf
I1106 16:38:07.232045 13573 net.cpp:267] TRAIN Top shape for layer 61 'ctx_output1/relu_mbox_conf' 1 8 20 48 (7680)
I1106 16:38:07.232050 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_perm' of type 'Permute'
I1106 16:38:07.232053 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.232059 13573 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_perm (62)
I1106 16:38:07.232061 13573 net.cpp:572] ctx_output1/relu_mbox_conf_perm <- ctx_output1/relu_mbox_conf
I1106 16:38:07.232065 13573 net.cpp:542] ctx_output1/relu_mbox_conf_perm -> ctx_output1/relu_mbox_conf_perm
I1106 16:38:07.232122 13573 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_perm
I1106 16:38:07.232128 13573 net.cpp:267] TRAIN Top shape for layer 62 'ctx_output1/relu_mbox_conf_perm' 1 20 48 8 (7680)
I1106 16:38:07.232131 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_flat' of type 'Flatten'
I1106 16:38:07.232134 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.232137 13573 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_flat (63)
I1106 16:38:07.232139 13573 net.cpp:572] ctx_output1/relu_mbox_conf_flat <- ctx_output1/relu_mbox_conf_perm
I1106 16:38:07.232142 13573 net.cpp:542] ctx_output1/relu_mbox_conf_flat -> ctx_output1/relu_mbox_conf_flat
I1106 16:38:07.232179 13573 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_flat
I1106 16:38:07.232184 13573 net.cpp:267] TRAIN Top shape for layer 63 'ctx_output1/relu_mbox_conf_flat' 1 7680 (7680)
I1106 16:38:07.232187 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_priorbox' of type 'PriorBox'
I1106 16:38:07.232190 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.232203 13573 net.cpp:200] Created Layer ctx_output1/relu_mbox_priorbox (64)
I1106 16:38:07.232205 13573 net.cpp:572] ctx_output1/relu_mbox_priorbox <- ctx_output1_ctx_output1/relu_0_split_2
I1106 16:38:07.232208 13573 net.cpp:572] ctx_output1/relu_mbox_priorbox <- data_data_0_split_1
I1106 16:38:07.232219 13573 net.cpp:542] ctx_output1/relu_mbox_priorbox -> ctx_output1/relu_mbox_priorbox
I1106 16:38:07.232236 13573 net.cpp:260] Setting up ctx_output1/relu_mbox_priorbox
I1106 16:38:07.232240 13573 net.cpp:267] TRAIN Top shape for layer 64 'ctx_output1/relu_mbox_priorbox' 1 2 15360 (30720)
I1106 16:38:07.232244 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc' of type 'Convolution'
I1106 16:38:07.232246 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.232259 13573 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc (65)
I1106 16:38:07.232261 13573 net.cpp:572] ctx_output2/relu_mbox_loc <- ctx_output2_ctx_output2/relu_0_split_0
I1106 16:38:07.232265 13573 net.cpp:542] ctx_output2/relu_mbox_loc -> ctx_output2/relu_mbox_loc
I1106 16:38:07.232477 13573 net.cpp:260] Setting up ctx_output2/relu_mbox_loc
I1106 16:38:07.232483 13573 net.cpp:267] TRAIN Top shape for layer 65 'ctx_output2/relu_mbox_loc' 1 24 10 24 (5760)
I1106 16:38:07.232488 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_perm' of type 'Permute'
I1106 16:38:07.232492 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.232498 13573 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_perm (66)
I1106 16:38:07.232501 13573 net.cpp:572] ctx_output2/relu_mbox_loc_perm <- ctx_output2/relu_mbox_loc
I1106 16:38:07.232504 13573 net.cpp:542] ctx_output2/relu_mbox_loc_perm -> ctx_output2/relu_mbox_loc_perm
I1106 16:38:07.232564 13573 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_perm
I1106 16:38:07.232570 13573 net.cpp:267] TRAIN Top shape for layer 66 'ctx_output2/relu_mbox_loc_perm' 1 10 24 24 (5760)
I1106 16:38:07.232573 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_flat' of type 'Flatten'
I1106 16:38:07.232576 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.232579 13573 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_flat (67)
I1106 16:38:07.232581 13573 net.cpp:572] ctx_output2/relu_mbox_loc_flat <- ctx_output2/relu_mbox_loc_perm
I1106 16:38:07.232584 13573 net.cpp:542] ctx_output2/relu_mbox_loc_flat -> ctx_output2/relu_mbox_loc_flat
I1106 16:38:07.232619 13573 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_flat
I1106 16:38:07.232623 13573 net.cpp:267] TRAIN Top shape for layer 67 'ctx_output2/relu_mbox_loc_flat' 1 5760 (5760)
I1106 16:38:07.232626 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf' of type 'Convolution'
I1106 16:38:07.232630 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.232638 13573 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf (68)
I1106 16:38:07.232641 13573 net.cpp:572] ctx_output2/relu_mbox_conf <- ctx_output2_ctx_output2/relu_0_split_1
I1106 16:38:07.232645 13573 net.cpp:542] ctx_output2/relu_mbox_conf -> ctx_output2/relu_mbox_conf
I1106 16:38:07.232807 13573 net.cpp:260] Setting up ctx_output2/relu_mbox_conf
I1106 16:38:07.232813 13573 net.cpp:267] TRAIN Top shape for layer 68 'ctx_output2/relu_mbox_conf' 1 12 10 24 (2880)
I1106 16:38:07.232818 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_perm' of type 'Permute'
I1106 16:38:07.232830 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.232836 13573 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_perm (69)
I1106 16:38:07.232841 13573 net.cpp:572] ctx_output2/relu_mbox_conf_perm <- ctx_output2/relu_mbox_conf
I1106 16:38:07.232843 13573 net.cpp:542] ctx_output2/relu_mbox_conf_perm -> ctx_output2/relu_mbox_conf_perm
I1106 16:38:07.232904 13573 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_perm
I1106 16:38:07.232909 13573 net.cpp:267] TRAIN Top shape for layer 69 'ctx_output2/relu_mbox_conf_perm' 1 10 24 12 (2880)
I1106 16:38:07.232913 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_flat' of type 'Flatten'
I1106 16:38:07.232923 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.232926 13573 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_flat (70)
I1106 16:38:07.232928 13573 net.cpp:572] ctx_output2/relu_mbox_conf_flat <- ctx_output2/relu_mbox_conf_perm
I1106 16:38:07.232933 13573 net.cpp:542] ctx_output2/relu_mbox_conf_flat -> ctx_output2/relu_mbox_conf_flat
I1106 16:38:07.232970 13573 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_flat
I1106 16:38:07.232975 13573 net.cpp:267] TRAIN Top shape for layer 70 'ctx_output2/relu_mbox_conf_flat' 1 2880 (2880)
I1106 16:38:07.232978 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_priorbox' of type 'PriorBox'
I1106 16:38:07.232988 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.232993 13573 net.cpp:200] Created Layer ctx_output2/relu_mbox_priorbox (71)
I1106 16:38:07.233001 13573 net.cpp:572] ctx_output2/relu_mbox_priorbox <- ctx_output2_ctx_output2/relu_0_split_2
I1106 16:38:07.233006 13573 net.cpp:572] ctx_output2/relu_mbox_priorbox <- data_data_0_split_2
I1106 16:38:07.233009 13573 net.cpp:542] ctx_output2/relu_mbox_priorbox -> ctx_output2/relu_mbox_priorbox
I1106 16:38:07.233024 13573 net.cpp:260] Setting up ctx_output2/relu_mbox_priorbox
I1106 16:38:07.233028 13573 net.cpp:267] TRAIN Top shape for layer 71 'ctx_output2/relu_mbox_priorbox' 1 2 5760 (11520)
I1106 16:38:07.233031 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc' of type 'Convolution'
I1106 16:38:07.233033 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.233040 13573 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc (72)
I1106 16:38:07.233043 13573 net.cpp:572] ctx_output3/relu_mbox_loc <- ctx_output3_ctx_output3/relu_0_split_0
I1106 16:38:07.233047 13573 net.cpp:542] ctx_output3/relu_mbox_loc -> ctx_output3/relu_mbox_loc
I1106 16:38:07.233245 13573 net.cpp:260] Setting up ctx_output3/relu_mbox_loc
I1106 16:38:07.233253 13573 net.cpp:267] TRAIN Top shape for layer 72 'ctx_output3/relu_mbox_loc' 1 24 5 12 (1440)
I1106 16:38:07.233258 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_perm' of type 'Permute'
I1106 16:38:07.233260 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.233268 13573 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_perm (73)
I1106 16:38:07.233270 13573 net.cpp:572] ctx_output3/relu_mbox_loc_perm <- ctx_output3/relu_mbox_loc
I1106 16:38:07.233273 13573 net.cpp:542] ctx_output3/relu_mbox_loc_perm -> ctx_output3/relu_mbox_loc_perm
I1106 16:38:07.233330 13573 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_perm
I1106 16:38:07.233335 13573 net.cpp:267] TRAIN Top shape for layer 73 'ctx_output3/relu_mbox_loc_perm' 1 5 12 24 (1440)
I1106 16:38:07.233336 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_flat' of type 'Flatten'
I1106 16:38:07.233347 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.233352 13573 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_flat (74)
I1106 16:38:07.233355 13573 net.cpp:572] ctx_output3/relu_mbox_loc_flat <- ctx_output3/relu_mbox_loc_perm
I1106 16:38:07.233358 13573 net.cpp:542] ctx_output3/relu_mbox_loc_flat -> ctx_output3/relu_mbox_loc_flat
I1106 16:38:07.233398 13573 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_flat
I1106 16:38:07.233402 13573 net.cpp:267] TRAIN Top shape for layer 74 'ctx_output3/relu_mbox_loc_flat' 1 1440 (1440)
I1106 16:38:07.233405 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf' of type 'Convolution'
I1106 16:38:07.233408 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.233417 13573 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf (75)
I1106 16:38:07.233422 13573 net.cpp:572] ctx_output3/relu_mbox_conf <- ctx_output3_ctx_output3/relu_0_split_1
I1106 16:38:07.233431 13573 net.cpp:542] ctx_output3/relu_mbox_conf -> ctx_output3/relu_mbox_conf
I1106 16:38:07.233597 13573 net.cpp:260] Setting up ctx_output3/relu_mbox_conf
I1106 16:38:07.233603 13573 net.cpp:267] TRAIN Top shape for layer 75 'ctx_output3/relu_mbox_conf' 1 12 5 12 (720)
I1106 16:38:07.233606 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_perm' of type 'Permute'
I1106 16:38:07.233616 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.233624 13573 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_perm (76)
I1106 16:38:07.233628 13573 net.cpp:572] ctx_output3/relu_mbox_conf_perm <- ctx_output3/relu_mbox_conf
I1106 16:38:07.233630 13573 net.cpp:542] ctx_output3/relu_mbox_conf_perm -> ctx_output3/relu_mbox_conf_perm
I1106 16:38:07.233691 13573 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_perm
I1106 16:38:07.233697 13573 net.cpp:267] TRAIN Top shape for layer 76 'ctx_output3/relu_mbox_conf_perm' 1 5 12 12 (720)
I1106 16:38:07.233700 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_flat' of type 'Flatten'
I1106 16:38:07.233702 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.233712 13573 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_flat (77)
I1106 16:38:07.233716 13573 net.cpp:572] ctx_output3/relu_mbox_conf_flat <- ctx_output3/relu_mbox_conf_perm
I1106 16:38:07.233719 13573 net.cpp:542] ctx_output3/relu_mbox_conf_flat -> ctx_output3/relu_mbox_conf_flat
I1106 16:38:07.233762 13573 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_flat
I1106 16:38:07.233767 13573 net.cpp:267] TRAIN Top shape for layer 77 'ctx_output3/relu_mbox_conf_flat' 1 720 (720)
I1106 16:38:07.233770 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_priorbox' of type 'PriorBox'
I1106 16:38:07.233772 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.233777 13573 net.cpp:200] Created Layer ctx_output3/relu_mbox_priorbox (78)
I1106 16:38:07.233779 13573 net.cpp:572] ctx_output3/relu_mbox_priorbox <- ctx_output3_ctx_output3/relu_0_split_2
I1106 16:38:07.233783 13573 net.cpp:572] ctx_output3/relu_mbox_priorbox <- data_data_0_split_3
I1106 16:38:07.233786 13573 net.cpp:542] ctx_output3/relu_mbox_priorbox -> ctx_output3/relu_mbox_priorbox
I1106 16:38:07.233800 13573 net.cpp:260] Setting up ctx_output3/relu_mbox_priorbox
I1106 16:38:07.233804 13573 net.cpp:267] TRAIN Top shape for layer 78 'ctx_output3/relu_mbox_priorbox' 1 2 1440 (2880)
I1106 16:38:07.233808 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc' of type 'Convolution'
I1106 16:38:07.233810 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.233819 13573 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc (79)
I1106 16:38:07.233820 13573 net.cpp:572] ctx_output4/relu_mbox_loc <- ctx_output4_ctx_output4/relu_0_split_0
I1106 16:38:07.233824 13573 net.cpp:542] ctx_output4/relu_mbox_loc -> ctx_output4/relu_mbox_loc
I1106 16:38:07.234006 13573 net.cpp:260] Setting up ctx_output4/relu_mbox_loc
I1106 16:38:07.234012 13573 net.cpp:267] TRAIN Top shape for layer 79 'ctx_output4/relu_mbox_loc' 1 16 3 6 (288)
I1106 16:38:07.234017 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_perm' of type 'Permute'
I1106 16:38:07.234020 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.234025 13573 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_perm (80)
I1106 16:38:07.234028 13573 net.cpp:572] ctx_output4/relu_mbox_loc_perm <- ctx_output4/relu_mbox_loc
I1106 16:38:07.234031 13573 net.cpp:542] ctx_output4/relu_mbox_loc_perm -> ctx_output4/relu_mbox_loc_perm
I1106 16:38:07.234086 13573 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_perm
I1106 16:38:07.234091 13573 net.cpp:267] TRAIN Top shape for layer 80 'ctx_output4/relu_mbox_loc_perm' 1 3 6 16 (288)
I1106 16:38:07.234093 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_flat' of type 'Flatten'
I1106 16:38:07.234102 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.234107 13573 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_flat (81)
I1106 16:38:07.234110 13573 net.cpp:572] ctx_output4/relu_mbox_loc_flat <- ctx_output4/relu_mbox_loc_perm
I1106 16:38:07.234113 13573 net.cpp:542] ctx_output4/relu_mbox_loc_flat -> ctx_output4/relu_mbox_loc_flat
I1106 16:38:07.234156 13573 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_flat
I1106 16:38:07.234160 13573 net.cpp:267] TRAIN Top shape for layer 81 'ctx_output4/relu_mbox_loc_flat' 1 288 (288)
I1106 16:38:07.234163 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf' of type 'Convolution'
I1106 16:38:07.234166 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.234175 13573 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf (82)
I1106 16:38:07.234179 13573 net.cpp:572] ctx_output4/relu_mbox_conf <- ctx_output4_ctx_output4/relu_0_split_1
I1106 16:38:07.234181 13573 net.cpp:542] ctx_output4/relu_mbox_conf -> ctx_output4/relu_mbox_conf
I1106 16:38:07.234344 13573 net.cpp:260] Setting up ctx_output4/relu_mbox_conf
I1106 16:38:07.234349 13573 net.cpp:267] TRAIN Top shape for layer 82 'ctx_output4/relu_mbox_conf' 1 8 3 6 (144)
I1106 16:38:07.234354 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_perm' of type 'Permute'
I1106 16:38:07.234365 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.234376 13573 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf_perm (83)
I1106 16:38:07.234383 13573 net.cpp:572] ctx_output4/relu_mbox_conf_perm <- ctx_output4/relu_mbox_conf
I1106 16:38:07.234387 13573 net.cpp:542] ctx_output4/relu_mbox_conf_perm -> ctx_output4/relu_mbox_conf_perm
I1106 16:38:07.234445 13573 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_perm
I1106 16:38:07.234450 13573 net.cpp:267] TRAIN Top shape for layer 83 'ctx_output4/relu_mbox_conf_perm' 1 3 6 8 (144)
I1106 16:38:07.234452 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_flat' of type 'Flatten'
I1106 16:38:07.234455 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.234459 13573 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf_flat (84)
I1106 16:38:07.234462 13573 net.cpp:572] ctx_output4/relu_mbox_conf_flat <- ctx_output4/relu_mbox_conf_perm
I1106 16:38:07.234464 13573 net.cpp:542] ctx_output4/relu_mbox_conf_flat -> ctx_output4/relu_mbox_conf_flat
I1106 16:38:07.234503 13573 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_flat
I1106 16:38:07.234508 13573 net.cpp:267] TRAIN Top shape for layer 84 'ctx_output4/relu_mbox_conf_flat' 1 144 (144)
I1106 16:38:07.234510 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_priorbox' of type 'PriorBox'
I1106 16:38:07.234513 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.234517 13573 net.cpp:200] Created Layer ctx_output4/relu_mbox_priorbox (85)
I1106 16:38:07.234520 13573 net.cpp:572] ctx_output4/relu_mbox_priorbox <- ctx_output4_ctx_output4/relu_0_split_2
I1106 16:38:07.234524 13573 net.cpp:572] ctx_output4/relu_mbox_priorbox <- data_data_0_split_4
I1106 16:38:07.234526 13573 net.cpp:542] ctx_output4/relu_mbox_priorbox -> ctx_output4/relu_mbox_priorbox
I1106 16:38:07.234540 13573 net.cpp:260] Setting up ctx_output4/relu_mbox_priorbox
I1106 16:38:07.234544 13573 net.cpp:267] TRAIN Top shape for layer 85 'ctx_output4/relu_mbox_priorbox' 1 2 288 (576)
I1106 16:38:07.234546 13573 layer_factory.hpp:172] Creating layer 'mbox_loc' of type 'Concat'
I1106 16:38:07.234549 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.234555 13573 net.cpp:200] Created Layer mbox_loc (86)
I1106 16:38:07.234558 13573 net.cpp:572] mbox_loc <- ctx_output1/relu_mbox_loc_flat
I1106 16:38:07.234568 13573 net.cpp:572] mbox_loc <- ctx_output2/relu_mbox_loc_flat
I1106 16:38:07.234571 13573 net.cpp:572] mbox_loc <- ctx_output3/relu_mbox_loc_flat
I1106 16:38:07.234575 13573 net.cpp:572] mbox_loc <- ctx_output4/relu_mbox_loc_flat
I1106 16:38:07.234577 13573 net.cpp:542] mbox_loc -> mbox_loc
I1106 16:38:07.234591 13573 net.cpp:260] Setting up mbox_loc
I1106 16:38:07.234596 13573 net.cpp:267] TRAIN Top shape for layer 86 'mbox_loc' 1 22848 (22848)
I1106 16:38:07.234599 13573 layer_factory.hpp:172] Creating layer 'mbox_conf' of type 'Concat'
I1106 16:38:07.234601 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.234606 13573 net.cpp:200] Created Layer mbox_conf (87)
I1106 16:38:07.234609 13573 net.cpp:572] mbox_conf <- ctx_output1/relu_mbox_conf_flat
I1106 16:38:07.234612 13573 net.cpp:572] mbox_conf <- ctx_output2/relu_mbox_conf_flat
I1106 16:38:07.234614 13573 net.cpp:572] mbox_conf <- ctx_output3/relu_mbox_conf_flat
I1106 16:38:07.234617 13573 net.cpp:572] mbox_conf <- ctx_output4/relu_mbox_conf_flat
I1106 16:38:07.234621 13573 net.cpp:542] mbox_conf -> mbox_conf
I1106 16:38:07.234632 13573 net.cpp:260] Setting up mbox_conf
I1106 16:38:07.234637 13573 net.cpp:267] TRAIN Top shape for layer 87 'mbox_conf' 1 11424 (11424)
I1106 16:38:07.234638 13573 layer_factory.hpp:172] Creating layer 'mbox_priorbox' of type 'Concat'
I1106 16:38:07.234642 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.234645 13573 net.cpp:200] Created Layer mbox_priorbox (88)
I1106 16:38:07.234647 13573 net.cpp:572] mbox_priorbox <- ctx_output1/relu_mbox_priorbox
I1106 16:38:07.234652 13573 net.cpp:572] mbox_priorbox <- ctx_output2/relu_mbox_priorbox
I1106 16:38:07.234654 13573 net.cpp:572] mbox_priorbox <- ctx_output3/relu_mbox_priorbox
I1106 16:38:07.234656 13573 net.cpp:572] mbox_priorbox <- ctx_output4/relu_mbox_priorbox
I1106 16:38:07.234658 13573 net.cpp:542] mbox_priorbox -> mbox_priorbox
I1106 16:38:07.234673 13573 net.cpp:260] Setting up mbox_priorbox
I1106 16:38:07.234676 13573 net.cpp:267] TRAIN Top shape for layer 88 'mbox_priorbox' 1 2 22848 (45696)
I1106 16:38:07.234679 13573 layer_factory.hpp:172] Creating layer 'mbox_loss' of type 'MultiBoxLoss'
I1106 16:38:07.234681 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.234694 13573 net.cpp:200] Created Layer mbox_loss (89)
I1106 16:38:07.234696 13573 net.cpp:572] mbox_loss <- mbox_loc
I1106 16:38:07.234699 13573 net.cpp:572] mbox_loss <- mbox_conf
I1106 16:38:07.234702 13573 net.cpp:572] mbox_loss <- mbox_priorbox
I1106 16:38:07.234704 13573 net.cpp:572] mbox_loss <- label
I1106 16:38:07.234707 13573 net.cpp:542] mbox_loss -> mbox_loss
I1106 16:38:07.234748 13573 layer_factory.hpp:172] Creating layer 'mbox_loss_smooth_L1_loc' of type 'SmoothL1Loss'
I1106 16:38:07.234751 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.234809 13573 layer_factory.hpp:172] Creating layer 'mbox_loss_softmax_conf' of type 'SoftmaxWithLoss'
I1106 16:38:07.234813 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.234879 13573 net.cpp:260] Setting up mbox_loss
I1106 16:38:07.234884 13573 net.cpp:267] TRAIN Top shape for layer 89 'mbox_loss' (1)
I1106 16:38:07.234886 13573 net.cpp:271] with loss weight 1
I1106 16:38:07.234903 13573 net.cpp:336] mbox_loss needs backward computation.
I1106 16:38:07.234907 13573 net.cpp:338] mbox_priorbox does not need backward computation.
I1106 16:38:07.234912 13573 net.cpp:336] mbox_conf needs backward computation.
I1106 16:38:07.234915 13573 net.cpp:336] mbox_loc needs backward computation.
I1106 16:38:07.234918 13573 net.cpp:338] ctx_output4/relu_mbox_priorbox does not need backward computation.
I1106 16:38:07.234921 13573 net.cpp:336] ctx_output4/relu_mbox_conf_flat needs backward computation.
I1106 16:38:07.234923 13573 net.cpp:336] ctx_output4/relu_mbox_conf_perm needs backward computation.
I1106 16:38:07.234933 13573 net.cpp:336] ctx_output4/relu_mbox_conf needs backward computation.
I1106 16:38:07.234935 13573 net.cpp:336] ctx_output4/relu_mbox_loc_flat needs backward computation.
I1106 16:38:07.234946 13573 net.cpp:336] ctx_output4/relu_mbox_loc_perm needs backward computation.
I1106 16:38:07.234948 13573 net.cpp:336] ctx_output4/relu_mbox_loc needs backward computation.
I1106 16:38:07.234951 13573 net.cpp:338] ctx_output3/relu_mbox_priorbox does not need backward computation.
I1106 16:38:07.234961 13573 net.cpp:336] ctx_output3/relu_mbox_conf_flat needs backward computation.
I1106 16:38:07.234963 13573 net.cpp:336] ctx_output3/relu_mbox_conf_perm needs backward computation.
I1106 16:38:07.234966 13573 net.cpp:336] ctx_output3/relu_mbox_conf needs backward computation.
I1106 16:38:07.234967 13573 net.cpp:336] ctx_output3/relu_mbox_loc_flat needs backward computation.
I1106 16:38:07.234969 13573 net.cpp:336] ctx_output3/relu_mbox_loc_perm needs backward computation. I1106 16:38:07.234973 13573 net.cpp:336] ctx_output3/relu_mbox_loc needs backward computation. I1106 16:38:07.234975 13573 net.cpp:338] ctx_output2/relu_mbox_priorbox does not need backward computation. I1106 16:38:07.234978 13573 net.cpp:336] ctx_output2/relu_mbox_conf_flat needs backward computation. I1106 16:38:07.234982 13573 net.cpp:336] ctx_output2/relu_mbox_conf_perm needs backward computation. I1106 16:38:07.234984 13573 net.cpp:336] ctx_output2/relu_mbox_conf needs backward computation. I1106 16:38:07.234987 13573 net.cpp:336] ctx_output2/relu_mbox_loc_flat needs backward computation. I1106 16:38:07.234988 13573 net.cpp:336] ctx_output2/relu_mbox_loc_perm needs backward computation. I1106 16:38:07.234992 13573 net.cpp:336] ctx_output2/relu_mbox_loc needs backward computation. I1106 16:38:07.234994 13573 net.cpp:338] ctx_output1/relu_mbox_priorbox does not need backward computation. I1106 16:38:07.234997 13573 net.cpp:336] ctx_output1/relu_mbox_conf_flat needs backward computation. I1106 16:38:07.234999 13573 net.cpp:336] ctx_output1/relu_mbox_conf_perm needs backward computation. I1106 16:38:07.235002 13573 net.cpp:336] ctx_output1/relu_mbox_conf needs backward computation. I1106 16:38:07.235004 13573 net.cpp:336] ctx_output1/relu_mbox_loc_flat needs backward computation. I1106 16:38:07.235008 13573 net.cpp:336] ctx_output1/relu_mbox_loc_perm needs backward computation. I1106 16:38:07.235009 13573 net.cpp:336] ctx_output1/relu_mbox_loc needs backward computation. I1106 16:38:07.235013 13573 net.cpp:338] ctx_output5/relu does not need backward computation. I1106 16:38:07.235016 13573 net.cpp:338] ctx_output5 does not need backward computation. I1106 16:38:07.235019 13573 net.cpp:336] ctx_output4_ctx_output4/relu_0_split needs backward computation. I1106 16:38:07.235021 13573 net.cpp:336] ctx_output4/relu needs backward computation. 
I1106 16:38:07.235024 13573 net.cpp:336] ctx_output4 needs backward computation. I1106 16:38:07.235028 13573 net.cpp:336] ctx_output3_ctx_output3/relu_0_split needs backward computation. I1106 16:38:07.235031 13573 net.cpp:336] ctx_output3/relu needs backward computation. I1106 16:38:07.235033 13573 net.cpp:336] ctx_output3 needs backward computation. I1106 16:38:07.235035 13573 net.cpp:336] ctx_output2_ctx_output2/relu_0_split needs backward computation. I1106 16:38:07.235038 13573 net.cpp:336] ctx_output2/relu needs backward computation. I1106 16:38:07.235039 13573 net.cpp:336] ctx_output2 needs backward computation. I1106 16:38:07.235041 13573 net.cpp:336] ctx_output1_ctx_output1/relu_0_split needs backward computation. I1106 16:38:07.235044 13573 net.cpp:336] ctx_output1/relu needs backward computation. I1106 16:38:07.235046 13573 net.cpp:336] ctx_output1 needs backward computation. I1106 16:38:07.235049 13573 net.cpp:338] pool8 does not need backward computation. I1106 16:38:07.235052 13573 net.cpp:336] pool7_pool7_0_split needs backward computation. I1106 16:38:07.235054 13573 net.cpp:336] pool7 needs backward computation. I1106 16:38:07.235061 13573 net.cpp:336] pool6_pool6_0_split needs backward computation. I1106 16:38:07.235064 13573 net.cpp:336] pool6 needs backward computation. I1106 16:38:07.235066 13573 net.cpp:336] res5a_branch2b_res5a_branch2b/relu_0_split needs backward computation. I1106 16:38:07.235069 13573 net.cpp:336] res5a_branch2b/relu needs backward computation. I1106 16:38:07.235072 13573 net.cpp:336] res5a_branch2b/bn needs backward computation. I1106 16:38:07.235074 13573 net.cpp:336] res5a_branch2b needs backward computation. I1106 16:38:07.235076 13573 net.cpp:336] res5a_branch2a/relu needs backward computation. I1106 16:38:07.235080 13573 net.cpp:336] res5a_branch2a/bn needs backward computation. I1106 16:38:07.235081 13573 net.cpp:336] res5a_branch2a needs backward computation. 
I1106 16:38:07.235083 13573 net.cpp:336] pool4 needs backward computation. I1106 16:38:07.235086 13573 net.cpp:336] res4a_branch2b_res4a_branch2b/relu_0_split needs backward computation. I1106 16:38:07.235090 13573 net.cpp:336] res4a_branch2b/relu needs backward computation. I1106 16:38:07.235092 13573 net.cpp:336] res4a_branch2b/bn needs backward computation. I1106 16:38:07.235093 13573 net.cpp:336] res4a_branch2b needs backward computation. I1106 16:38:07.235095 13573 net.cpp:336] res4a_branch2a/relu needs backward computation. I1106 16:38:07.235100 13573 net.cpp:336] res4a_branch2a/bn needs backward computation. I1106 16:38:07.235102 13573 net.cpp:336] res4a_branch2a needs backward computation. I1106 16:38:07.235105 13573 net.cpp:336] pool3 needs backward computation. I1106 16:38:07.235107 13573 net.cpp:336] res3a_branch2b/relu needs backward computation. I1106 16:38:07.235110 13573 net.cpp:336] res3a_branch2b/bn needs backward computation. I1106 16:38:07.235112 13573 net.cpp:336] res3a_branch2b needs backward computation. I1106 16:38:07.235114 13573 net.cpp:336] res3a_branch2a/relu needs backward computation. I1106 16:38:07.235116 13573 net.cpp:336] res3a_branch2a/bn needs backward computation. I1106 16:38:07.235118 13573 net.cpp:336] res3a_branch2a needs backward computation. I1106 16:38:07.235121 13573 net.cpp:336] pool2 needs backward computation. I1106 16:38:07.235124 13573 net.cpp:336] res2a_branch2b/relu needs backward computation. I1106 16:38:07.235126 13573 net.cpp:336] res2a_branch2b/bn needs backward computation. I1106 16:38:07.235128 13573 net.cpp:336] res2a_branch2b needs backward computation. I1106 16:38:07.235131 13573 net.cpp:336] res2a_branch2a/relu needs backward computation. I1106 16:38:07.235134 13573 net.cpp:336] res2a_branch2a/bn needs backward computation. I1106 16:38:07.235136 13573 net.cpp:336] res2a_branch2a needs backward computation. I1106 16:38:07.235139 13573 net.cpp:336] pool1 needs backward computation. 
I1106 16:38:07.235141 13573 net.cpp:336] conv1b/relu needs backward computation. I1106 16:38:07.235144 13573 net.cpp:336] conv1b/bn needs backward computation. I1106 16:38:07.235146 13573 net.cpp:336] conv1b needs backward computation. I1106 16:38:07.235147 13573 net.cpp:336] conv1a/relu needs backward computation. I1106 16:38:07.235155 13573 net.cpp:336] conv1a/bn needs backward computation. I1106 16:38:07.235158 13573 net.cpp:336] conv1a needs backward computation. I1106 16:38:07.235162 13573 net.cpp:338] data/bias does not need backward computation. I1106 16:38:07.235164 13573 net.cpp:338] data_data_0_split does not need backward computation. I1106 16:38:07.235168 13573 net.cpp:338] data does not need backward computation. I1106 16:38:07.235169 13573 net.cpp:380] This network produces output ctx_output5 I1106 16:38:07.235172 13573 net.cpp:380] This network produces output mbox_loss I1106 16:38:07.235235 13573 net.cpp:403] Top memory (TRAIN) required for data: 126663336 diff: 126663336 I1106 16:38:07.235239 13573 net.cpp:406] Bottom memory (TRAIN) required for data: 126657184 diff: 126657184 I1106 16:38:07.235241 13573 net.cpp:409] Shared (in-place) memory (TRAIN) by data: 62263296 diff: 62263296 I1106 16:38:07.235242 13573 net.cpp:412] Parameters memory (TRAIN) required for data: 11946688 diff: 11946688 I1106 16:38:07.235249 13573 net.cpp:415] Parameters shared memory (TRAIN) by data: 0 diff: 0 I1106 16:38:07.235251 13573 net.cpp:421] Network initialization done. 
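The train net is now fully constructed, and the logged blob sizes can be cross-checked against standard SSD head arithmetic. The sketch below is not the generator script itself; it reproduces the prior-box sizes from the header and the concatenated mbox_loc (22848) / mbox_conf (11424) element counts above. Two things in it are assumptions rather than log facts: the 4%-of-min_dim scale that `small_objs: 1` appears to prepend, and the feature-map sizes of the first three heads (inferred from the 768x320 input and the stride list; the logged `1 8 3 6` shape of `ctx_output4/relu_mbox_conf` pins down the last head).

```python
# Hypothetical cross-check of the logged SSD head sizes (not the generator).
min_dim = 368                                  # from the log header
ratios = list(range(10, 115, 26))              # [10, 36, 62, 88, 114]; ratio_step_size: 26

# Scales are percentages of min_dim; small_objs prepends an extra 4% scale (assumption).
sizes = [min_dim * 4 / 100.0] + [min_dim * r / 100.0 for r in ratios]
min_sizes = [round(s, 2) for s in sizes[:-1]]  # [14.72, 36.8, 132.48, 228.16, 323.84]
max_sizes = [round(s, 2) for s in sizes[1:]]   # [36.8, 132.48, 228.16, 323.84, 419.52]

# 'chop_num_heads: 1' drops the coarsest head, leaving the four heads seen above.
min_sizes, max_sizes = min_sizes[:-1], max_sizes[:-1]

# Priors per location: 1 (min_size) + 1 (sqrt(min*max)) + 2 per aspect ratio,
# since flip: true also adds the reciprocal of each ratio.
aspect_ratios = [[2], [2, 3], [2, 3], [2]]
priors = [2 + 2 * len(ar) for ar in aspect_ratios]      # [4, 6, 6, 4]

# Per-head feature maps as (H, W) for the 320x768 input; the first three are
# inferred (strides 16, 32, 64), the last (3, 6) is visible in the log.
fmaps = [(20, 48), (10, 24), (5, 12), (3, 6)]

loc_total = sum(h * w * n * 4 for (h, w), n in zip(fmaps, priors))    # 4 coords per box
conf_total = sum(h * w * n * 2 for (h, w), n in zip(fmaps, priors))   # num_classes = 2
```

Under these assumptions `loc_total` comes out at 22848 and `conf_total` at 11424, matching the logged `mbox_loc` and `mbox_conf` top shapes, and `min_sizes`/`max_sizes` match the "Chopping heads" lines in the header.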
I1106 16:38:07.235802 13573 solver.cpp:175] Creating test net (#0) specified by test_net file: training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse/test.prototxt I1106 16:38:07.236219 13573 net.cpp:80] Initializing net from parameters: name: "ssdJacintoNetV2_test" state { phase: TEST } layer { name: "data" type: "AnnotatedData" top: "data" top: "label" include { phase: TEST } transform_param { mean_value: 0 mean_value: 0 mean_value: 0 force_color: false resize_param { prob: 1 resize_mode: WARP height: 320 width: 768 interp_mode: LINEAR } crop_h: 320 crop_w: 768 } data_param { source: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb" batch_size: 8 backend: LMDB threads: 4 parser_threads: 4 } annotated_data_param { batch_sampler { } label_map_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt" } } layer { name: "data/bias" type: "Bias" bottom: "data" top: "data/bias" param { lr_mult: 0 decay_mult: 0 } bias_param { filler { type: "constant" value: -128 } } } layer { name: "conv1a" type: "Convolution" bottom: "data/bias" top: "conv1a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 2 kernel_size: 5 group: 1 stride: 2 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "conv1a/bn" type: "BatchNorm" bottom: "conv1a" top: "conv1a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "conv1a/relu" type: "ReLU" bottom: "conv1a" top: "conv1a" } layer { name: "conv1b" type: "Convolution" bottom: "conv1a" top: "conv1b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "conv1b/bn" type: "BatchNorm" bottom: "conv1b" 
top: "conv1b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "conv1b/relu" type: "ReLU" bottom: "conv1b" top: "conv1b" } layer { name: "pool1" type: "Pooling" bottom: "conv1b" top: "pool1" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res2a_branch2a" type: "Convolution" bottom: "pool1" top: "res2a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res2a_branch2a/bn" type: "BatchNorm" bottom: "res2a_branch2a" top: "res2a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res2a_branch2a/relu" type: "ReLU" bottom: "res2a_branch2a" top: "res2a_branch2a" } layer { name: "res2a_branch2b" type: "Convolution" bottom: "res2a_branch2a" top: "res2a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res2a_branch2b/bn" type: "BatchNorm" bottom: "res2a_branch2b" top: "res2a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res2a_branch2b/relu" type: "ReLU" bottom: "res2a_branch2b" top: "res2a_branch2b" } layer { name: "pool2" type: "Pooling" bottom: "res2a_branch2b" top: "pool2" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res3a_branch2a" type: "Convolution" bottom: "pool2" top: "res3a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } 
dilation: 1 } } layer { name: "res3a_branch2a/bn" type: "BatchNorm" bottom: "res3a_branch2a" top: "res3a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res3a_branch2a/relu" type: "ReLU" bottom: "res3a_branch2a" top: "res3a_branch2a" } layer { name: "res3a_branch2b" type: "Convolution" bottom: "res3a_branch2a" top: "res3a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res3a_branch2b/bn" type: "BatchNorm" bottom: "res3a_branch2b" top: "res3a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res3a_branch2b/relu" type: "ReLU" bottom: "res3a_branch2b" top: "res3a_branch2b" } layer { name: "pool3" type: "Pooling" bottom: "res3a_branch2b" top: "pool3" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res4a_branch2a" type: "Convolution" bottom: "pool3" top: "res4a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res4a_branch2a/bn" type: "BatchNorm" bottom: "res4a_branch2a" top: "res4a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res4a_branch2a/relu" type: "ReLU" bottom: "res4a_branch2a" top: "res4a_branch2a" } layer { name: "res4a_branch2b" type: "Convolution" bottom: "res4a_branch2a" top: "res4a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" 
value: 0 } dilation: 1 } } layer { name: "res4a_branch2b/bn" type: "BatchNorm" bottom: "res4a_branch2b" top: "res4a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res4a_branch2b/relu" type: "ReLU" bottom: "res4a_branch2b" top: "res4a_branch2b" } layer { name: "pool4" type: "Pooling" bottom: "res4a_branch2b" top: "pool4" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res5a_branch2a" type: "Convolution" bottom: "pool4" top: "res5a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res5a_branch2a/bn" type: "BatchNorm" bottom: "res5a_branch2a" top: "res5a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res5a_branch2a/relu" type: "ReLU" bottom: "res5a_branch2a" top: "res5a_branch2a" } layer { name: "res5a_branch2b" type: "Convolution" bottom: "res5a_branch2a" top: "res5a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res5a_branch2b/bn" type: "BatchNorm" bottom: "res5a_branch2b" top: "res5a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res5a_branch2b/relu" type: "ReLU" bottom: "res5a_branch2b" top: "res5a_branch2b" } layer { name: "pool6" type: "Pooling" bottom: "res5a_branch2b" top: "pool6" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "pool7" type: "Pooling" bottom: "pool6" top: "pool7" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "pool8" type: "Pooling" bottom: 
"pool7" top: "pool8" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "ctx_output1" type: "Convolution" bottom: "res4a_branch2b" top: "ctx_output1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu" type: "ReLU" bottom: "ctx_output1" top: "ctx_output1" } layer { name: "ctx_output2" type: "Convolution" bottom: "res5a_branch2b" top: "ctx_output2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu" type: "ReLU" bottom: "ctx_output2" top: "ctx_output2" } layer { name: "ctx_output3" type: "Convolution" bottom: "pool6" top: "ctx_output3" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu" type: "ReLU" bottom: "ctx_output3" top: "ctx_output3" } layer { name: "ctx_output4" type: "Convolution" bottom: "pool7" top: "ctx_output4" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu" type: "ReLU" bottom: "ctx_output4" top: "ctx_output4" } layer { name: "ctx_output5" type: "Convolution" bottom: "pool8" top: "ctx_output5" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: 
true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output5/relu" type: "ReLU" bottom: "ctx_output5" top: "ctx_output5" } layer { name: "ctx_output1/relu_mbox_loc" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_loc" top: "ctx_output1/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output1/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_loc_perm" top: "ctx_output1/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output1/relu_mbox_conf" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_conf" top: "ctx_output1/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output1/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_conf_perm" top: "ctx_output1/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output1/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output1" bottom: "data" top: "ctx_output1/relu_mbox_priorbox" prior_box_param { min_size: 14.72 max_size: 36.8 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { 
name: "ctx_output2/relu_mbox_loc" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_loc" top: "ctx_output2/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output2/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_loc_perm" top: "ctx_output2/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output2/relu_mbox_conf" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_conf" top: "ctx_output2/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output2/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_conf_perm" top: "ctx_output2/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output2/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output2" bottom: "data" top: "ctx_output2/relu_mbox_priorbox" prior_box_param { min_size: 36.8 max_size: 132.48 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output3/relu_mbox_loc" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 
bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_loc" top: "ctx_output3/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output3/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_loc_perm" top: "ctx_output3/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output3/relu_mbox_conf" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_conf" top: "ctx_output3/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output3/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_conf_perm" top: "ctx_output3/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output3/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output3" bottom: "data" top: "ctx_output3/relu_mbox_priorbox" prior_box_param { min_size: 132.48 max_size: 228.16 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output4/relu_mbox_loc" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu_mbox_loc_perm" type: "Permute" 
bottom: "ctx_output4/relu_mbox_loc" top: "ctx_output4/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output4/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_loc_perm" top: "ctx_output4/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output4/relu_mbox_conf" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output4/relu_mbox_conf" top: "ctx_output4/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output4/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_conf_perm" top: "ctx_output4/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output4/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output4" bottom: "data" top: "ctx_output4/relu_mbox_priorbox" prior_box_param { min_size: 228.16 max_size: 323.84 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "mbox_loc" type: "Concat" bottom: "ctx_output1/relu_mbox_loc_flat" bottom: "ctx_output2/relu_mbox_loc_flat" bottom: "ctx_output3/relu_mbox_loc_flat" bottom: "ctx_output4/relu_mbox_loc_flat" top: "mbox_loc" concat_param { axis: 1 } } layer { name: "mbox_conf" type: "Concat" bottom: "ctx_output1/relu_mbox_conf_flat" bottom: "ctx_output2/relu_mbox_conf_flat" bottom: "ctx_output3/relu_mbox_conf_flat" bottom: "ctx_output4/relu_mbox_conf_flat" top: "mbox_conf" concat_param { axis: 1 } } layer { name: "mbox_priorbox" type: "Concat" bottom: "ctx_output1/relu_mbox_priorbox" bottom: "ctx_output2/relu_mbox_priorbox" bottom: 
"ctx_output3/relu_mbox_priorbox" bottom: "ctx_output4/relu_mbox_priorbox" top: "mbox_priorbox" concat_param { axis: 2 } } layer { name: "mbox_conf_reshape" type: "Reshape" bottom: "mbox_conf" top: "mbox_conf_reshape" reshape_param { shape { dim: 0 dim: -1 dim: 2 } } } layer { name: "mbox_conf_softmax" type: "Softmax" bottom: "mbox_conf_reshape" top: "mbox_conf_softmax" softmax_param { axis: 2 } } layer { name: "mbox_conf_flatten" type: "Flatten" bottom: "mbox_conf_softmax" top: "mbox_conf_flatten" flatten_param { axis: 1 } } layer { name: "detection_out" type: "DetectionOutput" bottom: "mbox_loc" bottom: "mbox_conf_flatten" bottom: "mbox_priorbox" top: "detection_out" include { phase: TEST } detection_output_param { num_classes: 2 share_location: true background_label_id: 0 nms_param { nms_threshold: 0.45 top_k: 400 } save_output_param { output_directory: "" output_name_prefix: "comp4_det_test_" output_format: "VOC" label_map_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt" name_size_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt" num_test_image: 24 } code_type: CENTER_SIZE keep_top_k: 200 confidence_threshold: 0.01 } } layer { name: "detection_eval" type: "DetectionEvaluate" bottom: "detection_out" bottom: "label" top: "detection_eval" include { phase: TEST } detection_evaluate_param { num_classes: 2 background_label_id: 0 overlap_threshold: 0.5 evaluate_difficult_gt: false name_size_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt" } } I1106 16:38:07.236465 13573 net.cpp:110] Using FLOAT as default forward math type I1106 16:38:07.236470 13573 net.cpp:116] Using FLOAT as default backward math type I1106 16:38:07.236474 13573 layer_factory.hpp:172] Creating layer 'data' of type 'AnnotatedData' I1106 16:38:07.236476 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.236488 13573 internal_thread.cpp:19] Starting 1 
internal thread(s) on device 0
I1106 16:38:07.236526 13573 net.cpp:200] Created Layer data (0)
I1106 16:38:07.236531 13573 net.cpp:542] data -> data
I1106 16:38:07.236536 13573 net.cpp:542] data -> label
I1106 16:38:07.236541 13573 data_reader.cpp:58] Data Reader threads: 1, out queues: 1, depth: 8
I1106 16:38:07.236845 13573 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0
I1106 16:38:07.237457 13604 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb
I1106 16:38:07.237891 13573 annotated_data_layer.cpp:105] output data size: 8,3,320,768
I1106 16:38:07.237936 13573 annotated_data_layer.cpp:150] (0) Output data size: 8, 3, 320, 768
I1106 16:38:07.237982 13573 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0
I1106 16:38:07.238019 13573 net.cpp:260] Setting up data
I1106 16:38:07.238025 13573 net.cpp:267] TEST Top shape for layer 0 'data' 8 3 320 768 (5898240)
I1106 16:38:07.238029 13573 net.cpp:267] TEST Top shape for layer 0 'data' 1 1 2 8 (16)
I1106 16:38:07.238306 13573 layer_factory.hpp:172] Creating layer 'data_data_0_split' of type 'Split'
I1106 16:38:07.238312 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.238312 13605 data_layer.cpp:105] (0) Parser threads: 1
I1106 16:38:07.238317 13573 net.cpp:200] Created Layer data_data_0_split (1)
I1106 16:38:07.238328 13605 data_layer.cpp:107] (0) Transformer threads: 1
I1106 16:38:07.238332 13573 net.cpp:572] data_data_0_split <- data
I1106 16:38:07.238335 13573 net.cpp:542] data_data_0_split -> data_data_0_split_0
I1106 16:38:07.238340 13573 net.cpp:542] data_data_0_split -> data_data_0_split_1
I1106 16:38:07.238344 13573 net.cpp:542] data_data_0_split -> data_data_0_split_2
I1106 16:38:07.238348 13573 net.cpp:542] data_data_0_split -> data_data_0_split_3
I1106 16:38:07.238351 13573 net.cpp:542] data_data_0_split -> data_data_0_split_4
I1106 16:38:07.238402 13573 net.cpp:260] Setting up data_data_0_split
I1106 16:38:07.238407 13573 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240)
I1106 16:38:07.238411 13573 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240)
I1106 16:38:07.238413 13573 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240)
I1106 16:38:07.238416 13573 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240)
I1106 16:38:07.238420 13573 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 8 3 320 768 (5898240)
I1106 16:38:07.238422 13573 layer_factory.hpp:172] Creating layer 'data/bias' of type 'Bias'
I1106 16:38:07.238425 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.238440 13573 net.cpp:200] Created Layer data/bias (2)
I1106 16:38:07.238442 13573 net.cpp:572] data/bias <- data_data_0_split_0
I1106 16:38:07.238445 13573 net.cpp:542] data/bias -> data/bias
I1106 16:38:07.239110 13573 net.cpp:260] Setting up data/bias
I1106 16:38:07.239120 13573 net.cpp:267] TEST Top shape for layer 2 'data/bias' 8 3 320 768 (5898240)
I1106 16:38:07.239130 13573 layer_factory.hpp:172] Creating layer 'conv1a' of type 'Convolution'
I1106 16:38:07.239133 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.239151 13573 net.cpp:200] Created Layer conv1a (3)
I1106 16:38:07.239153 13573 net.cpp:572] conv1a <- data/bias
I1106 16:38:07.239156 13573 net.cpp:542] conv1a -> conv1a
I1106 16:38:07.244136 13573 net.cpp:260] Setting up conv1a
I1106 16:38:07.244190 13573 net.cpp:267] TEST Top shape for layer 3 'conv1a' 8 32 160 384 (15728640)
I1106 16:38:07.244213 13573 layer_factory.hpp:172] Creating layer 'conv1a/bn' of type 'BatchNorm'
I1106 16:38:07.244223 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.244256 13573 net.cpp:200] Created Layer conv1a/bn (4)
I1106 16:38:07.244266 13573 net.cpp:572] conv1a/bn <- conv1a
I1106 16:38:07.244276 13573 net.cpp:527] conv1a/bn -> conv1a (in-place)
I1106 16:38:07.244776 13573 net.cpp:260] Setting up conv1a/bn
I1106 16:38:07.244786 13573 net.cpp:267] TEST Top shape for layer 4 'conv1a/bn' 8 32 160 384 (15728640)
I1106 16:38:07.244796 13573 layer_factory.hpp:172] Creating layer 'conv1a/relu' of type 'ReLU'
I1106 16:38:07.244801 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.244807 13573 net.cpp:200] Created Layer conv1a/relu (5)
I1106 16:38:07.244808 13573 net.cpp:572] conv1a/relu <- conv1a
I1106 16:38:07.244812 13573 net.cpp:527] conv1a/relu -> conv1a (in-place)
I1106 16:38:07.244853 13573 net.cpp:260] Setting up conv1a/relu
I1106 16:38:07.244863 13573 net.cpp:267] TEST Top shape for layer 5 'conv1a/relu' 8 32 160 384 (15728640)
I1106 16:38:07.244875 13573 layer_factory.hpp:172] Creating layer 'conv1b' of type 'Convolution'
I1106 16:38:07.244880 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.244915 13573 net.cpp:200] Created Layer conv1b (6)
I1106 16:38:07.244918 13573 net.cpp:572] conv1b <- conv1a
I1106 16:38:07.244921 13573 net.cpp:542] conv1b -> conv1b
I1106 16:38:07.245200 13573 net.cpp:260] Setting up conv1b
I1106 16:38:07.245208 13573 net.cpp:267] TEST Top shape for layer 6 'conv1b' 8 32 160 384 (15728640)
I1106 16:38:07.245214 13573 layer_factory.hpp:172] Creating layer 'conv1b/bn' of type 'BatchNorm'
I1106 16:38:07.245218 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.245223 13573 net.cpp:200] Created Layer conv1b/bn (7)
I1106 16:38:07.245225 13573 net.cpp:572] conv1b/bn <- conv1b
I1106 16:38:07.245227 13573 net.cpp:527] conv1b/bn -> conv1b (in-place)
I1106 16:38:07.245599 13573 net.cpp:260] Setting up conv1b/bn
I1106 16:38:07.245607 13573 net.cpp:267] TEST Top shape
for layer 7 'conv1b/bn' 8 32 160 384 (15728640) I1106 16:38:07.245615 13573 layer_factory.hpp:172] Creating layer 'conv1b/relu' of type 'ReLU' I1106 16:38:07.245618 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.245625 13573 net.cpp:200] Created Layer conv1b/relu (8) I1106 16:38:07.245626 13573 net.cpp:572] conv1b/relu <- conv1b I1106 16:38:07.245630 13573 net.cpp:527] conv1b/relu -> conv1b (in-place) I1106 16:38:07.245635 13573 net.cpp:260] Setting up conv1b/relu I1106 16:38:07.245637 13573 net.cpp:267] TEST Top shape for layer 8 'conv1b/relu' 8 32 160 384 (15728640) I1106 16:38:07.245640 13573 layer_factory.hpp:172] Creating layer 'pool1' of type 'Pooling' I1106 16:38:07.245642 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.245649 13573 net.cpp:200] Created Layer pool1 (9) I1106 16:38:07.245652 13573 net.cpp:572] pool1 <- conv1b I1106 16:38:07.245654 13573 net.cpp:542] pool1 -> pool1 I1106 16:38:07.245699 13573 net.cpp:260] Setting up pool1 I1106 16:38:07.245702 13573 net.cpp:267] TEST Top shape for layer 9 'pool1' 8 32 80 192 (3932160) I1106 16:38:07.245705 13573 layer_factory.hpp:172] Creating layer 'res2a_branch2a' of type 'Convolution' I1106 16:38:07.245708 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.245723 13573 net.cpp:200] Created Layer res2a_branch2a (10) I1106 16:38:07.245724 13573 net.cpp:572] res2a_branch2a <- pool1 I1106 16:38:07.245728 13573 net.cpp:542] res2a_branch2a -> res2a_branch2a I1106 16:38:07.246068 13573 net.cpp:260] Setting up res2a_branch2a I1106 16:38:07.246074 13573 net.cpp:267] TEST Top shape for layer 10 'res2a_branch2a' 8 64 80 192 (7864320) I1106 16:38:07.246084 13573 layer_factory.hpp:172] Creating layer 'res2a_branch2a/bn' of type 'BatchNorm' I1106 16:38:07.246088 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT 
Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.246093 13573 net.cpp:200] Created Layer res2a_branch2a/bn (11) I1106 16:38:07.246096 13573 net.cpp:572] res2a_branch2a/bn <- res2a_branch2a I1106 16:38:07.246098 13573 net.cpp:527] res2a_branch2a/bn -> res2a_branch2a (in-place) I1106 16:38:07.246325 13573 net.cpp:260] Setting up res2a_branch2a/bn I1106 16:38:07.246330 13573 net.cpp:267] TEST Top shape for layer 11 'res2a_branch2a/bn' 8 64 80 192 (7864320) I1106 16:38:07.246335 13573 layer_factory.hpp:172] Creating layer 'res2a_branch2a/relu' of type 'ReLU' I1106 16:38:07.246337 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.246342 13573 net.cpp:200] Created Layer res2a_branch2a/relu (12) I1106 16:38:07.246345 13573 net.cpp:572] res2a_branch2a/relu <- res2a_branch2a I1106 16:38:07.246347 13573 net.cpp:527] res2a_branch2a/relu -> res2a_branch2a (in-place) I1106 16:38:07.246373 13573 net.cpp:260] Setting up res2a_branch2a/relu I1106 16:38:07.246383 13573 net.cpp:267] TEST Top shape for layer 12 'res2a_branch2a/relu' 8 64 80 192 (7864320) I1106 16:38:07.246389 13573 layer_factory.hpp:172] Creating layer 'res2a_branch2b' of type 'Convolution' I1106 16:38:07.246392 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.246402 13573 net.cpp:200] Created Layer res2a_branch2b (13) I1106 16:38:07.246407 13573 net.cpp:572] res2a_branch2b <- res2a_branch2a I1106 16:38:07.246408 13573 net.cpp:542] res2a_branch2b -> res2a_branch2b I1106 16:38:07.246639 13573 net.cpp:260] Setting up res2a_branch2b I1106 16:38:07.246646 13573 net.cpp:267] TEST Top shape for layer 13 'res2a_branch2b' 8 64 80 192 (7864320) I1106 16:38:07.246651 13573 layer_factory.hpp:172] Creating layer 'res2a_branch2b/bn' of type 'BatchNorm' I1106 16:38:07.246654 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.246659 13573 net.cpp:200] Created Layer 
res2a_branch2b/bn (14) I1106 16:38:07.246662 13573 net.cpp:572] res2a_branch2b/bn <- res2a_branch2b I1106 16:38:07.246666 13573 net.cpp:527] res2a_branch2b/bn -> res2a_branch2b (in-place) I1106 16:38:07.246894 13573 net.cpp:260] Setting up res2a_branch2b/bn I1106 16:38:07.246899 13573 net.cpp:267] TEST Top shape for layer 14 'res2a_branch2b/bn' 8 64 80 192 (7864320) I1106 16:38:07.246906 13573 layer_factory.hpp:172] Creating layer 'res2a_branch2b/relu' of type 'ReLU' I1106 16:38:07.246913 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.246919 13573 net.cpp:200] Created Layer res2a_branch2b/relu (15) I1106 16:38:07.246922 13573 net.cpp:572] res2a_branch2b/relu <- res2a_branch2b I1106 16:38:07.246924 13573 net.cpp:527] res2a_branch2b/relu -> res2a_branch2b (in-place) I1106 16:38:07.246928 13573 net.cpp:260] Setting up res2a_branch2b/relu I1106 16:38:07.246932 13573 net.cpp:267] TEST Top shape for layer 15 'res2a_branch2b/relu' 8 64 80 192 (7864320) I1106 16:38:07.246940 13573 layer_factory.hpp:172] Creating layer 'pool2' of type 'Pooling' I1106 16:38:07.246946 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.246954 13573 net.cpp:200] Created Layer pool2 (16) I1106 16:38:07.246961 13573 net.cpp:572] pool2 <- res2a_branch2b I1106 16:38:07.246966 13573 net.cpp:542] pool2 -> pool2 I1106 16:38:07.246999 13573 net.cpp:260] Setting up pool2 I1106 16:38:07.247004 13573 net.cpp:267] TEST Top shape for layer 16 'pool2' 8 64 40 96 (1966080) I1106 16:38:07.247009 13573 layer_factory.hpp:172] Creating layer 'res3a_branch2a' of type 'Convolution' I1106 16:38:07.247011 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.247020 13573 net.cpp:200] Created Layer res3a_branch2a (17) I1106 16:38:07.247023 13573 net.cpp:572] res3a_branch2a <- pool2 I1106 16:38:07.247026 13573 net.cpp:542] res3a_branch2a -> 
res3a_branch2a I1106 16:38:07.248618 13573 net.cpp:260] Setting up res3a_branch2a I1106 16:38:07.248659 13573 net.cpp:267] TEST Top shape for layer 17 'res3a_branch2a' 8 128 40 96 (3932160) I1106 16:38:07.248674 13573 layer_factory.hpp:172] Creating layer 'res3a_branch2a/bn' of type 'BatchNorm' I1106 16:38:07.248683 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.248703 13573 net.cpp:200] Created Layer res3a_branch2a/bn (18) I1106 16:38:07.248711 13573 net.cpp:572] res3a_branch2a/bn <- res3a_branch2a I1106 16:38:07.248720 13573 net.cpp:527] res3a_branch2a/bn -> res3a_branch2a (in-place) I1106 16:38:07.248970 13573 net.cpp:260] Setting up res3a_branch2a/bn I1106 16:38:07.248980 13573 net.cpp:267] TEST Top shape for layer 18 'res3a_branch2a/bn' 8 128 40 96 (3932160) I1106 16:38:07.249008 13573 layer_factory.hpp:172] Creating layer 'res3a_branch2a/relu' of type 'ReLU' I1106 16:38:07.249032 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.249040 13573 net.cpp:200] Created Layer res3a_branch2a/relu (19) I1106 16:38:07.249047 13573 net.cpp:572] res3a_branch2a/relu <- res3a_branch2a I1106 16:38:07.249053 13573 net.cpp:527] res3a_branch2a/relu -> res3a_branch2a (in-place) I1106 16:38:07.249061 13573 net.cpp:260] Setting up res3a_branch2a/relu I1106 16:38:07.249068 13573 net.cpp:267] TEST Top shape for layer 19 'res3a_branch2a/relu' 8 128 40 96 (3932160) I1106 16:38:07.249073 13573 layer_factory.hpp:172] Creating layer 'res3a_branch2b' of type 'Convolution' I1106 16:38:07.249079 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.249095 13573 net.cpp:200] Created Layer res3a_branch2b (20) I1106 16:38:07.249102 13573 net.cpp:572] res3a_branch2b <- res3a_branch2a I1106 16:38:07.249107 13573 net.cpp:542] res3a_branch2b -> res3a_branch2b I1106 16:38:07.249573 13573 net.cpp:260] Setting up 
res3a_branch2b I1106 16:38:07.249584 13573 net.cpp:267] TEST Top shape for layer 20 'res3a_branch2b' 8 128 40 96 (3932160) I1106 16:38:07.249595 13573 layer_factory.hpp:172] Creating layer 'res3a_branch2b/bn' of type 'BatchNorm' I1106 16:38:07.249601 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.249611 13573 net.cpp:200] Created Layer res3a_branch2b/bn (21) I1106 16:38:07.249617 13573 net.cpp:572] res3a_branch2b/bn <- res3a_branch2b I1106 16:38:07.249624 13573 net.cpp:527] res3a_branch2b/bn -> res3a_branch2b (in-place) I1106 16:38:07.249838 13573 net.cpp:260] Setting up res3a_branch2b/bn I1106 16:38:07.249845 13573 net.cpp:267] TEST Top shape for layer 21 'res3a_branch2b/bn' 8 128 40 96 (3932160) I1106 16:38:07.249850 13573 layer_factory.hpp:172] Creating layer 'res3a_branch2b/relu' of type 'ReLU' I1106 16:38:07.249853 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.249857 13573 net.cpp:200] Created Layer res3a_branch2b/relu (22) I1106 16:38:07.249861 13573 net.cpp:572] res3a_branch2b/relu <- res3a_branch2b I1106 16:38:07.249863 13573 net.cpp:527] res3a_branch2b/relu -> res3a_branch2b (in-place) I1106 16:38:07.249869 13573 net.cpp:260] Setting up res3a_branch2b/relu I1106 16:38:07.249872 13573 net.cpp:267] TEST Top shape for layer 22 'res3a_branch2b/relu' 8 128 40 96 (3932160) I1106 16:38:07.249876 13573 layer_factory.hpp:172] Creating layer 'pool3' of type 'Pooling' I1106 16:38:07.249878 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.249887 13573 net.cpp:200] Created Layer pool3 (23) I1106 16:38:07.249891 13573 net.cpp:572] pool3 <- res3a_branch2b I1106 16:38:07.249892 13573 net.cpp:542] pool3 -> pool3 I1106 16:38:07.249930 13573 net.cpp:260] Setting up pool3 I1106 16:38:07.249935 13573 net.cpp:267] TEST Top shape for layer 23 'pool3' 8 128 20 48 (983040) I1106 16:38:07.249939 
13573 layer_factory.hpp:172] Creating layer 'res4a_branch2a' of type 'Convolution' I1106 16:38:07.249941 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.249956 13573 net.cpp:200] Created Layer res4a_branch2a (24) I1106 16:38:07.249964 13573 net.cpp:572] res4a_branch2a <- pool3 I1106 16:38:07.249972 13573 net.cpp:542] res4a_branch2a -> res4a_branch2a I1106 16:38:07.252323 13573 net.cpp:260] Setting up res4a_branch2a I1106 16:38:07.252351 13573 net.cpp:267] TEST Top shape for layer 24 'res4a_branch2a' 8 256 20 48 (1966080) I1106 16:38:07.252362 13573 layer_factory.hpp:172] Creating layer 'res4a_branch2a/bn' of type 'BatchNorm' I1106 16:38:07.252368 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.252377 13573 net.cpp:200] Created Layer res4a_branch2a/bn (25) I1106 16:38:07.252383 13573 net.cpp:572] res4a_branch2a/bn <- res4a_branch2a I1106 16:38:07.252394 13573 net.cpp:527] res4a_branch2a/bn -> res4a_branch2a (in-place) I1106 16:38:07.252635 13573 net.cpp:260] Setting up res4a_branch2a/bn I1106 16:38:07.252645 13573 net.cpp:267] TEST Top shape for layer 25 'res4a_branch2a/bn' 8 256 20 48 (1966080) I1106 16:38:07.252653 13573 layer_factory.hpp:172] Creating layer 'res4a_branch2a/relu' of type 'ReLU' I1106 16:38:07.252660 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.252666 13573 net.cpp:200] Created Layer res4a_branch2a/relu (26) I1106 16:38:07.252672 13573 net.cpp:572] res4a_branch2a/relu <- res4a_branch2a I1106 16:38:07.252678 13573 net.cpp:527] res4a_branch2a/relu -> res4a_branch2a (in-place) I1106 16:38:07.252686 13573 net.cpp:260] Setting up res4a_branch2a/relu I1106 16:38:07.252691 13573 net.cpp:267] TEST Top shape for layer 26 'res4a_branch2a/relu' 8 256 20 48 (1966080) I1106 16:38:07.252697 13573 layer_factory.hpp:172] Creating layer 'res4a_branch2b' of type 'Convolution' 
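The "Top shape" lines above trace the backbone's downsampling: 320x768 at the input, halved by conv1a (stride 2) and by each pooling stage down to 2x3 at pool8, which is why the four retained SSD heads (ctx_output1..4) sit on 20x48, 10x24, 5x12, and 3x6 maps. A short sketch reproducing those sizes, assuming each stage halves H and W with the ceil rounding Caffe's pooling uses:

```python
import math

def downsample_chain(h, w, num_stages):
    # Halve H and W per stage with ceil rounding (Caffe pooling default);
    # the 8 stages here are conv1a (stride 2), pool1-pool4, pool6-pool8.
    shapes = []
    for _ in range(num_stages):
        h, w = math.ceil(h / 2), math.ceil(w / 2)
        shapes.append((h, w))
    return shapes

shapes = downsample_chain(320, 768, 8)
# [(160, 384), (80, 192), (40, 96), (20, 48),
#  (10, 24), (5, 12), (3, 6), (2, 3)]
```

The ceil rounding matters only at the coarse end: 5 -> 3 -> 2 in height, matching the pool7 and pool8 shapes in the log.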
I1106 16:38:07.252703 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.252715 13573 net.cpp:200] Created Layer res4a_branch2b (27) I1106 16:38:07.252722 13573 net.cpp:572] res4a_branch2b <- res4a_branch2a I1106 16:38:07.252727 13573 net.cpp:542] res4a_branch2b -> res4a_branch2b I1106 16:38:07.253974 13573 net.cpp:260] Setting up res4a_branch2b I1106 16:38:07.253988 13573 net.cpp:267] TEST Top shape for layer 27 'res4a_branch2b' 8 256 20 48 (1966080) I1106 16:38:07.253993 13573 layer_factory.hpp:172] Creating layer 'res4a_branch2b/bn' of type 'BatchNorm' I1106 16:38:07.253998 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.254006 13573 net.cpp:200] Created Layer res4a_branch2b/bn (28) I1106 16:38:07.254009 13573 net.cpp:572] res4a_branch2b/bn <- res4a_branch2b I1106 16:38:07.254011 13573 net.cpp:527] res4a_branch2b/bn -> res4a_branch2b (in-place) I1106 16:38:07.255472 13573 net.cpp:260] Setting up res4a_branch2b/bn I1106 16:38:07.255496 13573 net.cpp:267] TEST Top shape for layer 28 'res4a_branch2b/bn' 8 256 20 48 (1966080) I1106 16:38:07.255509 13573 layer_factory.hpp:172] Creating layer 'res4a_branch2b/relu' of type 'ReLU' I1106 16:38:07.255515 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.255524 13573 net.cpp:200] Created Layer res4a_branch2b/relu (29) I1106 16:38:07.255527 13573 net.cpp:572] res4a_branch2b/relu <- res4a_branch2b I1106 16:38:07.255532 13573 net.cpp:527] res4a_branch2b/relu -> res4a_branch2b (in-place) I1106 16:38:07.255539 13573 net.cpp:260] Setting up res4a_branch2b/relu I1106 16:38:07.255543 13573 net.cpp:267] TEST Top shape for layer 29 'res4a_branch2b/relu' 8 256 20 48 (1966080) I1106 16:38:07.255545 13573 layer_factory.hpp:172] Creating layer 'res4a_branch2b_res4a_branch2b/relu_0_split' of type 'Split' I1106 16:38:07.255549 13573 layer_factory.hpp:184] 
Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.255558 13573 net.cpp:200] Created Layer res4a_branch2b_res4a_branch2b/relu_0_split (30) I1106 16:38:07.255570 13573 net.cpp:572] res4a_branch2b_res4a_branch2b/relu_0_split <- res4a_branch2b I1106 16:38:07.255578 13573 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_0 I1106 16:38:07.255586 13573 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_1 I1106 16:38:07.255625 13573 net.cpp:260] Setting up res4a_branch2b_res4a_branch2b/relu_0_split I1106 16:38:07.255630 13573 net.cpp:267] TEST Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:07.255633 13573 net.cpp:267] TEST Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:07.255635 13573 layer_factory.hpp:172] Creating layer 'pool4' of type 'Pooling' I1106 16:38:07.255646 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.255671 13573 net.cpp:200] Created Layer pool4 (31) I1106 16:38:07.255756 13573 net.cpp:572] pool4 <- res4a_branch2b_res4a_branch2b/relu_0_split_0 I1106 16:38:07.255774 13573 net.cpp:542] pool4 -> pool4 I1106 16:38:07.255815 13573 net.cpp:260] Setting up pool4 I1106 16:38:07.255820 13573 net.cpp:267] TEST Top shape for layer 31 'pool4' 8 256 10 24 (491520) I1106 16:38:07.255829 13573 layer_factory.hpp:172] Creating layer 'res5a_branch2a' of type 'Convolution' I1106 16:38:07.255836 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.255856 13573 net.cpp:200] Created Layer res5a_branch2a (32) I1106 16:38:07.255861 13573 net.cpp:572] res5a_branch2a <- pool4 I1106 16:38:07.255863 13573 net.cpp:542] res5a_branch2a -> res5a_branch2a I1106 16:38:07.267757 13573 net.cpp:260] Setting up res5a_branch2a I1106 
16:38:07.267879 13573 net.cpp:267] TEST Top shape for layer 32 'res5a_branch2a' 8 512 10 24 (983040) I1106 16:38:07.267900 13573 layer_factory.hpp:172] Creating layer 'res5a_branch2a/bn' of type 'BatchNorm' I1106 16:38:07.267913 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.267933 13573 net.cpp:200] Created Layer res5a_branch2a/bn (33) I1106 16:38:07.267940 13573 net.cpp:572] res5a_branch2a/bn <- res5a_branch2a I1106 16:38:07.267947 13573 net.cpp:527] res5a_branch2a/bn -> res5a_branch2a (in-place) I1106 16:38:07.268182 13573 net.cpp:260] Setting up res5a_branch2a/bn I1106 16:38:07.268188 13573 net.cpp:267] TEST Top shape for layer 33 'res5a_branch2a/bn' 8 512 10 24 (983040) I1106 16:38:07.268195 13573 layer_factory.hpp:172] Creating layer 'res5a_branch2a/relu' of type 'ReLU' I1106 16:38:07.268204 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.268213 13573 net.cpp:200] Created Layer res5a_branch2a/relu (34) I1106 16:38:07.268219 13573 net.cpp:572] res5a_branch2a/relu <- res5a_branch2a I1106 16:38:07.268224 13573 net.cpp:527] res5a_branch2a/relu -> res5a_branch2a (in-place) I1106 16:38:07.268234 13573 net.cpp:260] Setting up res5a_branch2a/relu I1106 16:38:07.268240 13573 net.cpp:267] TEST Top shape for layer 34 'res5a_branch2a/relu' 8 512 10 24 (983040) I1106 16:38:07.268245 13573 layer_factory.hpp:172] Creating layer 'res5a_branch2b' of type 'Convolution' I1106 16:38:07.268265 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.268280 13573 net.cpp:200] Created Layer res5a_branch2b (35) I1106 16:38:07.268287 13573 net.cpp:572] res5a_branch2b <- res5a_branch2a I1106 16:38:07.268293 13573 net.cpp:542] res5a_branch2b -> res5a_branch2b I1106 16:38:07.278038 13573 net.cpp:260] Setting up res5a_branch2b I1106 16:38:07.278081 13573 net.cpp:267] TEST Top shape for layer 35 'res5a_branch2b' 8 
512 10 24 (983040) I1106 16:38:07.278100 13573 layer_factory.hpp:172] Creating layer 'res5a_branch2b/bn' of type 'BatchNorm' I1106 16:38:07.278110 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.278122 13573 net.cpp:200] Created Layer res5a_branch2b/bn (36) I1106 16:38:07.278131 13573 net.cpp:572] res5a_branch2b/bn <- res5a_branch2b I1106 16:38:07.278138 13573 net.cpp:527] res5a_branch2b/bn -> res5a_branch2b (in-place) I1106 16:38:07.278352 13573 net.cpp:260] Setting up res5a_branch2b/bn I1106 16:38:07.278362 13573 net.cpp:267] TEST Top shape for layer 36 'res5a_branch2b/bn' 8 512 10 24 (983040) I1106 16:38:07.278373 13573 layer_factory.hpp:172] Creating layer 'res5a_branch2b/relu' of type 'ReLU' I1106 16:38:07.278379 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.278388 13573 net.cpp:200] Created Layer res5a_branch2b/relu (37) I1106 16:38:07.278394 13573 net.cpp:572] res5a_branch2b/relu <- res5a_branch2b I1106 16:38:07.278400 13573 net.cpp:527] res5a_branch2b/relu -> res5a_branch2b (in-place) I1106 16:38:07.278409 13573 net.cpp:260] Setting up res5a_branch2b/relu I1106 16:38:07.278421 13573 net.cpp:267] TEST Top shape for layer 37 'res5a_branch2b/relu' 8 512 10 24 (983040) I1106 16:38:07.278439 13573 layer_factory.hpp:172] Creating layer 'res5a_branch2b_res5a_branch2b/relu_0_split' of type 'Split' I1106 16:38:07.278445 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.278453 13573 net.cpp:200] Created Layer res5a_branch2b_res5a_branch2b/relu_0_split (38) I1106 16:38:07.278460 13573 net.cpp:572] res5a_branch2b_res5a_branch2b/relu_0_split <- res5a_branch2b I1106 16:38:07.278466 13573 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> res5a_branch2b_res5a_branch2b/relu_0_split_0 I1106 16:38:07.278475 13573 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> 
res5a_branch2b_res5a_branch2b/relu_0_split_1 I1106 16:38:07.278503 13573 net.cpp:260] Setting up res5a_branch2b_res5a_branch2b/relu_0_split I1106 16:38:07.278513 13573 net.cpp:267] TEST Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 8 512 10 24 (983040) I1106 16:38:07.278519 13573 net.cpp:267] TEST Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 8 512 10 24 (983040) I1106 16:38:07.278525 13573 layer_factory.hpp:172] Creating layer 'pool6' of type 'Pooling' I1106 16:38:07.278532 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.278540 13573 net.cpp:200] Created Layer pool6 (39) I1106 16:38:07.278546 13573 net.cpp:572] pool6 <- res5a_branch2b_res5a_branch2b/relu_0_split_0 I1106 16:38:07.278553 13573 net.cpp:542] pool6 -> pool6 I1106 16:38:07.278586 13573 net.cpp:260] Setting up pool6 I1106 16:38:07.278596 13573 net.cpp:267] TEST Top shape for layer 39 'pool6' 8 512 5 12 (245760) I1106 16:38:07.278604 13573 layer_factory.hpp:172] Creating layer 'pool6_pool6_0_split' of type 'Split' I1106 16:38:07.278609 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.278616 13573 net.cpp:200] Created Layer pool6_pool6_0_split (40) I1106 16:38:07.278623 13573 net.cpp:572] pool6_pool6_0_split <- pool6 I1106 16:38:07.278630 13573 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_0 I1106 16:38:07.278636 13573 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_1 I1106 16:38:07.278661 13573 net.cpp:260] Setting up pool6_pool6_0_split I1106 16:38:07.278671 13573 net.cpp:267] TEST Top shape for layer 40 'pool6_pool6_0_split' 8 512 5 12 (245760) I1106 16:38:07.278677 13573 net.cpp:267] TEST Top shape for layer 40 'pool6_pool6_0_split' 8 512 5 12 (245760) I1106 16:38:07.278682 13573 layer_factory.hpp:172] Creating layer 'pool7' of type 'Pooling' I1106 16:38:07.278689 13573 layer_factory.hpp:184] Layer's types are 
Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.278697 13573 net.cpp:200] Created Layer pool7 (41) I1106 16:38:07.278702 13573 net.cpp:572] pool7 <- pool6_pool6_0_split_0 I1106 16:38:07.278708 13573 net.cpp:542] pool7 -> pool7 I1106 16:38:07.278738 13573 net.cpp:260] Setting up pool7 I1106 16:38:07.278748 13573 net.cpp:267] TEST Top shape for layer 41 'pool7' 8 512 3 6 (73728) I1106 16:38:07.278754 13573 layer_factory.hpp:172] Creating layer 'pool7_pool7_0_split' of type 'Split' I1106 16:38:07.278761 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.278769 13573 net.cpp:200] Created Layer pool7_pool7_0_split (42) I1106 16:38:07.278775 13573 net.cpp:572] pool7_pool7_0_split <- pool7 I1106 16:38:07.278781 13573 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_0 I1106 16:38:07.278787 13573 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_1 I1106 16:38:07.278810 13573 net.cpp:260] Setting up pool7_pool7_0_split I1106 16:38:07.278820 13573 net.cpp:267] TEST Top shape for layer 42 'pool7_pool7_0_split' 8 512 3 6 (73728) I1106 16:38:07.278826 13573 net.cpp:267] TEST Top shape for layer 42 'pool7_pool7_0_split' 8 512 3 6 (73728) I1106 16:38:07.278833 13573 layer_factory.hpp:172] Creating layer 'pool8' of type 'Pooling' I1106 16:38:07.278841 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.278854 13573 net.cpp:200] Created Layer pool8 (43) I1106 16:38:07.278861 13573 net.cpp:572] pool8 <- pool7_pool7_0_split_0 I1106 16:38:07.278867 13573 net.cpp:542] pool8 -> pool8 I1106 16:38:07.278901 13573 net.cpp:260] Setting up pool8 I1106 16:38:07.278910 13573 net.cpp:267] TEST Top shape for layer 43 'pool8' 8 512 2 3 (24576) I1106 16:38:07.278918 13573 layer_factory.hpp:172] Creating layer 'ctx_output1' of type 'Convolution' I1106 16:38:07.278923 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT 
Bmath:FLOAT I1106 16:38:07.278937 13573 net.cpp:200] Created Layer ctx_output1 (44) I1106 16:38:07.278945 13573 net.cpp:572] ctx_output1 <- res4a_branch2b_res4a_branch2b/relu_0_split_1 I1106 16:38:07.278952 13573 net.cpp:542] ctx_output1 -> ctx_output1 I1106 16:38:07.279554 13573 net.cpp:260] Setting up ctx_output1 I1106 16:38:07.279565 13573 net.cpp:267] TEST Top shape for layer 44 'ctx_output1' 8 256 20 48 (1966080) I1106 16:38:07.279573 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu' of type 'ReLU' I1106 16:38:07.279579 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.279587 13573 net.cpp:200] Created Layer ctx_output1/relu (45) I1106 16:38:07.279593 13573 net.cpp:572] ctx_output1/relu <- ctx_output1 I1106 16:38:07.279599 13573 net.cpp:527] ctx_output1/relu -> ctx_output1 (in-place) I1106 16:38:07.279606 13573 net.cpp:260] Setting up ctx_output1/relu I1106 16:38:07.279613 13573 net.cpp:267] TEST Top shape for layer 45 'ctx_output1/relu' 8 256 20 48 (1966080) I1106 16:38:07.279618 13573 layer_factory.hpp:172] Creating layer 'ctx_output1_ctx_output1/relu_0_split' of type 'Split' I1106 16:38:07.279623 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.279630 13573 net.cpp:200] Created Layer ctx_output1_ctx_output1/relu_0_split (46) I1106 16:38:07.279635 13573 net.cpp:572] ctx_output1_ctx_output1/relu_0_split <- ctx_output1 I1106 16:38:07.279641 13573 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_0 I1106 16:38:07.279647 13573 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_1 I1106 16:38:07.279655 13573 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_2 I1106 16:38:07.279695 13573 net.cpp:260] Setting up ctx_output1_ctx_output1/relu_0_split I1106 16:38:07.279705 13573 net.cpp:267] TEST Top shape for layer 46 
'ctx_output1_ctx_output1/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:07.279711 13573 net.cpp:267] TEST Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:07.279717 13573 net.cpp:267] TEST Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 8 256 20 48 (1966080) I1106 16:38:07.279722 13573 layer_factory.hpp:172] Creating layer 'ctx_output2' of type 'Convolution' I1106 16:38:07.279727 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.279740 13573 net.cpp:200] Created Layer ctx_output2 (47) I1106 16:38:07.279747 13573 net.cpp:572] ctx_output2 <- res5a_branch2b_res5a_branch2b/relu_0_split_1 I1106 16:38:07.279753 13573 net.cpp:542] ctx_output2 -> ctx_output2 I1106 16:38:07.280804 13573 net.cpp:260] Setting up ctx_output2 I1106 16:38:07.280817 13573 net.cpp:267] TEST Top shape for layer 47 'ctx_output2' 8 256 10 24 (491520) I1106 16:38:07.280825 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu' of type 'ReLU' I1106 16:38:07.280833 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.280840 13573 net.cpp:200] Created Layer ctx_output2/relu (48) I1106 16:38:07.280848 13573 net.cpp:572] ctx_output2/relu <- ctx_output2 I1106 16:38:07.280854 13573 net.cpp:527] ctx_output2/relu -> ctx_output2 (in-place) I1106 16:38:07.280866 13573 net.cpp:260] Setting up ctx_output2/relu I1106 16:38:07.280874 13573 net.cpp:267] TEST Top shape for layer 48 'ctx_output2/relu' 8 256 10 24 (491520) I1106 16:38:07.280887 13573 layer_factory.hpp:172] Creating layer 'ctx_output2_ctx_output2/relu_0_split' of type 'Split' I1106 16:38:07.280894 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:07.280902 13573 net.cpp:200] Created Layer ctx_output2_ctx_output2/relu_0_split (49) I1106 16:38:07.280910 13573 net.cpp:572] ctx_output2_ctx_output2/relu_0_split <- 
ctx_output2
I1106 16:38:07.280915 13573 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_0
I1106 16:38:07.280921 13573 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_1
I1106 16:38:07.280928 13573 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_2
I1106 16:38:07.280966 13573 net.cpp:260] Setting up ctx_output2_ctx_output2/relu_0_split
I1106 16:38:07.280977 13573 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 8 256 10 24 (491520)
I1106 16:38:07.280983 13573 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 8 256 10 24 (491520)
I1106 16:38:07.280990 13573 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 8 256 10 24 (491520)
I1106 16:38:07.280997 13573 layer_factory.hpp:172] Creating layer 'ctx_output3' of type 'Convolution'
I1106 16:38:07.281003 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.281018 13573 net.cpp:200] Created Layer ctx_output3 (50)
I1106 16:38:07.281025 13573 net.cpp:572] ctx_output3 <- pool6_pool6_0_split_1
I1106 16:38:07.281033 13573 net.cpp:542] ctx_output3 -> ctx_output3
I1106 16:38:07.284044 13573 net.cpp:260] Setting up ctx_output3
I1106 16:38:07.284080 13573 net.cpp:267] TEST Top shape for layer 50 'ctx_output3' 8 256 5 12 (122880)
I1106 16:38:07.284090 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu' of type 'ReLU'
I1106 16:38:07.284097 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.284108 13573 net.cpp:200] Created Layer ctx_output3/relu (51)
I1106 16:38:07.284116 13573 net.cpp:572] ctx_output3/relu <- ctx_output3
I1106 16:38:07.284123 13573 net.cpp:527] ctx_output3/relu -> ctx_output3 (in-place)
I1106 16:38:07.284132 13573 net.cpp:260] Setting up ctx_output3/relu
I1106 16:38:07.284138 13573 net.cpp:267] TEST Top shape for layer 51 'ctx_output3/relu' 8 256 5 12 (122880)
I1106 16:38:07.284143 13573 layer_factory.hpp:172] Creating layer 'ctx_output3_ctx_output3/relu_0_split' of type 'Split'
I1106 16:38:07.284150 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.284158 13573 net.cpp:200] Created Layer ctx_output3_ctx_output3/relu_0_split (52)
I1106 16:38:07.284164 13573 net.cpp:572] ctx_output3_ctx_output3/relu_0_split <- ctx_output3
I1106 16:38:07.284170 13573 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_0
I1106 16:38:07.284178 13573 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_1
I1106 16:38:07.284188 13573 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_2
I1106 16:38:07.284227 13573 net.cpp:260] Setting up ctx_output3_ctx_output3/relu_0_split
I1106 16:38:07.284236 13573 net.cpp:267] TEST Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 8 256 5 12 (122880)
I1106 16:38:07.284243 13573 net.cpp:267] TEST Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 8 256 5 12 (122880)
I1106 16:38:07.284250 13573 net.cpp:267] TEST Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 8 256 5 12 (122880)
I1106 16:38:07.284255 13573 layer_factory.hpp:172] Creating layer 'ctx_output4' of type 'Convolution'
I1106 16:38:07.284261 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.284281 13573 net.cpp:200] Created Layer ctx_output4 (53)
I1106 16:38:07.284297 13573 net.cpp:572] ctx_output4 <- pool7_pool7_0_split_1
I1106 16:38:07.284305 13573 net.cpp:542] ctx_output4 -> ctx_output4
I1106 16:38:07.285369 13573 net.cpp:260] Setting up ctx_output4
I1106 16:38:07.285375 13573 net.cpp:267] TEST Top shape for layer 53 'ctx_output4' 8 256 3 6 (36864)
I1106 16:38:07.285380 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu' of type 'ReLU'
I1106 16:38:07.285382 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.285408 13573 net.cpp:200] Created Layer ctx_output4/relu (54)
I1106 16:38:07.285423 13573 net.cpp:572] ctx_output4/relu <- ctx_output4
I1106 16:38:07.285437 13573 net.cpp:527] ctx_output4/relu -> ctx_output4 (in-place)
I1106 16:38:07.285454 13573 net.cpp:260] Setting up ctx_output4/relu
I1106 16:38:07.285470 13573 net.cpp:267] TEST Top shape for layer 54 'ctx_output4/relu' 8 256 3 6 (36864)
I1106 16:38:07.285482 13573 layer_factory.hpp:172] Creating layer 'ctx_output4_ctx_output4/relu_0_split' of type 'Split'
I1106 16:38:07.285495 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.285507 13573 net.cpp:200] Created Layer ctx_output4_ctx_output4/relu_0_split (55)
I1106 16:38:07.285519 13573 net.cpp:572] ctx_output4_ctx_output4/relu_0_split <- ctx_output4
I1106 16:38:07.285531 13573 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_0
I1106 16:38:07.285545 13573 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_1
I1106 16:38:07.285559 13573 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_2
I1106 16:38:07.285609 13573 net.cpp:260] Setting up ctx_output4_ctx_output4/relu_0_split
I1106 16:38:07.285614 13573 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 8 256 3 6 (36864)
I1106 16:38:07.285617 13573 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 8 256 3 6 (36864)
I1106 16:38:07.285619 13573 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 8 256 3 6 (36864)
I1106 16:38:07.285622 13573 layer_factory.hpp:172] Creating layer 'ctx_output5' of type 'Convolution'
I1106 16:38:07.285624 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.285634 13573 net.cpp:200] Created Layer ctx_output5 (56)
I1106 16:38:07.285636 13573 net.cpp:572] ctx_output5 <- pool8
I1106 16:38:07.285640 13573 net.cpp:542] ctx_output5 -> ctx_output5
I1106 16:38:07.287210 13573 net.cpp:260] Setting up ctx_output5
I1106 16:38:07.287221 13573 net.cpp:267] TEST Top shape for layer 56 'ctx_output5' 8 256 2 3 (12288)
I1106 16:38:07.287226 13573 layer_factory.hpp:172] Creating layer 'ctx_output5/relu' of type 'ReLU'
I1106 16:38:07.287228 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.287233 13573 net.cpp:200] Created Layer ctx_output5/relu (57)
I1106 16:38:07.287235 13573 net.cpp:572] ctx_output5/relu <- ctx_output5
I1106 16:38:07.287240 13573 net.cpp:527] ctx_output5/relu -> ctx_output5 (in-place)
I1106 16:38:07.287243 13573 net.cpp:260] Setting up ctx_output5/relu
I1106 16:38:07.287245 13573 net.cpp:267] TEST Top shape for layer 57 'ctx_output5/relu' 8 256 2 3 (12288)
I1106 16:38:07.287247 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc' of type 'Convolution'
I1106 16:38:07.287251 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.287261 13573 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc (58)
I1106 16:38:07.287264 13573 net.cpp:572] ctx_output1/relu_mbox_loc <- ctx_output1_ctx_output1/relu_0_split_0
I1106 16:38:07.287268 13573 net.cpp:542] ctx_output1/relu_mbox_loc -> ctx_output1/relu_mbox_loc
I1106 16:38:07.287458 13573 net.cpp:260] Setting up ctx_output1/relu_mbox_loc
I1106 16:38:07.287464 13573 net.cpp:267] TEST Top shape for layer 58 'ctx_output1/relu_mbox_loc' 8 16 20 48 (122880)
I1106 16:38:07.287477 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_perm' of type 'Permute'
I1106 16:38:07.287480 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
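The grid sizes logged for the five context outputs (20x48, 10x24, 5x12, 3x6, 2x3) follow from the 320x768 crop configured above and the effective stride of each pyramid level. A minimal sketch of that bookkeeping (the strides 16..256 are inferred from the logged shapes, not stated explicitly in the log; Caffe's actual pooling arithmetic may differ in edge handling):

```python
import math

# Input crop from the training config: 320 (H) x 768 (W).
CROP_H, CROP_W = 320, 768

def pyramid_sizes(strides):
    """Ceil-divide the crop by each effective stride to get per-level grid sizes."""
    return [(math.ceil(CROP_H / s), math.ceil(CROP_W / s)) for s in strides]

# Hypothetical effective strides for ctx_output1..5, inferred from the log.
sizes = pyramid_sizes([16, 32, 64, 128, 256])
print(sizes)  # [(20, 48), (10, 24), (5, 12), (3, 6), (2, 3)]
```

The result matches the TEST Top shapes reported for ctx_output1 through ctx_output5.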
I1106 16:38:07.287488 13573 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_perm (59)
I1106 16:38:07.287497 13573 net.cpp:572] ctx_output1/relu_mbox_loc_perm <- ctx_output1/relu_mbox_loc
I1106 16:38:07.287501 13573 net.cpp:542] ctx_output1/relu_mbox_loc_perm -> ctx_output1/relu_mbox_loc_perm
I1106 16:38:07.287560 13573 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_perm
I1106 16:38:07.287565 13573 net.cpp:267] TEST Top shape for layer 59 'ctx_output1/relu_mbox_loc_perm' 8 20 48 16 (122880)
I1106 16:38:07.287569 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_flat' of type 'Flatten'
I1106 16:38:07.287571 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.287580 13573 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_flat (60)
I1106 16:38:07.287585 13573 net.cpp:572] ctx_output1/relu_mbox_loc_flat <- ctx_output1/relu_mbox_loc_perm
I1106 16:38:07.287587 13573 net.cpp:542] ctx_output1/relu_mbox_loc_flat -> ctx_output1/relu_mbox_loc_flat
I1106 16:38:07.287654 13573 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_flat
I1106 16:38:07.287659 13573 net.cpp:267] TEST Top shape for layer 60 'ctx_output1/relu_mbox_loc_flat' 8 15360 (122880)
I1106 16:38:07.287662 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf' of type 'Convolution'
I1106 16:38:07.287665 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.287674 13573 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf (61)
I1106 16:38:07.287676 13573 net.cpp:572] ctx_output1/relu_mbox_conf <- ctx_output1_ctx_output1/relu_0_split_1
I1106 16:38:07.287689 13573 net.cpp:542] ctx_output1/relu_mbox_conf -> ctx_output1/relu_mbox_conf
I1106 16:38:07.287848 13573 net.cpp:260] Setting up ctx_output1/relu_mbox_conf
I1106 16:38:07.287854 13573 net.cpp:267] TEST Top shape for layer 61 'ctx_output1/relu_mbox_conf' 8 8 20 48 (61440)
I1106 16:38:07.287859 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_perm' of type 'Permute'
I1106 16:38:07.287863 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.287868 13573 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_perm (62)
I1106 16:38:07.287870 13573 net.cpp:572] ctx_output1/relu_mbox_conf_perm <- ctx_output1/relu_mbox_conf
I1106 16:38:07.287873 13573 net.cpp:542] ctx_output1/relu_mbox_conf_perm -> ctx_output1/relu_mbox_conf_perm
I1106 16:38:07.287930 13573 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_perm
I1106 16:38:07.287933 13573 net.cpp:267] TEST Top shape for layer 62 'ctx_output1/relu_mbox_conf_perm' 8 20 48 8 (61440)
I1106 16:38:07.287935 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_flat' of type 'Flatten'
I1106 16:38:07.287938 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.287942 13573 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_flat (63)
I1106 16:38:07.287945 13573 net.cpp:572] ctx_output1/relu_mbox_conf_flat <- ctx_output1/relu_mbox_conf_perm
I1106 16:38:07.287947 13573 net.cpp:542] ctx_output1/relu_mbox_conf_flat -> ctx_output1/relu_mbox_conf_flat
I1106 16:38:07.287988 13573 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_flat
I1106 16:38:07.287993 13573 net.cpp:267] TEST Top shape for layer 63 'ctx_output1/relu_mbox_conf_flat' 8 7680 (61440)
I1106 16:38:07.287995 13573 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_priorbox' of type 'PriorBox'
I1106 16:38:07.287998 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.288007 13573 net.cpp:200] Created Layer ctx_output1/relu_mbox_priorbox (64)
I1106 16:38:07.288010 13573 net.cpp:572] ctx_output1/relu_mbox_priorbox <- ctx_output1_ctx_output1/relu_0_split_2
I1106 16:38:07.288020 13573 net.cpp:572] ctx_output1/relu_mbox_priorbox <- data_data_0_split_1
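Each detection head follows the same pattern logged here for ctx_output1: a convolution predicts 4*priors (loc) or num_classes*priors (conf) channels, a Permute moves the channels last, and a Flatten collapses everything after the batch axis. The logged shapes can be reproduced with plain arithmetic; a sketch (the per-head prior counts 4/6/6/4 are inferred from the logged loc channel counts divided by 4, not stated explicitly in the log):

```python
def head_shapes(n, c, h, w):
    """Permute (0,2,3,1) then Flatten from axis 1, as in the *_perm/*_flat layers."""
    return (n, h, w, c), (n, h * w * c)

# ctx_output1/relu_mbox_loc: batch 8, 16 channels = 4 priors * 4 coords, 20x48 grid.
perm, flat = head_shapes(8, 16, 20, 48)

# Per-head (grid_h, grid_w, priors) for ctx_output1..4 after head chopping.
heads = [(20, 48, 4), (10, 24, 6), (5, 12, 6), (3, 6, 4)]
num_classes = 2

boxes = sum(h * w * p for h, w, p in heads)  # total prior boxes per image
loc_len = boxes * 4                          # mbox_loc length per image
conf_len = boxes * num_classes               # mbox_conf length per image
print(perm, flat, boxes, loc_len, conf_len)
```

These totals agree with the shapes logged further down: 5712 boxes (mbox_conf_reshape 8 5712 2), mbox_loc 8 22848, and mbox_conf 8 11424.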
I1106 16:38:07.288031 13573 net.cpp:542] ctx_output1/relu_mbox_priorbox -> ctx_output1/relu_mbox_priorbox
I1106 16:38:07.288049 13573 net.cpp:260] Setting up ctx_output1/relu_mbox_priorbox
I1106 16:38:07.288053 13573 net.cpp:267] TEST Top shape for layer 64 'ctx_output1/relu_mbox_priorbox' 1 2 15360 (30720)
I1106 16:38:07.288061 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc' of type 'Convolution'
I1106 16:38:07.288066 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.288079 13573 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc (65)
I1106 16:38:07.288085 13573 net.cpp:572] ctx_output2/relu_mbox_loc <- ctx_output2_ctx_output2/relu_0_split_0
I1106 16:38:07.288092 13573 net.cpp:542] ctx_output2/relu_mbox_loc -> ctx_output2/relu_mbox_loc
I1106 16:38:07.288285 13573 net.cpp:260] Setting up ctx_output2/relu_mbox_loc
I1106 16:38:07.288290 13573 net.cpp:267] TEST Top shape for layer 65 'ctx_output2/relu_mbox_loc' 8 24 10 24 (46080)
I1106 16:38:07.288295 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_perm' of type 'Permute'
I1106 16:38:07.288297 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.288303 13573 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_perm (66)
I1106 16:38:07.288305 13573 net.cpp:572] ctx_output2/relu_mbox_loc_perm <- ctx_output2/relu_mbox_loc
I1106 16:38:07.288308 13573 net.cpp:542] ctx_output2/relu_mbox_loc_perm -> ctx_output2/relu_mbox_loc_perm
I1106 16:38:07.288365 13573 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_perm
I1106 16:38:07.288370 13573 net.cpp:267] TEST Top shape for layer 66 'ctx_output2/relu_mbox_loc_perm' 8 10 24 24 (46080)
I1106 16:38:07.288372 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_flat' of type 'Flatten'
I1106 16:38:07.288381 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.288388 13573 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_flat (67)
I1106 16:38:07.288394 13573 net.cpp:572] ctx_output2/relu_mbox_loc_flat <- ctx_output2/relu_mbox_loc_perm
I1106 16:38:07.288401 13573 net.cpp:542] ctx_output2/relu_mbox_loc_flat -> ctx_output2/relu_mbox_loc_flat
I1106 16:38:07.288444 13573 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_flat
I1106 16:38:07.288448 13573 net.cpp:267] TEST Top shape for layer 67 'ctx_output2/relu_mbox_loc_flat' 8 5760 (46080)
I1106 16:38:07.288451 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf' of type 'Convolution'
I1106 16:38:07.288453 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.288462 13573 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf (68)
I1106 16:38:07.288471 13573 net.cpp:572] ctx_output2/relu_mbox_conf <- ctx_output2_ctx_output2/relu_0_split_1
I1106 16:38:07.288478 13573 net.cpp:542] ctx_output2/relu_mbox_conf -> ctx_output2/relu_mbox_conf
I1106 16:38:07.288661 13573 net.cpp:260] Setting up ctx_output2/relu_mbox_conf
I1106 16:38:07.288667 13573 net.cpp:267] TEST Top shape for layer 68 'ctx_output2/relu_mbox_conf' 8 12 10 24 (23040)
I1106 16:38:07.288679 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_perm' of type 'Permute'
I1106 16:38:07.288682 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.288688 13573 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_perm (69)
I1106 16:38:07.288695 13573 net.cpp:572] ctx_output2/relu_mbox_conf_perm <- ctx_output2/relu_mbox_conf
I1106 16:38:07.288702 13573 net.cpp:542] ctx_output2/relu_mbox_conf_perm -> ctx_output2/relu_mbox_conf_perm
I1106 16:38:07.288758 13573 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_perm
I1106 16:38:07.288761 13573 net.cpp:267] TEST Top shape for layer 69 'ctx_output2/relu_mbox_conf_perm' 8 10 24 12 (23040)
I1106 16:38:07.288770 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_flat' of type 'Flatten'
I1106 16:38:07.288779 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.288785 13573 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_flat (70)
I1106 16:38:07.288791 13573 net.cpp:572] ctx_output2/relu_mbox_conf_flat <- ctx_output2/relu_mbox_conf_perm
I1106 16:38:07.288799 13573 net.cpp:542] ctx_output2/relu_mbox_conf_flat -> ctx_output2/relu_mbox_conf_flat
I1106 16:38:07.288837 13573 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_flat
I1106 16:38:07.288842 13573 net.cpp:267] TEST Top shape for layer 70 'ctx_output2/relu_mbox_conf_flat' 8 2880 (23040)
I1106 16:38:07.288846 13573 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_priorbox' of type 'PriorBox'
I1106 16:38:07.288847 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.288852 13573 net.cpp:200] Created Layer ctx_output2/relu_mbox_priorbox (71)
I1106 16:38:07.288856 13573 net.cpp:572] ctx_output2/relu_mbox_priorbox <- ctx_output2_ctx_output2/relu_0_split_2
I1106 16:38:07.288858 13573 net.cpp:572] ctx_output2/relu_mbox_priorbox <- data_data_0_split_2
I1106 16:38:07.288866 13573 net.cpp:542] ctx_output2/relu_mbox_priorbox -> ctx_output2/relu_mbox_priorbox
I1106 16:38:07.288882 13573 net.cpp:260] Setting up ctx_output2/relu_mbox_priorbox
I1106 16:38:07.288887 13573 net.cpp:267] TEST Top shape for layer 71 'ctx_output2/relu_mbox_priorbox' 1 2 5760 (11520)
I1106 16:38:07.288890 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc' of type 'Convolution'
I1106 16:38:07.288892 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.288902 13573 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc (72)
I1106 16:38:07.288910 13573 net.cpp:572] ctx_output3/relu_mbox_loc <- ctx_output3_ctx_output3/relu_0_split_0
I1106 16:38:07.288916 13573 net.cpp:542] ctx_output3/relu_mbox_loc -> ctx_output3/relu_mbox_loc
I1106 16:38:07.289104 13573 net.cpp:260] Setting up ctx_output3/relu_mbox_loc
I1106 16:38:07.289110 13573 net.cpp:267] TEST Top shape for layer 72 'ctx_output3/relu_mbox_loc' 8 24 5 12 (11520)
I1106 16:38:07.289115 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_perm' of type 'Permute'
I1106 16:38:07.289124 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.289131 13573 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_perm (73)
I1106 16:38:07.289135 13573 net.cpp:572] ctx_output3/relu_mbox_loc_perm <- ctx_output3/relu_mbox_loc
I1106 16:38:07.289139 13573 net.cpp:542] ctx_output3/relu_mbox_loc_perm -> ctx_output3/relu_mbox_loc_perm
I1106 16:38:07.289201 13573 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_perm
I1106 16:38:07.289206 13573 net.cpp:267] TEST Top shape for layer 73 'ctx_output3/relu_mbox_loc_perm' 8 5 12 24 (11520)
I1106 16:38:07.289209 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_flat' of type 'Flatten'
I1106 16:38:07.289211 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.289214 13573 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_flat (74)
I1106 16:38:07.289222 13573 net.cpp:572] ctx_output3/relu_mbox_loc_flat <- ctx_output3/relu_mbox_loc_perm
I1106 16:38:07.289228 13573 net.cpp:542] ctx_output3/relu_mbox_loc_flat -> ctx_output3/relu_mbox_loc_flat
I1106 16:38:07.289268 13573 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_flat
I1106 16:38:07.289273 13573 net.cpp:267] TEST Top shape for layer 74 'ctx_output3/relu_mbox_loc_flat' 8 1440 (11520)
I1106 16:38:07.289275 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf' of type 'Convolution'
I1106 16:38:07.289278 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.289294 13573 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf (75)
I1106 16:38:07.289302 13573 net.cpp:572] ctx_output3/relu_mbox_conf <- ctx_output3_ctx_output3/relu_0_split_1
I1106 16:38:07.289305 13573 net.cpp:542] ctx_output3/relu_mbox_conf -> ctx_output3/relu_mbox_conf
I1106 16:38:07.289464 13573 net.cpp:260] Setting up ctx_output3/relu_mbox_conf
I1106 16:38:07.289470 13573 net.cpp:267] TEST Top shape for layer 75 'ctx_output3/relu_mbox_conf' 8 12 5 12 (5760)
I1106 16:38:07.289481 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_perm' of type 'Permute'
I1106 16:38:07.289485 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.289490 13573 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_perm (76)
I1106 16:38:07.289494 13573 net.cpp:572] ctx_output3/relu_mbox_conf_perm <- ctx_output3/relu_mbox_conf
I1106 16:38:07.289497 13573 net.cpp:542] ctx_output3/relu_mbox_conf_perm -> ctx_output3/relu_mbox_conf_perm
I1106 16:38:07.289558 13573 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_perm
I1106 16:38:07.289562 13573 net.cpp:267] TEST Top shape for layer 76 'ctx_output3/relu_mbox_conf_perm' 8 5 12 12 (5760)
I1106 16:38:07.289566 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_flat' of type 'Flatten'
I1106 16:38:07.289568 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.289572 13573 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_flat (77)
I1106 16:38:07.289578 13573 net.cpp:572] ctx_output3/relu_mbox_conf_flat <- ctx_output3/relu_mbox_conf_perm
I1106 16:38:07.289585 13573 net.cpp:542] ctx_output3/relu_mbox_conf_flat -> ctx_output3/relu_mbox_conf_flat
I1106 16:38:07.289621 13573 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_flat
I1106 16:38:07.289626 13573 net.cpp:267] TEST Top shape for layer 77 'ctx_output3/relu_mbox_conf_flat' 8 720 (5760)
I1106 16:38:07.289629 13573 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_priorbox' of type 'PriorBox'
I1106 16:38:07.289633 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.289641 13573 net.cpp:200] Created Layer ctx_output3/relu_mbox_priorbox (78)
I1106 16:38:07.289645 13573 net.cpp:572] ctx_output3/relu_mbox_priorbox <- ctx_output3_ctx_output3/relu_0_split_2
I1106 16:38:07.289649 13573 net.cpp:572] ctx_output3/relu_mbox_priorbox <- data_data_0_split_3
I1106 16:38:07.289652 13573 net.cpp:542] ctx_output3/relu_mbox_priorbox -> ctx_output3/relu_mbox_priorbox
I1106 16:38:07.289667 13573 net.cpp:260] Setting up ctx_output3/relu_mbox_priorbox
I1106 16:38:07.289671 13573 net.cpp:267] TEST Top shape for layer 78 'ctx_output3/relu_mbox_priorbox' 1 2 1440 (2880)
I1106 16:38:07.289674 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc' of type 'Convolution'
I1106 16:38:07.289676 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.289683 13573 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc (79)
I1106 16:38:07.289691 13573 net.cpp:572] ctx_output4/relu_mbox_loc <- ctx_output4_ctx_output4/relu_0_split_0
I1106 16:38:07.289698 13573 net.cpp:542] ctx_output4/relu_mbox_loc -> ctx_output4/relu_mbox_loc
I1106 16:38:07.289872 13573 net.cpp:260] Setting up ctx_output4/relu_mbox_loc
I1106 16:38:07.289878 13573 net.cpp:267] TEST Top shape for layer 79 'ctx_output4/relu_mbox_loc' 8 16 3 6 (2304)
I1106 16:38:07.289883 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_perm' of type 'Permute'
I1106 16:38:07.289891 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.289901 13573 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_perm (80)
I1106 16:38:07.289904 13573 net.cpp:572] ctx_output4/relu_mbox_loc_perm <- ctx_output4/relu_mbox_loc
I1106 16:38:07.289906 13573 net.cpp:542] ctx_output4/relu_mbox_loc_perm -> ctx_output4/relu_mbox_loc_perm
I1106 16:38:07.289968 13573 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_perm
I1106 16:38:07.289978 13573 net.cpp:267] TEST Top shape for layer 80 'ctx_output4/relu_mbox_loc_perm' 8 3 6 16 (2304)
I1106 16:38:07.289980 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_flat' of type 'Flatten'
I1106 16:38:07.289983 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.289988 13573 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_flat (81)
I1106 16:38:07.289994 13573 net.cpp:572] ctx_output4/relu_mbox_loc_flat <- ctx_output4/relu_mbox_loc_perm
I1106 16:38:07.290000 13573 net.cpp:542] ctx_output4/relu_mbox_loc_flat -> ctx_output4/relu_mbox_loc_flat
I1106 16:38:07.290038 13573 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_flat
I1106 16:38:07.290047 13573 net.cpp:267] TEST Top shape for layer 81 'ctx_output4/relu_mbox_loc_flat' 8 288 (2304)
I1106 16:38:07.290053 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf' of type 'Convolution'
I1106 16:38:07.290060 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.290071 13573 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf (82)
I1106 16:38:07.290077 13573 net.cpp:572] ctx_output4/relu_mbox_conf <- ctx_output4_ctx_output4/relu_0_split_1
I1106 16:38:07.290083 13573 net.cpp:542] ctx_output4/relu_mbox_conf -> ctx_output4/relu_mbox_conf
I1106 16:38:07.290241 13573 net.cpp:260] Setting up ctx_output4/relu_mbox_conf
I1106 16:38:07.290251 13573 net.cpp:267] TEST Top shape for layer 82 'ctx_output4/relu_mbox_conf' 8 8 3 6 (1152)
I1106 16:38:07.290259 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_perm' of type 'Permute'
I1106 16:38:07.290266 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.290273 13573 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf_perm (83)
I1106 16:38:07.290279 13573 net.cpp:572] ctx_output4/relu_mbox_conf_perm <- ctx_output4/relu_mbox_conf
I1106 16:38:07.290285 13573 net.cpp:542] ctx_output4/relu_mbox_conf_perm -> ctx_output4/relu_mbox_conf_perm
I1106 16:38:07.290343 13573 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_perm
I1106 16:38:07.290351 13573 net.cpp:267] TEST Top shape for layer 83 'ctx_output4/relu_mbox_conf_perm' 8 3 6 8 (1152)
I1106 16:38:07.290357 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_flat' of type 'Flatten'
I1106 16:38:07.290362 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.290369 13573 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf_flat (84)
I1106 16:38:07.290375 13573 net.cpp:572] ctx_output4/relu_mbox_conf_flat <- ctx_output4/relu_mbox_conf_perm
I1106 16:38:07.290380 13573 net.cpp:542] ctx_output4/relu_mbox_conf_flat -> ctx_output4/relu_mbox_conf_flat
I1106 16:38:07.290417 13573 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_flat
I1106 16:38:07.290426 13573 net.cpp:267] TEST Top shape for layer 84 'ctx_output4/relu_mbox_conf_flat' 8 144 (1152)
I1106 16:38:07.290431 13573 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_priorbox' of type 'PriorBox'
I1106 16:38:07.290436 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.290444 13573 net.cpp:200] Created Layer ctx_output4/relu_mbox_priorbox (85)
I1106 16:38:07.290450 13573 net.cpp:572] ctx_output4/relu_mbox_priorbox <- ctx_output4_ctx_output4/relu_0_split_2
I1106 16:38:07.290457 13573 net.cpp:572] ctx_output4/relu_mbox_priorbox <- data_data_0_split_4
I1106 16:38:07.290462 13573 net.cpp:542] ctx_output4/relu_mbox_priorbox -> ctx_output4/relu_mbox_priorbox
I1106 16:38:07.290477 13573 net.cpp:260] Setting up ctx_output4/relu_mbox_priorbox
I1106 16:38:07.290485 13573 net.cpp:267] TEST Top shape for layer 85 'ctx_output4/relu_mbox_priorbox' 1 2 288 (576)
I1106 16:38:07.290490 13573 layer_factory.hpp:172] Creating layer 'mbox_loc' of type 'Concat'
I1106 16:38:07.290498 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.290510 13573 net.cpp:200] Created Layer mbox_loc (86)
I1106 16:38:07.290516 13573 net.cpp:572] mbox_loc <- ctx_output1/relu_mbox_loc_flat
I1106 16:38:07.290522 13573 net.cpp:572] mbox_loc <- ctx_output2/relu_mbox_loc_flat
I1106 16:38:07.290529 13573 net.cpp:572] mbox_loc <- ctx_output3/relu_mbox_loc_flat
I1106 16:38:07.290534 13573 net.cpp:572] mbox_loc <- ctx_output4/relu_mbox_loc_flat
I1106 16:38:07.290540 13573 net.cpp:542] mbox_loc -> mbox_loc
I1106 16:38:07.290558 13573 net.cpp:260] Setting up mbox_loc
I1106 16:38:07.290566 13573 net.cpp:267] TEST Top shape for layer 86 'mbox_loc' 8 22848 (182784)
I1106 16:38:07.290572 13573 layer_factory.hpp:172] Creating layer 'mbox_conf' of type 'Concat'
I1106 16:38:07.290577 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.290583 13573 net.cpp:200] Created Layer mbox_conf (87)
I1106 16:38:07.290590 13573 net.cpp:572] mbox_conf <- ctx_output1/relu_mbox_conf_flat
I1106 16:38:07.290594 13573 net.cpp:572] mbox_conf <- ctx_output2/relu_mbox_conf_flat
I1106 16:38:07.290601 13573 net.cpp:572] mbox_conf <- ctx_output3/relu_mbox_conf_flat
I1106 16:38:07.290607 13573 net.cpp:572] mbox_conf <- ctx_output4/relu_mbox_conf_flat
I1106 16:38:07.290612 13573 net.cpp:542] mbox_conf -> mbox_conf
I1106 16:38:07.290629 13573 net.cpp:260] Setting up mbox_conf
I1106 16:38:07.290637 13573 net.cpp:267] TEST Top shape for layer 87 'mbox_conf' 8 11424 (91392)
I1106 16:38:07.290643 13573 layer_factory.hpp:172] Creating layer 'mbox_priorbox' of type 'Concat'
I1106 16:38:07.290647 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.290653 13573 net.cpp:200] Created Layer mbox_priorbox (88)
I1106 16:38:07.290659 13573 net.cpp:572] mbox_priorbox <- ctx_output1/relu_mbox_priorbox
I1106 16:38:07.290665 13573 net.cpp:572] mbox_priorbox <- ctx_output2/relu_mbox_priorbox
I1106 16:38:07.290670 13573 net.cpp:572] mbox_priorbox <- ctx_output3/relu_mbox_priorbox
I1106 16:38:07.290676 13573 net.cpp:572] mbox_priorbox <- ctx_output4/relu_mbox_priorbox
I1106 16:38:07.290681 13573 net.cpp:542] mbox_priorbox -> mbox_priorbox
I1106 16:38:07.290699 13573 net.cpp:260] Setting up mbox_priorbox
I1106 16:38:07.290706 13573 net.cpp:267] TEST Top shape for layer 88 'mbox_priorbox' 1 2 22848 (45696)
I1106 16:38:07.290711 13573 layer_factory.hpp:172] Creating layer 'mbox_conf_reshape' of type 'Reshape'
I1106 16:38:07.290717 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.290727 13573 net.cpp:200] Created Layer mbox_conf_reshape (89)
I1106 16:38:07.290733 13573 net.cpp:572] mbox_conf_reshape <- mbox_conf
I1106 16:38:07.290738 13573 net.cpp:542] mbox_conf_reshape -> mbox_conf_reshape
I1106 16:38:07.290760 13573 net.cpp:260] Setting up mbox_conf_reshape
I1106 16:38:07.290767 13573 net.cpp:267] TEST Top shape for layer 89 'mbox_conf_reshape' 8 5712 2 (91392)
I1106 16:38:07.290773 13573 layer_factory.hpp:172] Creating layer 'mbox_conf_softmax' of type 'Softmax'
I1106 16:38:07.290778 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.290791 13573 net.cpp:200] Created Layer mbox_conf_softmax (90)
I1106 16:38:07.290797 13573 net.cpp:572] mbox_conf_softmax <- mbox_conf_reshape
I1106 16:38:07.290803 13573 net.cpp:542] mbox_conf_softmax -> mbox_conf_softmax
I1106 16:38:07.290841 13573 net.cpp:260] Setting up mbox_conf_softmax
I1106 16:38:07.290849 13573 net.cpp:267] TEST Top shape for layer 90 'mbox_conf_softmax' 8 5712 2 (91392)
I1106 16:38:07.290854 13573 layer_factory.hpp:172] Creating layer 'mbox_conf_flatten' of type 'Flatten'
I1106 16:38:07.290860 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.290866 13573 net.cpp:200] Created Layer mbox_conf_flatten (91)
I1106 16:38:07.290872 13573 net.cpp:572] mbox_conf_flatten <- mbox_conf_softmax
I1106 16:38:07.290880 13573 net.cpp:542] mbox_conf_flatten -> mbox_conf_flatten
I1106 16:38:07.291433 13573 net.cpp:260] Setting up mbox_conf_flatten
I1106 16:38:07.291447 13573 net.cpp:267] TEST Top shape for layer 91 'mbox_conf_flatten' 8 11424 (91392)
I1106 16:38:07.291455 13573 layer_factory.hpp:172] Creating layer 'detection_out' of type 'DetectionOutput'
I1106 16:38:07.291460 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.291479 13573 net.cpp:200] Created Layer detection_out (92)
I1106 16:38:07.291486 13573 net.cpp:572] detection_out <- mbox_loc
I1106 16:38:07.291492 13573 net.cpp:572] detection_out <- mbox_conf_flatten
I1106 16:38:07.291497 13573 net.cpp:572] detection_out <- mbox_priorbox
I1106 16:38:07.291503 13573 net.cpp:542] detection_out -> detection_out
I1106 16:38:07.291606 13573 net.cpp:260] Setting up detection_out
I1106 16:38:07.291617 13573 net.cpp:267] TEST Top shape for layer 92 'detection_out' 1 1 1 7 (7)
I1106 16:38:07.291622 13573 layer_factory.hpp:172] Creating layer 'detection_eval' of type 'DetectionEvaluate'
I1106 16:38:07.291628 13573 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.291636 13573 net.cpp:200] Created Layer detection_eval (93)
I1106 16:38:07.291642 13573 net.cpp:572] detection_eval <- detection_out
I1106 16:38:07.291648 13573 net.cpp:572] detection_eval <- label
I1106 16:38:07.291654 13573 net.cpp:542] detection_eval -> detection_eval
I1106 16:38:07.291704 13573 net.cpp:260] Setting up detection_eval
I1106 16:38:07.291713 13573 net.cpp:267] TEST Top shape for layer 93 'detection_eval' 1 1 2 5 (10)
I1106 16:38:07.291719 13573 net.cpp:338] detection_eval does not need backward computation.
I1106 16:38:07.291725 13573 net.cpp:338] detection_out does not need backward computation.
I1106 16:38:07.291731 13573 net.cpp:338] mbox_conf_flatten does not need backward computation.
I1106 16:38:07.291735 13573 net.cpp:338] mbox_conf_softmax does not need backward computation.
I1106 16:38:07.291740 13573 net.cpp:338] mbox_conf_reshape does not need backward computation.
I1106 16:38:07.291746 13573 net.cpp:338] mbox_priorbox does not need backward computation.
I1106 16:38:07.291751 13573 net.cpp:338] mbox_conf does not need backward computation.
I1106 16:38:07.291757 13573 net.cpp:338] mbox_loc does not need backward computation.
I1106 16:38:07.291762 13573 net.cpp:338] ctx_output4/relu_mbox_priorbox does not need backward computation.
I1106 16:38:07.291769 13573 net.cpp:338] ctx_output4/relu_mbox_conf_flat does not need backward computation.
I1106 16:38:07.291774 13573 net.cpp:338] ctx_output4/relu_mbox_conf_perm does not need backward computation.
I1106 16:38:07.291779 13573 net.cpp:338] ctx_output4/relu_mbox_conf does not need backward computation.
I1106 16:38:07.291785 13573 net.cpp:338] ctx_output4/relu_mbox_loc_flat does not need backward computation.
I1106 16:38:07.291790 13573 net.cpp:338] ctx_output4/relu_mbox_loc_perm does not need backward computation.
I1106 16:38:07.291795 13573 net.cpp:338] ctx_output4/relu_mbox_loc does not need backward computation.
I1106 16:38:07.291800 13573 net.cpp:338] ctx_output3/relu_mbox_priorbox does not need backward computation.
I1106 16:38:07.291806 13573 net.cpp:338] ctx_output3/relu_mbox_conf_flat does not need backward computation.
I1106 16:38:07.291811 13573 net.cpp:338] ctx_output3/relu_mbox_conf_perm does not need backward computation.
I1106 16:38:07.291816 13573 net.cpp:338] ctx_output3/relu_mbox_conf does not need backward computation.
I1106 16:38:07.291821 13573 net.cpp:338] ctx_output3/relu_mbox_loc_flat does not need backward computation.
I1106 16:38:07.291826 13573 net.cpp:338] ctx_output3/relu_mbox_loc_perm does not need backward computation.
I1106 16:38:07.291832 13573 net.cpp:338] ctx_output3/relu_mbox_loc does not need backward computation.
I1106 16:38:07.291837 13573 net.cpp:338] ctx_output2/relu_mbox_priorbox does not need backward computation.
I1106 16:38:07.291842 13573 net.cpp:338] ctx_output2/relu_mbox_conf_flat does not need backward computation.
I1106 16:38:07.291851 13573 net.cpp:338] ctx_output2/relu_mbox_conf_perm does not need backward computation.
I1106 16:38:07.291863 13573 net.cpp:338] ctx_output2/relu_mbox_conf does not need backward computation.
I1106 16:38:07.291869 13573 net.cpp:338] ctx_output2/relu_mbox_loc_flat does not need backward computation.
I1106 16:38:07.291874 13573 net.cpp:338] ctx_output2/relu_mbox_loc_perm does not need backward computation.
I1106 16:38:07.291878 13573 net.cpp:338] ctx_output2/relu_mbox_loc does not need backward computation.
I1106 16:38:07.291883 13573 net.cpp:338] ctx_output1/relu_mbox_priorbox does not need backward computation.
I1106 16:38:07.291889 13573 net.cpp:338] ctx_output1/relu_mbox_conf_flat does not need backward computation.
I1106 16:38:07.291894 13573 net.cpp:338] ctx_output1/relu_mbox_conf_perm does not need backward computation.
I1106 16:38:07.291899 13573 net.cpp:338] ctx_output1/relu_mbox_conf does not need backward computation.
I1106 16:38:07.291904 13573 net.cpp:338] ctx_output1/relu_mbox_loc_flat does not need backward computation.
I1106 16:38:07.291910 13573 net.cpp:338] ctx_output1/relu_mbox_loc_perm does not need backward computation.
I1106 16:38:07.291915 13573 net.cpp:338] ctx_output1/relu_mbox_loc does not need backward computation.
I1106 16:38:07.291920 13573 net.cpp:338] ctx_output5/relu does not need backward computation.
I1106 16:38:07.291925 13573 net.cpp:338] ctx_output5 does not need backward computation.
I1106 16:38:07.291931 13573 net.cpp:338] ctx_output4_ctx_output4/relu_0_split does not need backward computation. I1106 16:38:07.291936 13573 net.cpp:338] ctx_output4/relu does not need backward computation. I1106 16:38:07.291941 13573 net.cpp:338] ctx_output4 does not need backward computation. I1106 16:38:07.291946 13573 net.cpp:338] ctx_output3_ctx_output3/relu_0_split does not need backward computation. I1106 16:38:07.291952 13573 net.cpp:338] ctx_output3/relu does not need backward computation. I1106 16:38:07.291959 13573 net.cpp:338] ctx_output3 does not need backward computation. I1106 16:38:07.291965 13573 net.cpp:338] ctx_output2_ctx_output2/relu_0_split does not need backward computation. I1106 16:38:07.291971 13573 net.cpp:338] ctx_output2/relu does not need backward computation. I1106 16:38:07.291976 13573 net.cpp:338] ctx_output2 does not need backward computation. I1106 16:38:07.291982 13573 net.cpp:338] ctx_output1_ctx_output1/relu_0_split does not need backward computation. I1106 16:38:07.291988 13573 net.cpp:338] ctx_output1/relu does not need backward computation. I1106 16:38:07.291994 13573 net.cpp:338] ctx_output1 does not need backward computation. I1106 16:38:07.291999 13573 net.cpp:338] pool8 does not need backward computation. I1106 16:38:07.292006 13573 net.cpp:338] pool7_pool7_0_split does not need backward computation. I1106 16:38:07.292012 13573 net.cpp:338] pool7 does not need backward computation. I1106 16:38:07.292018 13573 net.cpp:338] pool6_pool6_0_split does not need backward computation. I1106 16:38:07.292023 13573 net.cpp:338] pool6 does not need backward computation. I1106 16:38:07.292029 13573 net.cpp:338] res5a_branch2b_res5a_branch2b/relu_0_split does not need backward computation. I1106 16:38:07.292034 13573 net.cpp:338] res5a_branch2b/relu does not need backward computation. I1106 16:38:07.292039 13573 net.cpp:338] res5a_branch2b/bn does not need backward computation. 
I1106 16:38:07.292044 13573 net.cpp:338] res5a_branch2b does not need backward computation. I1106 16:38:07.292050 13573 net.cpp:338] res5a_branch2a/relu does not need backward computation. I1106 16:38:07.292055 13573 net.cpp:338] res5a_branch2a/bn does not need backward computation. I1106 16:38:07.292060 13573 net.cpp:338] res5a_branch2a does not need backward computation. I1106 16:38:07.292065 13573 net.cpp:338] pool4 does not need backward computation. I1106 16:38:07.292070 13573 net.cpp:338] res4a_branch2b_res4a_branch2b/relu_0_split does not need backward computation. I1106 16:38:07.292076 13573 net.cpp:338] res4a_branch2b/relu does not need backward computation. I1106 16:38:07.292083 13573 net.cpp:338] res4a_branch2b/bn does not need backward computation. I1106 16:38:07.292091 13573 net.cpp:338] res4a_branch2b does not need backward computation. I1106 16:38:07.292098 13573 net.cpp:338] res4a_branch2a/relu does not need backward computation. I1106 16:38:07.292102 13573 net.cpp:338] res4a_branch2a/bn does not need backward computation. I1106 16:38:07.292107 13573 net.cpp:338] res4a_branch2a does not need backward computation. I1106 16:38:07.292112 13573 net.cpp:338] pool3 does not need backward computation. I1106 16:38:07.292117 13573 net.cpp:338] res3a_branch2b/relu does not need backward computation. I1106 16:38:07.292124 13573 net.cpp:338] res3a_branch2b/bn does not need backward computation. I1106 16:38:07.292129 13573 net.cpp:338] res3a_branch2b does not need backward computation. I1106 16:38:07.292134 13573 net.cpp:338] res3a_branch2a/relu does not need backward computation. I1106 16:38:07.292138 13573 net.cpp:338] res3a_branch2a/bn does not need backward computation. I1106 16:38:07.292143 13573 net.cpp:338] res3a_branch2a does not need backward computation. I1106 16:38:07.292148 13573 net.cpp:338] pool2 does not need backward computation. I1106 16:38:07.292153 13573 net.cpp:338] res2a_branch2b/relu does not need backward computation. 
I1106 16:38:07.292158 13573 net.cpp:338] res2a_branch2b/bn does not need backward computation. I1106 16:38:07.292163 13573 net.cpp:338] res2a_branch2b does not need backward computation. I1106 16:38:07.292168 13573 net.cpp:338] res2a_branch2a/relu does not need backward computation. I1106 16:38:07.292174 13573 net.cpp:338] res2a_branch2a/bn does not need backward computation. I1106 16:38:07.292179 13573 net.cpp:338] res2a_branch2a does not need backward computation. I1106 16:38:07.292184 13573 net.cpp:338] pool1 does not need backward computation. I1106 16:38:07.292189 13573 net.cpp:338] conv1b/relu does not need backward computation. I1106 16:38:07.292194 13573 net.cpp:338] conv1b/bn does not need backward computation. I1106 16:38:07.292199 13573 net.cpp:338] conv1b does not need backward computation. I1106 16:38:07.292204 13573 net.cpp:338] conv1a/relu does not need backward computation. I1106 16:38:07.292210 13573 net.cpp:338] conv1a/bn does not need backward computation. I1106 16:38:07.292214 13573 net.cpp:338] conv1a does not need backward computation. I1106 16:38:07.292219 13573 net.cpp:338] data/bias does not need backward computation. I1106 16:38:07.292225 13573 net.cpp:338] data_data_0_split does not need backward computation. I1106 16:38:07.292232 13573 net.cpp:338] data does not need backward computation. 
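The loc/conf channel counts that appear in the layer shapes above (16/8 for ctx_output1 and ctx_output4, 24/12 for ctx_output2 and ctx_output3, with num_classes = 2) follow the usual SSD prior-box arithmetic. A minimal sketch of that cross-check, assuming the standard SSD convention (one box for min_size, one for sqrt(min_size*max_size), plus one per aspect ratio, doubled when flip adds the reciprocal); `priors_per_location` and `head_channels` are hypothetical helper names, not part of caffe-jacinto:

```python
def priors_per_location(aspect_ratios, flip=True):
    """Default boxes per feature-map cell in standard SSD:
    one for min_size, one for sqrt(min_size * max_size), and one per
    aspect ratio (doubled when flip also adds the reciprocal ratio)."""
    extra = len(aspect_ratios) * (2 if flip else 1)
    return 2 + extra

def head_channels(aspect_ratios, num_classes, flip=True):
    """Output channels of the mbox_loc / mbox_conf convolutions:
    4 box coordinates per prior for loc, num_classes scores per prior for conf."""
    n = priors_per_location(aspect_ratios, flip)
    return n * 4, n * num_classes  # (loc channels, conf channels)

# Cross-check against the logged shapes (num_classes = 2):
# ctx_output1 / ctx_output4 use aspect_ratio [2]   -> loc 16, conf 8
# ctx_output2 / ctx_output3 use aspect_ratio [2,3] -> loc 24, conf 12
print(head_channels([2], 2))     # (16, 8)
print(head_channels([2, 3], 2))  # (24, 12)
```

This matches the `num_output` values of the `*_mbox_loc` and `*_mbox_conf` convolutions in the test prototxt dumped later in this log.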
I1106 16:38:07.292237 13573 net.cpp:380] This network produces output ctx_output5
I1106 16:38:07.292241 13573 net.cpp:380] This network produces output detection_eval
I1106 16:38:07.292320 13573 net.cpp:403] Top memory (TEST) required for data: 1011843208 diff: 1011843208
I1106 16:38:07.292327 13573 net.cpp:406] Bottom memory (TEST) required for data: 1011794016 diff: 1011794016
I1106 16:38:07.292332 13573 net.cpp:409] Shared (in-place) memory (TEST) by data: 498106368 diff: 498106368
I1106 16:38:07.292335 13573 net.cpp:412] Parameters memory (TEST) required for data: 11946688 diff: 11946688
I1106 16:38:07.292335 13573 net.cpp:415] Parameters shared memory (TEST) by data: 0 diff: 0
I1106 16:38:07.292338 13573 net.cpp:421] Network initialization done.
I1106 16:38:07.292537 13573 solver.cpp:55] Solver scaffolding done.
I1106 16:38:07.294481 13573 caffe.cpp:158] Finetuning from training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/l1reg/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel
F1106 16:38:07.294518 13573 io.cpp:55] Check failed: fd != -1 (-1 vs. -1) File not found: training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/l1reg/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel
*** Check failure stack trace: ***
    @ 0x7f9d1132a5cd  google::LogMessage::Fail()
    @ 0x7f9d1132c433  google::LogMessage::SendToLog()
    @ 0x7f9d1132a15b  google::LogMessage::Flush()
    @ 0x7f9d1132ce1e  google::LogMessageFatal::~LogMessageFatal()
    @ 0x7f9d123386dc  caffe::ReadProtoFromBinaryFile()
    @ 0x7f9d123b0f56  caffe::ReadNetParamsFromBinaryFileOrDie()
    @ 0x7f9d11ee788a  caffe::Net::CopyTrainedLayersFromBinaryProto()
    @ 0x7f9d11ee792e  caffe::Net::CopyTrainedLayersFrom()
    @ 0x40f889  CopyLayers()
    @ 0x410616  train()
    @ 0x40d1f0  main
    @ 0x7f9d0faac830  __libc_start_main
    @ 0x40de89  _start
    @ (nil)  (unknown)
I1106 16:38:07.800405 13608 caffe.cpp:902] This is NVCaffe 0.17.0 started at Wed Nov 6 16:38:07 2019
I1106 16:38:07.800536 13608 caffe.cpp:904] CuDNN version: 7601
I1106 16:38:07.800540 13608 caffe.cpp:905] CuBLAS version: 10201
I1106 16:38:07.800542 13608 caffe.cpp:906] CUDA version: 10010
I1106 16:38:07.800544 13608 caffe.cpp:907] CUDA driver version: 10010
I1106 16:38:07.800546 13608 caffe.cpp:908] Arguments:
[0]: /home/liuyuyuan/caffe-jacinto/build/tools/caffe.bin
[1]: test_detection
[2]: --model=training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/test/test.prototxt
[3]: --iterations=3
[4]: --display_sparsity=1
[5]: --weights=training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel
[6]: --gpu
[7]: 0
I1106 16:38:07.833681 13608 gpu_memory.cpp:105] GPUMemory::Manager initialized
I1106 16:38:07.834064 13608 gpu_memory.cpp:107] Total memory: 6193479680, Free: 3152805888, dev_info[0]: total=6193479680 free=3152805888
I1106 16:38:07.834072 13608 caffe.cpp:406] Use GPU with device ID 0
I1106 16:38:07.834319 13608 caffe.cpp:409] GPU device name: GeForce GTX 1660 Ti
I1106 16:38:07.845257 13608 net.cpp:80] Initializing net from parameters: name:
"ssdJacintoNetV2_test" state { phase: TEST level: 0 } layer { name: "data" type: "AnnotatedData" top: "data" top: "label" include { phase: TEST } transform_param { mean_value: 0 mean_value: 0 mean_value: 0 force_color: false resize_param { prob: 1 resize_mode: WARP height: 320 width: 768 interp_mode: LINEAR } crop_h: 320 crop_w: 768 } data_param { source: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb" batch_size: 10 backend: LMDB threads: 4 parser_threads: 4 } annotated_data_param { batch_sampler { } label_map_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt" } } layer { name: "data/bias" type: "Bias" bottom: "data" top: "data/bias" param { lr_mult: 0 decay_mult: 0 } bias_param { filler { type: "constant" value: -128 } } } layer { name: "conv1a" type: "Convolution" bottom: "data/bias" top: "conv1a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 2 kernel_size: 5 group: 1 stride: 2 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "conv1a/bn" type: "BatchNorm" bottom: "conv1a" top: "conv1a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "conv1a/relu" type: "ReLU" bottom: "conv1a" top: "conv1a" } layer { name: "conv1b" type: "Convolution" bottom: "conv1a" top: "conv1b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "conv1b/bn" type: "BatchNorm" bottom: "conv1b" top: "conv1b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "conv1b/relu" type: "ReLU" bottom: "conv1b" top: "conv1b" } layer { name: "pool1" type: "Pooling" bottom: "conv1b" top: "pool1" pooling_param { 
pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res2a_branch2a" type: "Convolution" bottom: "pool1" top: "res2a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res2a_branch2a/bn" type: "BatchNorm" bottom: "res2a_branch2a" top: "res2a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res2a_branch2a/relu" type: "ReLU" bottom: "res2a_branch2a" top: "res2a_branch2a" } layer { name: "res2a_branch2b" type: "Convolution" bottom: "res2a_branch2a" top: "res2a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res2a_branch2b/bn" type: "BatchNorm" bottom: "res2a_branch2b" top: "res2a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res2a_branch2b/relu" type: "ReLU" bottom: "res2a_branch2b" top: "res2a_branch2b" } layer { name: "pool2" type: "Pooling" bottom: "res2a_branch2b" top: "pool2" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res3a_branch2a" type: "Convolution" bottom: "pool2" top: "res3a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res3a_branch2a/bn" type: "BatchNorm" bottom: "res3a_branch2a" top: "res3a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res3a_branch2a/relu" type: "ReLU" bottom: 
"res3a_branch2a" top: "res3a_branch2a" } layer { name: "res3a_branch2b" type: "Convolution" bottom: "res3a_branch2a" top: "res3a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res3a_branch2b/bn" type: "BatchNorm" bottom: "res3a_branch2b" top: "res3a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res3a_branch2b/relu" type: "ReLU" bottom: "res3a_branch2b" top: "res3a_branch2b" } layer { name: "pool3" type: "Pooling" bottom: "res3a_branch2b" top: "pool3" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res4a_branch2a" type: "Convolution" bottom: "pool3" top: "res4a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res4a_branch2a/bn" type: "BatchNorm" bottom: "res4a_branch2a" top: "res4a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res4a_branch2a/relu" type: "ReLU" bottom: "res4a_branch2a" top: "res4a_branch2a" } layer { name: "res4a_branch2b" type: "Convolution" bottom: "res4a_branch2a" top: "res4a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res4a_branch2b/bn" type: "BatchNorm" bottom: "res4a_branch2b" top: "res4a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res4a_branch2b/relu" type: 
"ReLU" bottom: "res4a_branch2b" top: "res4a_branch2b" } layer { name: "pool4" type: "Pooling" bottom: "res4a_branch2b" top: "pool4" pooling_param { pool: MAX kernel_size: 2 stride: 2 } } layer { name: "res5a_branch2a" type: "Convolution" bottom: "pool4" top: "res5a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res5a_branch2a/bn" type: "BatchNorm" bottom: "res5a_branch2a" top: "res5a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res5a_branch2a/relu" type: "ReLU" bottom: "res5a_branch2a" top: "res5a_branch2a" } layer { name: "res5a_branch2b" type: "Convolution" bottom: "res5a_branch2a" top: "res5a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "res5a_branch2b/bn" type: "BatchNorm" bottom: "res5a_branch2b" top: "res5a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } } layer { name: "res5a_branch2b/relu" type: "ReLU" bottom: "res5a_branch2b" top: "res5a_branch2b" } layer { name: "pool6" type: "Pooling" bottom: "res5a_branch2b" top: "pool6" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "pool7" type: "Pooling" bottom: "pool6" top: "pool7" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "pool8" type: "Pooling" bottom: "pool7" top: "pool8" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } } layer { name: "ctx_output1" type: "Convolution" bottom: "res4a_branch2b" top: "ctx_output1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } 
convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu" type: "ReLU" bottom: "ctx_output1" top: "ctx_output1" } layer { name: "ctx_output2" type: "Convolution" bottom: "res5a_branch2b" top: "ctx_output2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu" type: "ReLU" bottom: "ctx_output2" top: "ctx_output2" } layer { name: "ctx_output3" type: "Convolution" bottom: "pool6" top: "ctx_output3" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu" type: "ReLU" bottom: "ctx_output3" top: "ctx_output3" } layer { name: "ctx_output4" type: "Convolution" bottom: "pool7" top: "ctx_output4" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu" type: "ReLU" bottom: "ctx_output4" top: "ctx_output4" } layer { name: "ctx_output5" type: "Convolution" bottom: "pool8" top: "ctx_output5" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output5/relu" type: "ReLU" bottom: "ctx_output5" top: "ctx_output5" } layer { name: 
"ctx_output1/relu_mbox_loc" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_loc" top: "ctx_output1/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output1/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_loc_perm" top: "ctx_output1/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output1/relu_mbox_conf" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output1/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_conf" top: "ctx_output1/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output1/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_conf_perm" top: "ctx_output1/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output1/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output1" bottom: "data" top: "ctx_output1/relu_mbox_priorbox" prior_box_param { min_size: 14.72 max_size: 36.8 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output2/relu_mbox_loc" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 bias_term: true pad: 0 
kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_loc" top: "ctx_output2/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output2/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_loc_perm" top: "ctx_output2/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output2/relu_mbox_conf" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output2/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_conf" top: "ctx_output2/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output2/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_conf_perm" top: "ctx_output2/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output2/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output2" bottom: "data" top: "ctx_output2/relu_mbox_priorbox" prior_box_param { min_size: 36.8 max_size: 132.48 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output3/relu_mbox_loc" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu_mbox_loc_perm" type: "Permute" bottom: 
"ctx_output3/relu_mbox_loc" top: "ctx_output3/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output3/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_loc_perm" top: "ctx_output3/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output3/relu_mbox_conf" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output3/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_conf" top: "ctx_output3/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output3/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_conf_perm" top: "ctx_output3/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output3/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output3" bottom: "data" top: "ctx_output3/relu_mbox_priorbox" prior_box_param { min_size: 132.48 max_size: 228.16 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "ctx_output4/relu_mbox_loc" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output4/relu_mbox_loc" top: "ctx_output4/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output4/relu_mbox_loc_flat" type: "Flatten" bottom: 
"ctx_output4/relu_mbox_loc_perm" top: "ctx_output4/relu_mbox_loc_flat" flatten_param { axis: 1 } } layer { name: "ctx_output4/relu_mbox_conf" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } } layer { name: "ctx_output4/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output4/relu_mbox_conf" top: "ctx_output4/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } } layer { name: "ctx_output4/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_conf_perm" top: "ctx_output4/relu_mbox_conf_flat" flatten_param { axis: 1 } } layer { name: "ctx_output4/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output4" bottom: "data" top: "ctx_output4/relu_mbox_priorbox" prior_box_param { min_size: 228.16 max_size: 323.84 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } } layer { name: "mbox_loc" type: "Concat" bottom: "ctx_output1/relu_mbox_loc_flat" bottom: "ctx_output2/relu_mbox_loc_flat" bottom: "ctx_output3/relu_mbox_loc_flat" bottom: "ctx_output4/relu_mbox_loc_flat" top: "mbox_loc" concat_param { axis: 1 } } layer { name: "mbox_conf" type: "Concat" bottom: "ctx_output1/relu_mbox_conf_flat" bottom: "ctx_output2/relu_mbox_conf_flat" bottom: "ctx_output3/relu_mbox_conf_flat" bottom: "ctx_output4/relu_mbox_conf_flat" top: "mbox_conf" concat_param { axis: 1 } } layer { name: "mbox_priorbox" type: "Concat" bottom: "ctx_output1/relu_mbox_priorbox" bottom: "ctx_output2/relu_mbox_priorbox" bottom: "ctx_output3/relu_mbox_priorbox" bottom: "ctx_output4/relu_mbox_priorbox" top: "mbox_priorbox" concat_param { axis: 2 } } layer { name: "mbox_conf_reshape" type: "Reshape" bottom: "mbox_conf" top: "mbox_conf_reshape" 
  reshape_param { shape { dim: 0 dim: -1 dim: 2 } }
}
layer {
  name: "mbox_conf_softmax"
  type: "Softmax"
  bottom: "mbox_conf_reshape"
  top: "mbox_conf_softmax"
  softmax_param { axis: 2 }
}
layer {
  name: "mbox_conf_flatten"
  type: "Flatten"
  bottom: "mbox_conf_softmax"
  top: "mbox_conf_flatten"
  flatten_param { axis: 1 }
}
layer {
  name: "detection_out"
  type: "DetectionOutput"
  bottom: "mbox_loc"
  bottom: "mbox_conf_flatten"
  bottom: "mbox_priorbox"
  top: "detection_out"
  include { phase: TEST }
  detection_output_param {
    num_classes: 2
    share_location: true
    background_label_id: 0
    nms_param { nms_threshold: 0.45 top_k: 400 }
    save_output_param {
      output_directory: ""
      output_name_prefix: "comp4_det_test_"
      output_format: "VOC"
      label_map_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt"
      name_size_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt"
      num_test_image: 24
    }
    code_type: CENTER_SIZE
    keep_top_k: 200
    confidence_threshold: 0.01
  }
}
layer {
  name: "detection_eval"
  type: "DetectionEvaluate"
  bottom: "detection_out"
  bottom: "label"
  top: "detection_eval"
  include { phase: TEST }
  detection_evaluate_param {
    num_classes: 2
    background_label_id: 0
    overlap_threshold: 0.5
    evaluate_difficult_gt: false
    name_size_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt"
  }
}
I1106 16:38:07.845585 13608 net.cpp:110] Using FLOAT as default forward math type
I1106 16:38:07.845592 13608 net.cpp:116] Using FLOAT as default backward math type
I1106 16:38:07.845597 13608 layer_factory.hpp:172] Creating layer 'data' of type 'AnnotatedData'
I1106 16:38:07.845600 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.845680 13608 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0
I1106 16:38:07.846048 13608 net.cpp:200] Created Layer data (0)
I1106 16:38:07.846057 13608 net.cpp:542] data -> data
I1106 16:38:07.846061 13621 blocking_queue.cpp:40] Data layer prefetch queue empty
I1106 16:38:07.846072 13608 net.cpp:542] data -> label
I1106 16:38:07.846086 13608 data_reader.cpp:58] Data Reader threads: 1, out queues: 1, depth: 10
I1106 16:38:07.846101 13608 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0
I1106 16:38:07.846554 13622 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb
I1106 16:38:07.847529 13608 annotated_data_layer.cpp:105] output data size: 10,3,320,768
I1106 16:38:07.847571 13608 annotated_data_layer.cpp:150] (0) Output data size: 10, 3, 320, 768
I1106 16:38:07.847605 13608 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0
I1106 16:38:07.847638 13608 net.cpp:260] Setting up data
I1106 16:38:07.847651 13608 net.cpp:267] TEST Top shape for layer 0 'data' 10 3 320 768 (7372800)
I1106 16:38:07.847659 13608 net.cpp:267] TEST Top shape for layer 0 'data' 1 1 2 8 (16)
I1106 16:38:07.847965 13608 layer_factory.hpp:172] Creating layer 'data_data_0_split' of type 'Split'
I1106 16:38:07.847970 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.847968 13623 data_layer.cpp:105] (0) Parser threads: 1
I1106 16:38:07.847975 13623 data_layer.cpp:107] (0) Transformer threads: 1
I1106 16:38:07.847977 13608 net.cpp:200] Created Layer data_data_0_split (1)
I1106 16:38:07.847982 13608 net.cpp:572] data_data_0_split <- data
I1106 16:38:07.847991 13608 net.cpp:542] data_data_0_split -> data_data_0_split_0
I1106 16:38:07.847995 13608 net.cpp:542] data_data_0_split -> data_data_0_split_1
I1106 16:38:07.847998 13608 net.cpp:542] data_data_0_split -> data_data_0_split_2
I1106 16:38:07.848001 13608 net.cpp:542] data_data_0_split -> data_data_0_split_3
I1106 16:38:07.848003 13608 net.cpp:542] data_data_0_split -> data_data_0_split_4
I1106 16:38:07.848044 13608 net.cpp:260] Setting up data_data_0_split
I1106 16:38:07.848049 13608 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 10 3 320 768 (7372800)
I1106 16:38:07.848052 13608 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 10 3 320 768 (7372800)
I1106 16:38:07.848055 13608 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 10 3 320 768 (7372800)
I1106 16:38:07.848058 13608 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 10 3 320 768 (7372800)
I1106 16:38:07.848062 13608 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 10 3 320 768 (7372800)
I1106 16:38:07.848064 13608 layer_factory.hpp:172] Creating layer 'data/bias' of type 'Bias'
I1106 16:38:07.848067 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.848074 13608 net.cpp:200] Created Layer data/bias (2)
I1106 16:38:07.848078 13608 net.cpp:572] data/bias <- data_data_0_split_0
I1106 16:38:07.848080 13608 net.cpp:542] data/bias -> data/bias
I1106 16:38:07.848219 13608 net.cpp:260] Setting up data/bias
I1106 16:38:07.848223 13608 net.cpp:267] TEST Top shape for layer 2 'data/bias' 10 3 320 768 (7372800)
I1106 16:38:07.848237 13608 layer_factory.hpp:172] Creating layer 'conv1a' of type 'Convolution'
I1106 16:38:07.848239 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:07.848253 13608 net.cpp:200] Created Layer conv1a (3)
I1106 16:38:07.848256 13608 net.cpp:572] conv1a <- data/bias
I1106 16:38:07.848258 13608 net.cpp:542] conv1a -> conv1a
I1106 16:38:07.933928 13622 data_reader.cpp:320] Restarting data pre-fetching
I1106 16:38:09.088867 13608 net.cpp:260] Setting up conv1a
I1106 16:38:09.088943 13608 net.cpp:267] TEST Top shape for layer 3 'conv1a' 10 32 160 384 (19660800)
I1106 16:38:09.088970 13608 layer_factory.hpp:172] Creating layer 'conv1a/bn' of type 'BatchNorm'
I1106 16:38:09.089010 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.089036 13608 net.cpp:200] Created Layer conv1a/bn (4)
I1106 16:38:09.089040 13608 net.cpp:572] conv1a/bn <- conv1a
I1106 16:38:09.089047 13608 net.cpp:527] conv1a/bn -> conv1a (in-place)
I1106 16:38:09.089380 13608 net.cpp:260] Setting up conv1a/bn
I1106 16:38:09.089385 13608 net.cpp:267] TEST Top shape for layer 4 'conv1a/bn' 10 32 160 384 (19660800)
I1106 16:38:09.089409 13608 layer_factory.hpp:172] Creating layer 'conv1a/relu' of type 'ReLU'
I1106 16:38:09.089412 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.089421 13608 net.cpp:200] Created Layer conv1a/relu (5)
I1106 16:38:09.089422 13608 net.cpp:572] conv1a/relu <- conv1a
I1106 16:38:09.089426 13608 net.cpp:527] conv1a/relu -> conv1a (in-place)
I1106 16:38:09.089439 13608 net.cpp:260] Setting up conv1a/relu
I1106 16:38:09.089443 13608 net.cpp:267] TEST Top shape for layer 5 'conv1a/relu' 10 32 160 384 (19660800)
I1106 16:38:09.089445 13608 layer_factory.hpp:172] Creating layer 'conv1b' of type 'Convolution'
I1106 16:38:09.089448 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.089462 13608 net.cpp:200] Created Layer conv1b (6)
I1106 16:38:09.089464 13608 net.cpp:572] conv1b <- conv1a
I1106 16:38:09.089466 13608 net.cpp:542] conv1b -> conv1b
I1106 16:38:09.090167 13608 net.cpp:260] Setting up conv1b
I1106 16:38:09.090174 13608 net.cpp:267] TEST Top shape for layer 6 'conv1b' 10 32 160 384 (19660800)
I1106 16:38:09.090198 13608 layer_factory.hpp:172] Creating layer 'conv1b/bn' of type 'BatchNorm'
I1106 16:38:09.090200 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.090204 13608 net.cpp:200] Created Layer conv1b/bn (7)
I1106 16:38:09.090207 13608 net.cpp:572] conv1b/bn <- conv1b
I1106 16:38:09.090210 13608 net.cpp:527] conv1b/bn -> conv1b (in-place)
I1106 16:38:09.090443 13608 net.cpp:260] Setting up conv1b/bn
I1106 16:38:09.090447 13608 net.cpp:267] TEST Top shape for layer 7 'conv1b/bn' 10 32 160 384 (19660800)
I1106 16:38:09.090453 13608 layer_factory.hpp:172] Creating layer 'conv1b/relu' of type 'ReLU'
I1106 16:38:09.090456 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.090459 13608 net.cpp:200] Created Layer conv1b/relu (8)
I1106 16:38:09.090461 13608 net.cpp:572] conv1b/relu <- conv1b
I1106 16:38:09.090463 13608 net.cpp:527] conv1b/relu -> conv1b (in-place)
I1106 16:38:09.090466 13608 net.cpp:260] Setting up conv1b/relu
I1106 16:38:09.090469 13608 net.cpp:267] TEST Top shape for layer 8 'conv1b/relu' 10 32 160 384 (19660800)
I1106 16:38:09.090471 13608 layer_factory.hpp:172] Creating layer 'pool1' of type 'Pooling'
I1106 16:38:09.090473 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.090479 13608 net.cpp:200] Created Layer pool1 (9)
I1106 16:38:09.090481 13608 net.cpp:572] pool1 <- conv1b
I1106 16:38:09.090502 13608 net.cpp:542] pool1 -> pool1
I1106 16:38:09.090561 13608 net.cpp:260] Setting up pool1
I1106 16:38:09.090581 13608 net.cpp:267] TEST Top shape for layer 9 'pool1' 10 32 80 192 (4915200)
I1106 16:38:09.090584 13608 layer_factory.hpp:172] Creating layer 'res2a_branch2a' of type 'Convolution'
I1106 16:38:09.090586 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.090605 13608 net.cpp:200] Created Layer res2a_branch2a (10)
I1106 16:38:09.090608 13608 net.cpp:572] res2a_branch2a <- pool1
I1106 16:38:09.090625 13608 net.cpp:542] res2a_branch2a -> res2a_branch2a
I1106 16:38:09.091431 13608 net.cpp:260] Setting up res2a_branch2a
I1106 16:38:09.091439 13608 net.cpp:267] TEST Top shape for layer 10 'res2a_branch2a' 10 64 80 192 (9830400)
I1106 16:38:09.091445 13608 layer_factory.hpp:172] Creating layer 'res2a_branch2a/bn' of type 'BatchNorm'
I1106 16:38:09.091449 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.091483 13608 net.cpp:200] Created Layer res2a_branch2a/bn (11)
I1106 16:38:09.091487 13608 net.cpp:572] res2a_branch2a/bn <- res2a_branch2a
I1106 16:38:09.091490 13608 net.cpp:527] res2a_branch2a/bn -> res2a_branch2a (in-place)
I1106 16:38:09.091753 13608 net.cpp:260] Setting up res2a_branch2a/bn
I1106 16:38:09.091758 13608 net.cpp:267] TEST Top shape for layer 11 'res2a_branch2a/bn' 10 64 80 192 (9830400)
I1106 16:38:09.091763 13608 layer_factory.hpp:172] Creating layer 'res2a_branch2a/relu' of type 'ReLU'
I1106 16:38:09.091766 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.091769 13608 net.cpp:200] Created Layer res2a_branch2a/relu (12)
I1106 16:38:09.091771 13608 net.cpp:572] res2a_branch2a/relu <- res2a_branch2a
I1106 16:38:09.091773 13608 net.cpp:527] res2a_branch2a/relu -> res2a_branch2a (in-place)
I1106 16:38:09.091778 13608 net.cpp:260] Setting up res2a_branch2a/relu
I1106 16:38:09.091779 13608 net.cpp:267] TEST Top shape for layer 12 'res2a_branch2a/relu' 10 64 80 192 (9830400)
I1106 16:38:09.091781 13608 layer_factory.hpp:172] Creating layer 'res2a_branch2b' of type 'Convolution'
I1106 16:38:09.091784 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.091789 13608 net.cpp:200] Created Layer res2a_branch2b (13)
I1106 16:38:09.091791 13608 net.cpp:572] res2a_branch2b <- res2a_branch2a
I1106 16:38:09.091810 13608 net.cpp:542] res2a_branch2b -> res2a_branch2b
I1106 16:38:09.092083 13608 net.cpp:260] Setting up res2a_branch2b
I1106 16:38:09.092089 13608 net.cpp:267] TEST Top shape for layer 13 'res2a_branch2b' 10 64 80 192 (9830400)
I1106 16:38:09.092093 13608 layer_factory.hpp:172] Creating layer 'res2a_branch2b/bn' of type 'BatchNorm'
I1106 16:38:09.092095 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.092103 13608 net.cpp:200] Created Layer res2a_branch2b/bn (14)
I1106 16:38:09.092106 13608 net.cpp:572] res2a_branch2b/bn <- res2a_branch2b
I1106 16:38:09.092109 13608 net.cpp:527] res2a_branch2b/bn -> res2a_branch2b (in-place)
I1106 16:38:09.092358 13608 net.cpp:260] Setting up res2a_branch2b/bn
I1106 16:38:09.092363 13608 net.cpp:267] TEST Top shape for layer 14 'res2a_branch2b/bn' 10 64 80 192 (9830400)
I1106 16:38:09.092370 13608 layer_factory.hpp:172] Creating layer 'res2a_branch2b/relu' of type 'ReLU'
I1106 16:38:09.092370 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.092375 13608 net.cpp:200] Created Layer res2a_branch2b/relu (15)
I1106 16:38:09.092376 13608 net.cpp:572] res2a_branch2b/relu <- res2a_branch2b
I1106 16:38:09.092380 13608 net.cpp:527] res2a_branch2b/relu -> res2a_branch2b (in-place)
I1106 16:38:09.092382 13608 net.cpp:260] Setting up res2a_branch2b/relu
I1106 16:38:09.092386 13608 net.cpp:267] TEST Top shape for layer 15 'res2a_branch2b/relu' 10 64 80 192 (9830400)
I1106 16:38:09.092387 13608 layer_factory.hpp:172] Creating layer 'pool2' of type 'Pooling'
I1106 16:38:09.092389 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.092396 13608 net.cpp:200] Created Layer pool2 (16)
I1106 16:38:09.092412 13608 net.cpp:572] pool2 <- res2a_branch2b
I1106 16:38:09.092417 13608 net.cpp:542] pool2 -> pool2
I1106 16:38:09.092468 13608 net.cpp:260] Setting up pool2
I1106 16:38:09.092471 13608 net.cpp:267] TEST Top shape for layer 16 'pool2' 10 64 40 96 (2457600)
I1106 16:38:09.092473 13608 layer_factory.hpp:172] Creating layer 'res3a_branch2a' of type 'Convolution'
I1106 16:38:09.092491 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.092501 13608 net.cpp:200] Created Layer res3a_branch2a (17)
I1106 16:38:09.092504 13608 net.cpp:572] res3a_branch2a <- pool2
I1106 16:38:09.092506 13608 net.cpp:542] res3a_branch2a -> res3a_branch2a
I1106 16:38:09.093221 13608 net.cpp:260] Setting up res3a_branch2a
I1106 16:38:09.093235 13608 net.cpp:267] TEST Top shape for layer 17 'res3a_branch2a' 10 128 40 96 (4915200)
I1106 16:38:09.093238 13608 layer_factory.hpp:172] Creating layer 'res3a_branch2a/bn' of type 'BatchNorm'
I1106 16:38:09.093241 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.093246 13608 net.cpp:200] Created Layer res3a_branch2a/bn (18)
I1106 16:38:09.093248 13608 net.cpp:572] res3a_branch2a/bn <- res3a_branch2a
I1106 16:38:09.093251 13608 net.cpp:527] res3a_branch2a/bn -> res3a_branch2a (in-place)
I1106 16:38:09.093490 13608 net.cpp:260] Setting up res3a_branch2a/bn
I1106 16:38:09.093495 13608 net.cpp:267] TEST Top shape for layer 18 'res3a_branch2a/bn' 10 128 40 96 (4915200)
I1106 16:38:09.093503 13608 layer_factory.hpp:172] Creating layer 'res3a_branch2a/relu' of type 'ReLU'
I1106 16:38:09.093506 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.093509 13608 net.cpp:200] Created Layer res3a_branch2a/relu (19)
I1106 16:38:09.093511 13608 net.cpp:572] res3a_branch2a/relu <- res3a_branch2a
I1106 16:38:09.093513 13608 net.cpp:527] res3a_branch2a/relu -> res3a_branch2a (in-place)
I1106 16:38:09.093518 13608 net.cpp:260] Setting up res3a_branch2a/relu
I1106 16:38:09.093519 13608 net.cpp:267] TEST Top shape for layer 19 'res3a_branch2a/relu' 10 128 40 96 (4915200)
I1106 16:38:09.093521 13608 layer_factory.hpp:172] Creating layer 'res3a_branch2b' of type 'Convolution'
I1106 16:38:09.093523 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.093547 13608 net.cpp:200] Created Layer res3a_branch2b (20)
I1106 16:38:09.093549 13608 net.cpp:572] res3a_branch2b <- res3a_branch2a
I1106 16:38:09.093552 13608 net.cpp:542] res3a_branch2b -> res3a_branch2b
I1106 16:38:09.093978 13608 net.cpp:260] Setting up res3a_branch2b
I1106 16:38:09.093984 13608 net.cpp:267] TEST Top shape for layer 20 'res3a_branch2b' 10 128 40 96 (4915200)
I1106 16:38:09.093988 13608 layer_factory.hpp:172] Creating layer 'res3a_branch2b/bn' of type 'BatchNorm'
I1106 16:38:09.093991 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.093996 13608 net.cpp:200] Created Layer res3a_branch2b/bn (21)
I1106 16:38:09.093998 13608 net.cpp:572] res3a_branch2b/bn <- res3a_branch2b
I1106 16:38:09.094000 13608 net.cpp:527] res3a_branch2b/bn -> res3a_branch2b (in-place)
I1106 16:38:09.094202 13608 net.cpp:260] Setting up res3a_branch2b/bn
I1106 16:38:09.094208 13608 net.cpp:267] TEST Top shape for layer 21 'res3a_branch2b/bn' 10 128 40 96 (4915200)
I1106 16:38:09.094213 13608 layer_factory.hpp:172] Creating layer 'res3a_branch2b/relu' of type 'ReLU'
I1106 16:38:09.094214 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.094218 13608 net.cpp:200] Created Layer res3a_branch2b/relu (22)
I1106 16:38:09.094219 13608 net.cpp:572] res3a_branch2b/relu <- res3a_branch2b
I1106 16:38:09.094223 13608 net.cpp:527] res3a_branch2b/relu -> res3a_branch2b (in-place)
I1106 16:38:09.094225 13608 net.cpp:260] Setting up res3a_branch2b/relu
I1106 16:38:09.094228 13608 net.cpp:267] TEST Top shape for layer 22 'res3a_branch2b/relu' 10 128 40 96 (4915200)
I1106 16:38:09.094229 13608 layer_factory.hpp:172] Creating layer 'pool3' of type 'Pooling'
I1106 16:38:09.094231 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.094236 13608 net.cpp:200] Created Layer pool3 (23)
I1106 16:38:09.094238 13608 net.cpp:572] pool3 <- res3a_branch2b
I1106 16:38:09.094256 13608 net.cpp:542] pool3 -> pool3
I1106 16:38:09.094285 13608 net.cpp:260] Setting up pool3
I1106 16:38:09.094311 13608 net.cpp:267] TEST Top shape for layer 23 'pool3' 10 128 20 48 (1228800)
I1106 16:38:09.094313 13608 layer_factory.hpp:172] Creating layer 'res4a_branch2a' of type 'Convolution'
I1106 16:38:09.094316 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.094347 13608 net.cpp:200] Created Layer res4a_branch2a (24)
I1106 16:38:09.094350 13608 net.cpp:572] res4a_branch2a <- pool3
I1106 16:38:09.094352 13608 net.cpp:542] res4a_branch2a -> res4a_branch2a
I1106 16:38:09.096937 13608 net.cpp:260] Setting up res4a_branch2a
I1106 16:38:09.096946 13608 net.cpp:267] TEST Top shape for layer 24 'res4a_branch2a' 10 256 20 48 (2457600)
I1106 16:38:09.096951 13608 layer_factory.hpp:172] Creating layer 'res4a_branch2a/bn' of type 'BatchNorm'
I1106 16:38:09.096952 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.096957 13608 net.cpp:200] Created Layer res4a_branch2a/bn (25)
I1106 16:38:09.096961 13608 net.cpp:572] res4a_branch2a/bn <- res4a_branch2a
I1106 16:38:09.096963 13608 net.cpp:527] res4a_branch2a/bn -> res4a_branch2a (in-place)
I1106 16:38:09.097211 13608 net.cpp:260] Setting up res4a_branch2a/bn
I1106 16:38:09.097215 13608 net.cpp:267] TEST Top shape for layer 25 'res4a_branch2a/bn' 10 256 20 48 (2457600)
I1106 16:38:09.097221 13608 layer_factory.hpp:172] Creating layer 'res4a_branch2a/relu' of type 'ReLU'
I1106 16:38:09.097223 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.097229 13608 net.cpp:200] Created Layer res4a_branch2a/relu (26)
I1106 16:38:09.097230 13608 net.cpp:572] res4a_branch2a/relu <- res4a_branch2a
I1106 16:38:09.097234 13608 net.cpp:527] res4a_branch2a/relu -> res4a_branch2a (in-place)
I1106 16:38:09.097236 13608 net.cpp:260] Setting up res4a_branch2a/relu
I1106 16:38:09.097239 13608 net.cpp:267] TEST Top shape for layer 26 'res4a_branch2a/relu' 10 256 20 48 (2457600)
I1106 16:38:09.097241 13608 layer_factory.hpp:172] Creating layer 'res4a_branch2b' of type 'Convolution'
I1106 16:38:09.097244 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.097270 13608 net.cpp:200] Created Layer res4a_branch2b (27)
I1106 16:38:09.097273 13608 net.cpp:572] res4a_branch2b <- res4a_branch2a
I1106 16:38:09.097275 13608 net.cpp:542] res4a_branch2b -> res4a_branch2b
I1106 16:38:09.098508 13608 net.cpp:260] Setting up res4a_branch2b
I1106 16:38:09.098515 13608 net.cpp:267] TEST Top shape for layer 27 'res4a_branch2b' 10 256 20 48 (2457600)
I1106 16:38:09.098541 13608 layer_factory.hpp:172] Creating layer 'res4a_branch2b/bn' of type 'BatchNorm'
I1106 16:38:09.098542 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.098547 13608 net.cpp:200] Created Layer res4a_branch2b/bn (28)
I1106 16:38:09.098549 13608 net.cpp:572] res4a_branch2b/bn <- res4a_branch2b
I1106 16:38:09.098552 13608 net.cpp:527] res4a_branch2b/bn -> res4a_branch2b (in-place)
I1106 16:38:09.098783 13608 net.cpp:260] Setting up res4a_branch2b/bn
I1106 16:38:09.098788 13608 net.cpp:267] TEST Top shape for layer 28 'res4a_branch2b/bn' 10 256 20 48 (2457600)
I1106 16:38:09.098793 13608 layer_factory.hpp:172] Creating layer 'res4a_branch2b/relu' of type 'ReLU'
I1106 16:38:09.098796 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.098799 13608 net.cpp:200] Created Layer res4a_branch2b/relu (29)
I1106 16:38:09.098803 13608 net.cpp:572] res4a_branch2b/relu <- res4a_branch2b
I1106 16:38:09.098804 13608 net.cpp:527] res4a_branch2b/relu -> res4a_branch2b (in-place)
I1106 16:38:09.098809 13608 net.cpp:260] Setting up res4a_branch2b/relu
I1106 16:38:09.098810 13608 net.cpp:267] TEST Top shape for layer 29 'res4a_branch2b/relu' 10 256 20 48 (2457600)
I1106 16:38:09.098812 13608 layer_factory.hpp:172] Creating layer 'res4a_branch2b_res4a_branch2b/relu_0_split' of type 'Split'
I1106 16:38:09.098815 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.098819 13608 net.cpp:200] Created Layer res4a_branch2b_res4a_branch2b/relu_0_split (30)
I1106 16:38:09.098820 13608 net.cpp:572] res4a_branch2b_res4a_branch2b/relu_0_split <- res4a_branch2b
I1106 16:38:09.098850 13608 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_0
I1106 16:38:09.098855 13608 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_1
I1106 16:38:09.098879 13608 net.cpp:260] Setting up res4a_branch2b_res4a_branch2b/relu_0_split
I1106 16:38:09.098897 13608 net.cpp:267] TEST Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 10 256 20 48 (2457600)
I1106 16:38:09.098906 13608 net.cpp:267] TEST Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 10 256 20 48 (2457600)
I1106 16:38:09.098912 13608 layer_factory.hpp:172] Creating layer 'pool4' of type 'Pooling'
I1106 16:38:09.098914 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.098919 13608 net.cpp:200] Created Layer pool4 (31)
I1106 16:38:09.098922 13608 net.cpp:572] pool4 <- res4a_branch2b_res4a_branch2b/relu_0_split_0
I1106 16:38:09.098925 13608 net.cpp:542] pool4 -> pool4
I1106 16:38:09.098994 13608 net.cpp:260] Setting up pool4
I1106 16:38:09.099017 13608 net.cpp:267] TEST Top shape for layer 31 'pool4' 10 256 10 24 (614400)
I1106 16:38:09.099040 13608 layer_factory.hpp:172] Creating layer 'res5a_branch2a' of type 'Convolution'
I1106 16:38:09.099045 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.099056 13608 net.cpp:200] Created Layer res5a_branch2a (32)
I1106 16:38:09.099058 13608 net.cpp:572] res5a_branch2a <- pool4
I1106 16:38:09.099061 13608 net.cpp:542] res5a_branch2a -> res5a_branch2a
I1106 16:38:09.108988 13608 net.cpp:260] Setting up res5a_branch2a
I1106 16:38:09.109104 13608 net.cpp:267] TEST Top shape for layer 32 'res5a_branch2a' 10 512 10 24 (1228800)
I1106 16:38:09.109125 13608 layer_factory.hpp:172] Creating layer 'res5a_branch2a/bn' of type 'BatchNorm'
I1106 16:38:09.109138 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.109163 13608 net.cpp:200] Created Layer res5a_branch2a/bn (33)
I1106 16:38:09.109174 13608 net.cpp:572] res5a_branch2a/bn <- res5a_branch2a
I1106 16:38:09.109184 13608 net.cpp:527] res5a_branch2a/bn -> res5a_branch2a (in-place)
I1106 16:38:09.109441 13608 net.cpp:260] Setting up res5a_branch2a/bn
I1106 16:38:09.109454 13608 net.cpp:267] TEST Top shape for layer 33 'res5a_branch2a/bn' 10 512 10 24 (1228800)
I1106 16:38:09.109465 13608 layer_factory.hpp:172] Creating layer 'res5a_branch2a/relu' of type 'ReLU'
I1106 16:38:09.109472 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.109480 13608 net.cpp:200] Created Layer res5a_branch2a/relu (34)
I1106 16:38:09.109486 13608 net.cpp:572] res5a_branch2a/relu <- res5a_branch2a
I1106 16:38:09.109494 13608 net.cpp:527] res5a_branch2a/relu -> res5a_branch2a (in-place)
I1106 16:38:09.109503 13608 net.cpp:260] Setting up res5a_branch2a/relu
I1106 16:38:09.109513 13608 net.cpp:267] TEST Top shape for layer 34 'res5a_branch2a/relu' 10 512 10 24 (1228800)
I1106 16:38:09.109517 13608 layer_factory.hpp:172] Creating layer 'res5a_branch2b' of type 'Convolution'
I1106 16:38:09.109524 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.109539 13608 net.cpp:200] Created Layer res5a_branch2b (35)
I1106 16:38:09.109545 13608 net.cpp:572] res5a_branch2b <- res5a_branch2a
I1106 16:38:09.109552 13608 net.cpp:542] res5a_branch2b -> res5a_branch2b
I1106 16:38:09.114460 13608 net.cpp:260] Setting up res5a_branch2b
I1106 16:38:09.114492 13608 net.cpp:267] TEST Top shape for layer 35 'res5a_branch2b' 10 512 10 24 (1228800)
I1106 16:38:09.114508 13608 layer_factory.hpp:172] Creating layer 'res5a_branch2b/bn' of type 'BatchNorm'
I1106 16:38:09.114516 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.114537 13608 net.cpp:200] Created Layer res5a_branch2b/bn (36)
I1106 16:38:09.114552 13608 net.cpp:572] res5a_branch2b/bn <- res5a_branch2b
I1106 16:38:09.114559 13608 net.cpp:527] res5a_branch2b/bn -> res5a_branch2b (in-place)
I1106 16:38:09.114761 13608 net.cpp:260] Setting up res5a_branch2b/bn
I1106 16:38:09.114773 13608 net.cpp:267] TEST Top shape for layer 36 'res5a_branch2b/bn' 10 512 10 24 (1228800)
I1106 16:38:09.114784 13608 layer_factory.hpp:172] Creating layer 'res5a_branch2b/relu' of type 'ReLU'
I1106 16:38:09.114789 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.114799 13608 net.cpp:200] Created Layer res5a_branch2b/relu (37)
I1106 16:38:09.114805 13608 net.cpp:572] res5a_branch2b/relu <- res5a_branch2b
I1106 16:38:09.114812 13608 net.cpp:527] res5a_branch2b/relu -> res5a_branch2b (in-place)
I1106 16:38:09.114820 13608 net.cpp:260] Setting up res5a_branch2b/relu
I1106 16:38:09.114827 13608 net.cpp:267] TEST Top shape for layer 37 'res5a_branch2b/relu' 10 512 10 24 (1228800)
I1106 16:38:09.114833 13608 layer_factory.hpp:172] Creating layer 'res5a_branch2b_res5a_branch2b/relu_0_split' of type 'Split'
I1106 16:38:09.114840 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.114848 13608 net.cpp:200] Created Layer res5a_branch2b_res5a_branch2b/relu_0_split (38)
I1106 16:38:09.114854 13608 net.cpp:572] res5a_branch2b_res5a_branch2b/relu_0_split <- res5a_branch2b
I1106 16:38:09.114861 13608 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> res5a_branch2b_res5a_branch2b/relu_0_split_0
I1106 16:38:09.114869 13608 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> res5a_branch2b_res5a_branch2b/relu_0_split_1
I1106 16:38:09.114897 13608 net.cpp:260] Setting up res5a_branch2b_res5a_branch2b/relu_0_split
I1106 16:38:09.114905 13608 net.cpp:267] TEST Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 10 512 10 24 (1228800)
I1106 16:38:09.114912 13608 net.cpp:267] TEST Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 10 512 10 24 (1228800)
I1106 16:38:09.114919 13608 layer_factory.hpp:172] Creating layer 'pool6' of type 'Pooling'
I1106 16:38:09.114925 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.114936 13608 net.cpp:200] Created Layer pool6 (39)
I1106 16:38:09.114943 13608 net.cpp:572] pool6 <- res5a_branch2b_res5a_branch2b/relu_0_split_0
I1106 16:38:09.114951 13608 net.cpp:542] pool6 -> pool6
I1106 16:38:09.114984 13608 net.cpp:260] Setting up pool6
I1106 16:38:09.114993 13608 net.cpp:267] TEST Top shape for layer 39 'pool6' 10 512 5 12 (307200)
I1106 16:38:09.115000 13608 layer_factory.hpp:172] Creating layer 'pool6_pool6_0_split' of type 'Split'
I1106 16:38:09.115006 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.115015 13608 net.cpp:200] Created Layer pool6_pool6_0_split (40)
I1106 16:38:09.115021 13608 net.cpp:572] pool6_pool6_0_split <- pool6
I1106 16:38:09.115027 13608 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_0
I1106 16:38:09.115036 13608 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_1
I1106 16:38:09.115059 13608 net.cpp:260] Setting up pool6_pool6_0_split
I1106 16:38:09.115064 13608 net.cpp:267] TEST Top shape for layer 40 'pool6_pool6_0_split' 10 512 5 12 (307200)
I1106 16:38:09.115067 13608 net.cpp:267] TEST Top shape for layer 40 'pool6_pool6_0_split' 10 512 5 12 (307200)
I1106 16:38:09.115069 13608 layer_factory.hpp:172] Creating layer 'pool7' of type 'Pooling'
I1106 16:38:09.115072 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.115075 13608 net.cpp:200] Created Layer pool7 (41)
I1106 16:38:09.115077 13608 net.cpp:572] pool7 <- pool6_pool6_0_split_0
I1106 16:38:09.115080 13608 net.cpp:542] pool7 -> pool7
I1106 16:38:09.115106 13608 net.cpp:260] Setting up pool7
I1106 16:38:09.115115 13608 net.cpp:267] TEST Top shape for layer 41 'pool7' 10 512 3 6 (92160)
I1106 16:38:09.115119 13608 layer_factory.hpp:172] Creating layer 'pool7_pool7_0_split' of type 'Split'
I1106 16:38:09.115128 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.115134 13608 net.cpp:200] Created Layer pool7_pool7_0_split (42)
I1106 16:38:09.115137 13608 net.cpp:572] pool7_pool7_0_split <- pool7
I1106 16:38:09.115140 13608 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_0
I1106 16:38:09.115147 13608 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_1
I1106 16:38:09.115168 13608 net.cpp:260] Setting up pool7_pool7_0_split
I1106 16:38:09.115172 13608 net.cpp:267] TEST Top shape for layer 42 'pool7_pool7_0_split' 10 512 3 6 (92160)
I1106 16:38:09.115175 13608 net.cpp:267] TEST Top shape for layer 42 'pool7_pool7_0_split' 10 512 3 6 (92160)
I1106 16:38:09.115177 13608 layer_factory.hpp:172] Creating layer 'pool8' of type 'Pooling'
I1106 16:38:09.115180 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.115185 13608 net.cpp:200] Created Layer pool8 (43)
I1106 16:38:09.115187 13608 net.cpp:572] pool8 <- pool7_pool7_0_split_0
I1106 16:38:09.115190 13608 net.cpp:542] pool8 -> pool8
I1106 16:38:09.115216 13608 net.cpp:260] Setting up pool8
I1106 16:38:09.115226 13608 net.cpp:267] TEST Top shape for layer 43 'pool8' 10 512 2 3 (30720)
I1106 16:38:09.115229 13608 layer_factory.hpp:172] Creating layer 'ctx_output1' of type 'Convolution'
I1106 16:38:09.115232 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.115247 13608 net.cpp:200] Created Layer ctx_output1 (44)
I1106 16:38:09.115250 13608 net.cpp:572] ctx_output1 <- res4a_branch2b_res4a_branch2b/relu_0_split_1
I1106 16:38:09.115253 13608 net.cpp:542] ctx_output1 -> ctx_output1
I1106 16:38:09.115867 13608 net.cpp:260] Setting up ctx_output1
I1106 16:38:09.115875 13608 net.cpp:267] TEST Top shape for layer 44 'ctx_output1' 10 256 20 48 (2457600)
I1106 16:38:09.115880 13608 layer_factory.hpp:172] Creating layer 'ctx_output1/relu' of type 'ReLU'
I1106 16:38:09.115881 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.115886 13608 net.cpp:200] Created Layer ctx_output1/relu (45)
I1106 16:38:09.115888 13608 net.cpp:572] ctx_output1/relu <- ctx_output1
I1106 16:38:09.115890 13608 net.cpp:527] ctx_output1/relu -> ctx_output1 (in-place)
I1106 16:38:09.115895 13608 net.cpp:260] Setting up ctx_output1/relu
I1106 16:38:09.115897 13608 net.cpp:267] TEST Top shape for layer 45 'ctx_output1/relu' 10 256 20 48 (2457600)
I1106 16:38:09.115900 13608 layer_factory.hpp:172] Creating layer 'ctx_output1_ctx_output1/relu_0_split' of type 'Split'
I1106 16:38:09.115902 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.115906 13608 net.cpp:200] Created Layer ctx_output1_ctx_output1/relu_0_split (46)
I1106 16:38:09.115907 13608 net.cpp:572] ctx_output1_ctx_output1/relu_0_split <- ctx_output1
I1106 16:38:09.115909 13608 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_0
I1106 16:38:09.115921 13608 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_1
I1106 16:38:09.115926 13608 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_2
I1106 16:38:09.115963 13608 net.cpp:260] Setting up ctx_output1_ctx_output1/relu_0_split
I1106 16:38:09.115968 13608 net.cpp:267] TEST Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 10 256 20 48 (2457600)
I1106 16:38:09.115970 13608 net.cpp:267] TEST Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 10 256 20 48 (2457600)
I1106 16:38:09.115973 13608 net.cpp:267] TEST Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 10 256 20 48 (2457600)
I1106 16:38:09.115976 13608 layer_factory.hpp:172] Creating layer 'ctx_output2' of type 'Convolution'
I1106 16:38:09.115979 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.115995 13608 net.cpp:200] Created Layer ctx_output2 (47)
I1106 16:38:09.115998 13608 net.cpp:572] ctx_output2 <- res5a_branch2b_res5a_branch2b/relu_0_split_1
I1106 16:38:09.116003 13608 net.cpp:542] ctx_output2 -> ctx_output2
I1106 16:38:09.117043 13608 net.cpp:260] Setting up ctx_output2
I1106 16:38:09.117049 13608 net.cpp:267] TEST Top shape for layer 47 'ctx_output2' 10 256 10 24 (614400)
I1106 16:38:09.117053 13608 layer_factory.hpp:172] Creating layer 'ctx_output2/relu' of type 'ReLU'
I1106 16:38:09.117056 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.117059 13608 net.cpp:200] Created Layer ctx_output2/relu (48)
I1106 16:38:09.117063 13608 net.cpp:572] ctx_output2/relu <- ctx_output2
I1106 16:38:09.117065 13608 net.cpp:527] ctx_output2/relu -> ctx_output2 (in-place)
I1106 16:38:09.117069 13608 net.cpp:260] Setting up ctx_output2/relu
I1106 16:38:09.117072 13608 net.cpp:267] TEST Top shape for layer 48 'ctx_output2/relu' 10 256 10 24 (614400)
I1106 16:38:09.117075 13608 layer_factory.hpp:172] Creating layer 'ctx_output2_ctx_output2/relu_0_split' of type 'Split'
I1106 16:38:09.117079 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.117082 13608 net.cpp:200] Created Layer ctx_output2_ctx_output2/relu_0_split (49)
I1106 16:38:09.117084 13608 net.cpp:572] ctx_output2_ctx_output2/relu_0_split <- ctx_output2
I1106 16:38:09.117087 13608 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_0
I1106 16:38:09.117091 13608 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_1
I1106 16:38:09.117095 13608 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_2
I1106 16:38:09.117125 13608 net.cpp:260] Setting up ctx_output2_ctx_output2/relu_0_split
I1106 16:38:09.117127 13608 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 10 256 10 24 (614400)
I1106 16:38:09.117130 13608 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 10 256 10 24 (614400)
I1106 16:38:09.117133 13608 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 10 256 10 24 (614400)
I1106 16:38:09.117136 13608 layer_factory.hpp:172] Creating layer 'ctx_output3' of type 'Convolution'
I1106 16:38:09.117138 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.117147 13608 net.cpp:200] Created Layer ctx_output3 (50)
I1106 16:38:09.117151 13608 net.cpp:572] ctx_output3 <- pool6_pool6_0_split_1
I1106 16:38:09.117153 13608 net.cpp:542] ctx_output3 -> ctx_output3
I1106 16:38:09.118744 13608 net.cpp:260] Setting up ctx_output3
I1106 16:38:09.118754 13608 net.cpp:267] TEST Top shape for layer 50 'ctx_output3' 10 256 5 12 (153600)
I1106 16:38:09.118759 13608 layer_factory.hpp:172] Creating layer 'ctx_output3/relu' of type 'ReLU'
I1106 16:38:09.118762 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.118765 13608 net.cpp:200] Created Layer ctx_output3/relu (51)
I1106 16:38:09.118767 13608 net.cpp:572] ctx_output3/relu <- ctx_output3
I1106 16:38:09.118770 13608 net.cpp:527] ctx_output3/relu -> ctx_output3 (in-place)
I1106 16:38:09.118777 13608 net.cpp:260] Setting up ctx_output3/relu
I1106 16:38:09.118778 13608 net.cpp:267] TEST Top shape for layer 51 'ctx_output3/relu' 10 256 5 12 (153600)
I1106 16:38:09.118780 13608 layer_factory.hpp:172] Creating layer 'ctx_output3_ctx_output3/relu_0_split' of type 'Split'
I1106 16:38:09.118783 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.118785 13608 net.cpp:200] Created Layer ctx_output3_ctx_output3/relu_0_split (52)
I1106 16:38:09.118789 13608 net.cpp:572] ctx_output3_ctx_output3/relu_0_split <- ctx_output3
I1106 16:38:09.118793 13608 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_0
I1106 16:38:09.118806 13608 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_1
I1106 16:38:09.118809 13608 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_2
I1106 16:38:09.118841 13608 net.cpp:260] Setting up ctx_output3_ctx_output3/relu_0_split
I1106 16:38:09.118845 13608 net.cpp:267] TEST Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 10 256 5 12 (153600)
I1106 16:38:09.118849 13608 net.cpp:267] TEST Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 10 256 5 12 (153600)
I1106 16:38:09.118852 13608 net.cpp:267] TEST Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 10 256 5 12 (153600)
I1106 16:38:09.118855 13608 layer_factory.hpp:172] Creating layer 'ctx_output4' of type 'Convolution'
I1106 16:38:09.118858 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.118868 13608 net.cpp:200] Created Layer ctx_output4 (53)
I1106 16:38:09.118871 13608 net.cpp:572] ctx_output4 <- pool7_pool7_0_split_1
I1106 16:38:09.118875 13608 net.cpp:542] ctx_output4 -> ctx_output4
I1106 16:38:09.119917 13608 net.cpp:260] Setting up ctx_output4
I1106 16:38:09.119932 13608 net.cpp:267] TEST Top shape for layer 53 'ctx_output4' 10 256 3 6 (46080)
I1106 16:38:09.119941 13608 layer_factory.hpp:172] Creating layer 'ctx_output4/relu' of type 'ReLU'
I1106 16:38:09.119947 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.119954 13608 net.cpp:200] Created Layer ctx_output4/relu (54)
I1106 16:38:09.119961 13608 net.cpp:572] ctx_output4/relu <- ctx_output4
I1106 16:38:09.119966 13608 net.cpp:527] ctx_output4/relu -> ctx_output4 (in-place)
I1106 16:38:09.119974 13608 net.cpp:260] Setting up ctx_output4/relu
I1106 16:38:09.119982 13608 net.cpp:267] TEST Top shape for layer 54 'ctx_output4/relu' 10 256 3 6 (46080)
I1106 16:38:09.119987 13608 layer_factory.hpp:172] Creating layer 'ctx_output4_ctx_output4/relu_0_split' of type 'Split'
I1106 16:38:09.119993 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.119999 13608 net.cpp:200] Created Layer ctx_output4_ctx_output4/relu_0_split (55)
I1106 16:38:09.120005 13608 net.cpp:572] ctx_output4_ctx_output4/relu_0_split <- ctx_output4
I1106 16:38:09.120010 13608 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_0
I1106 16:38:09.120018 13608 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_1
I1106 16:38:09.120025 13608 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_2
I1106 16:38:09.120059 13608 net.cpp:260] Setting up ctx_output4_ctx_output4/relu_0_split
I1106 16:38:09.120069 13608 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 10 256 3 6 (46080)
I1106 16:38:09.120075 13608 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 10 256 3 6 (46080)
I1106 16:38:09.120080 13608 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 10 256 3 6 (46080)
I1106 16:38:09.120086 13608 layer_factory.hpp:172] Creating layer 'ctx_output5' of type
'Convolution' I1106 16:38:09.120091 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.120103 13608 net.cpp:200] Created Layer ctx_output5 (56) I1106 16:38:09.120110 13608 net.cpp:572] ctx_output5 <- pool8 I1106 16:38:09.120116 13608 net.cpp:542] ctx_output5 -> ctx_output5 I1106 16:38:09.121155 13608 net.cpp:260] Setting up ctx_output5 I1106 16:38:09.121167 13608 net.cpp:267] TEST Top shape for layer 56 'ctx_output5' 10 256 2 3 (15360) I1106 16:38:09.121176 13608 layer_factory.hpp:172] Creating layer 'ctx_output5/relu' of type 'ReLU' I1106 16:38:09.121181 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.121194 13608 net.cpp:200] Created Layer ctx_output5/relu (57) I1106 16:38:09.121201 13608 net.cpp:572] ctx_output5/relu <- ctx_output5 I1106 16:38:09.121213 13608 net.cpp:527] ctx_output5/relu -> ctx_output5 (in-place) I1106 16:38:09.121222 13608 net.cpp:260] Setting up ctx_output5/relu I1106 16:38:09.121228 13608 net.cpp:267] TEST Top shape for layer 57 'ctx_output5/relu' 10 256 2 3 (15360) I1106 16:38:09.121233 13608 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc' of type 'Convolution' I1106 16:38:09.121239 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.121253 13608 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc (58) I1106 16:38:09.121260 13608 net.cpp:572] ctx_output1/relu_mbox_loc <- ctx_output1_ctx_output1/relu_0_split_0 I1106 16:38:09.121268 13608 net.cpp:542] ctx_output1/relu_mbox_loc -> ctx_output1/relu_mbox_loc I1106 16:38:09.121439 13608 net.cpp:260] Setting up ctx_output1/relu_mbox_loc I1106 16:38:09.121446 13608 net.cpp:267] TEST Top shape for layer 58 'ctx_output1/relu_mbox_loc' 10 16 20 48 (153600) I1106 16:38:09.121450 13608 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:09.121454 13608 
layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.121462 13608 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_perm (59) I1106 16:38:09.121465 13608 net.cpp:572] ctx_output1/relu_mbox_loc_perm <- ctx_output1/relu_mbox_loc I1106 16:38:09.121469 13608 net.cpp:542] ctx_output1/relu_mbox_loc_perm -> ctx_output1/relu_mbox_loc_perm I1106 16:38:09.121538 13608 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_perm I1106 16:38:09.121546 13608 net.cpp:267] TEST Top shape for layer 59 'ctx_output1/relu_mbox_loc_perm' 10 20 48 16 (153600) I1106 16:38:09.121547 13608 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:09.121551 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.121554 13608 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_flat (60) I1106 16:38:09.121562 13608 net.cpp:572] ctx_output1/relu_mbox_loc_flat <- ctx_output1/relu_mbox_loc_perm I1106 16:38:09.121567 13608 net.cpp:542] ctx_output1/relu_mbox_loc_flat -> ctx_output1/relu_mbox_loc_flat I1106 16:38:09.122157 13608 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_flat I1106 16:38:09.122166 13608 net.cpp:267] TEST Top shape for layer 60 'ctx_output1/relu_mbox_loc_flat' 10 15360 (153600) I1106 16:38:09.122170 13608 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf' of type 'Convolution' I1106 16:38:09.122180 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.122191 13608 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf (61) I1106 16:38:09.122195 13608 net.cpp:572] ctx_output1/relu_mbox_conf <- ctx_output1_ctx_output1/relu_0_split_1 I1106 16:38:09.122198 13608 net.cpp:542] ctx_output1/relu_mbox_conf -> ctx_output1/relu_mbox_conf I1106 16:38:09.122368 13608 net.cpp:260] Setting up ctx_output1/relu_mbox_conf I1106 16:38:09.122376 13608 net.cpp:267] TEST Top shape 
for layer 61 'ctx_output1/relu_mbox_conf' 10 8 20 48 (76800) I1106 16:38:09.122381 13608 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:09.122385 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.122390 13608 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_perm (62) I1106 16:38:09.122400 13608 net.cpp:572] ctx_output1/relu_mbox_conf_perm <- ctx_output1/relu_mbox_conf I1106 16:38:09.122403 13608 net.cpp:542] ctx_output1/relu_mbox_conf_perm -> ctx_output1/relu_mbox_conf_perm I1106 16:38:09.122458 13608 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_perm I1106 16:38:09.122462 13608 net.cpp:267] TEST Top shape for layer 62 'ctx_output1/relu_mbox_conf_perm' 10 20 48 8 (76800) I1106 16:38:09.122467 13608 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:09.122476 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.122481 13608 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_flat (63) I1106 16:38:09.122484 13608 net.cpp:572] ctx_output1/relu_mbox_conf_flat <- ctx_output1/relu_mbox_conf_perm I1106 16:38:09.122488 13608 net.cpp:542] ctx_output1/relu_mbox_conf_flat -> ctx_output1/relu_mbox_conf_flat I1106 16:38:09.122545 13608 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_flat I1106 16:38:09.122548 13608 net.cpp:267] TEST Top shape for layer 63 'ctx_output1/relu_mbox_conf_flat' 10 7680 (76800) I1106 16:38:09.122551 13608 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:09.122561 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.122573 13608 net.cpp:200] Created Layer ctx_output1/relu_mbox_priorbox (64) I1106 16:38:09.122576 13608 net.cpp:572] ctx_output1/relu_mbox_priorbox <- ctx_output1_ctx_output1/relu_0_split_2 I1106 
16:38:09.122581 13608 net.cpp:572] ctx_output1/relu_mbox_priorbox <- data_data_0_split_1 I1106 16:38:09.122584 13608 net.cpp:542] ctx_output1/relu_mbox_priorbox -> ctx_output1/relu_mbox_priorbox I1106 16:38:09.122601 13608 net.cpp:260] Setting up ctx_output1/relu_mbox_priorbox I1106 16:38:09.122606 13608 net.cpp:267] TEST Top shape for layer 64 'ctx_output1/relu_mbox_priorbox' 1 2 15360 (30720) I1106 16:38:09.122609 13608 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc' of type 'Convolution' I1106 16:38:09.122611 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.122625 13608 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc (65) I1106 16:38:09.122629 13608 net.cpp:572] ctx_output2/relu_mbox_loc <- ctx_output2_ctx_output2/relu_0_split_0 I1106 16:38:09.122632 13608 net.cpp:542] ctx_output2/relu_mbox_loc -> ctx_output2/relu_mbox_loc I1106 16:38:09.122838 13608 net.cpp:260] Setting up ctx_output2/relu_mbox_loc I1106 16:38:09.122843 13608 net.cpp:267] TEST Top shape for layer 65 'ctx_output2/relu_mbox_loc' 10 24 10 24 (57600) I1106 16:38:09.122848 13608 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:09.122856 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.122864 13608 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_perm (66) I1106 16:38:09.122867 13608 net.cpp:572] ctx_output2/relu_mbox_loc_perm <- ctx_output2/relu_mbox_loc I1106 16:38:09.122870 13608 net.cpp:542] ctx_output2/relu_mbox_loc_perm -> ctx_output2/relu_mbox_loc_perm I1106 16:38:09.122927 13608 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_perm I1106 16:38:09.122932 13608 net.cpp:267] TEST Top shape for layer 66 'ctx_output2/relu_mbox_loc_perm' 10 10 24 24 (57600) I1106 16:38:09.122934 13608 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:09.122942 13608 
layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.122947 13608 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_flat (67) I1106 16:38:09.122951 13608 net.cpp:572] ctx_output2/relu_mbox_loc_flat <- ctx_output2/relu_mbox_loc_perm I1106 16:38:09.122953 13608 net.cpp:542] ctx_output2/relu_mbox_loc_flat -> ctx_output2/relu_mbox_loc_flat I1106 16:38:09.122997 13608 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_flat I1106 16:38:09.123001 13608 net.cpp:267] TEST Top shape for layer 67 'ctx_output2/relu_mbox_loc_flat' 10 5760 (57600) I1106 16:38:09.123005 13608 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf' of type 'Convolution' I1106 16:38:09.123008 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.123016 13608 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf (68) I1106 16:38:09.123019 13608 net.cpp:572] ctx_output2/relu_mbox_conf <- ctx_output2_ctx_output2/relu_0_split_1 I1106 16:38:09.123029 13608 net.cpp:542] ctx_output2/relu_mbox_conf -> ctx_output2/relu_mbox_conf I1106 16:38:09.123198 13608 net.cpp:260] Setting up ctx_output2/relu_mbox_conf I1106 16:38:09.123203 13608 net.cpp:267] TEST Top shape for layer 68 'ctx_output2/relu_mbox_conf' 10 12 10 24 (28800) I1106 16:38:09.123208 13608 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:09.123212 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.123217 13608 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_perm (69) I1106 16:38:09.123220 13608 net.cpp:572] ctx_output2/relu_mbox_conf_perm <- ctx_output2/relu_mbox_conf I1106 16:38:09.123224 13608 net.cpp:542] ctx_output2/relu_mbox_conf_perm -> ctx_output2/relu_mbox_conf_perm I1106 16:38:09.123287 13608 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_perm I1106 16:38:09.123292 13608 net.cpp:267] TEST Top shape 
for layer 69 'ctx_output2/relu_mbox_conf_perm' 10 10 24 12 (28800) I1106 16:38:09.123294 13608 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:09.123297 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.123301 13608 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_flat (70) I1106 16:38:09.123304 13608 net.cpp:572] ctx_output2/relu_mbox_conf_flat <- ctx_output2/relu_mbox_conf_perm I1106 16:38:09.123306 13608 net.cpp:542] ctx_output2/relu_mbox_conf_flat -> ctx_output2/relu_mbox_conf_flat I1106 16:38:09.123343 13608 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_flat I1106 16:38:09.123348 13608 net.cpp:267] TEST Top shape for layer 70 'ctx_output2/relu_mbox_conf_flat' 10 2880 (28800) I1106 16:38:09.123351 13608 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:09.123354 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.123358 13608 net.cpp:200] Created Layer ctx_output2/relu_mbox_priorbox (71) I1106 16:38:09.123363 13608 net.cpp:572] ctx_output2/relu_mbox_priorbox <- ctx_output2_ctx_output2/relu_0_split_2 I1106 16:38:09.123365 13608 net.cpp:572] ctx_output2/relu_mbox_priorbox <- data_data_0_split_2 I1106 16:38:09.123368 13608 net.cpp:542] ctx_output2/relu_mbox_priorbox -> ctx_output2/relu_mbox_priorbox I1106 16:38:09.123384 13608 net.cpp:260] Setting up ctx_output2/relu_mbox_priorbox I1106 16:38:09.123389 13608 net.cpp:267] TEST Top shape for layer 71 'ctx_output2/relu_mbox_priorbox' 1 2 5760 (11520) I1106 16:38:09.123390 13608 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc' of type 'Convolution' I1106 16:38:09.123394 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.123402 13608 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc (72) I1106 16:38:09.123405 
13608 net.cpp:572] ctx_output3/relu_mbox_loc <- ctx_output3_ctx_output3/relu_0_split_0 I1106 16:38:09.123409 13608 net.cpp:542] ctx_output3/relu_mbox_loc -> ctx_output3/relu_mbox_loc I1106 16:38:09.123603 13608 net.cpp:260] Setting up ctx_output3/relu_mbox_loc I1106 16:38:09.123610 13608 net.cpp:267] TEST Top shape for layer 72 'ctx_output3/relu_mbox_loc' 10 24 5 12 (14400) I1106 16:38:09.123615 13608 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:09.123617 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.123622 13608 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_perm (73) I1106 16:38:09.123625 13608 net.cpp:572] ctx_output3/relu_mbox_loc_perm <- ctx_output3/relu_mbox_loc I1106 16:38:09.123628 13608 net.cpp:542] ctx_output3/relu_mbox_loc_perm -> ctx_output3/relu_mbox_loc_perm I1106 16:38:09.123695 13608 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_perm I1106 16:38:09.123700 13608 net.cpp:267] TEST Top shape for layer 73 'ctx_output3/relu_mbox_loc_perm' 10 5 12 24 (14400) I1106 16:38:09.123710 13608 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:09.123713 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.123716 13608 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_flat (74) I1106 16:38:09.123720 13608 net.cpp:572] ctx_output3/relu_mbox_loc_flat <- ctx_output3/relu_mbox_loc_perm I1106 16:38:09.123723 13608 net.cpp:542] ctx_output3/relu_mbox_loc_flat -> ctx_output3/relu_mbox_loc_flat I1106 16:38:09.123765 13608 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_flat I1106 16:38:09.123769 13608 net.cpp:267] TEST Top shape for layer 74 'ctx_output3/relu_mbox_loc_flat' 10 1440 (14400) I1106 16:38:09.123773 13608 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf' of type 'Convolution' I1106 16:38:09.123775 13608 
layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.123783 13608 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf (75) I1106 16:38:09.123787 13608 net.cpp:572] ctx_output3/relu_mbox_conf <- ctx_output3_ctx_output3/relu_0_split_1 I1106 16:38:09.123790 13608 net.cpp:542] ctx_output3/relu_mbox_conf -> ctx_output3/relu_mbox_conf I1106 16:38:09.123953 13608 net.cpp:260] Setting up ctx_output3/relu_mbox_conf I1106 16:38:09.123960 13608 net.cpp:267] TEST Top shape for layer 75 'ctx_output3/relu_mbox_conf' 10 12 5 12 (7200) I1106 16:38:09.123965 13608 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:09.123967 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.123972 13608 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_perm (76) I1106 16:38:09.123982 13608 net.cpp:572] ctx_output3/relu_mbox_conf_perm <- ctx_output3/relu_mbox_conf I1106 16:38:09.123986 13608 net.cpp:542] ctx_output3/relu_mbox_conf_perm -> ctx_output3/relu_mbox_conf_perm I1106 16:38:09.124042 13608 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_perm I1106 16:38:09.124047 13608 net.cpp:267] TEST Top shape for layer 76 'ctx_output3/relu_mbox_conf_perm' 10 5 12 12 (7200) I1106 16:38:09.124049 13608 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:09.124052 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.124056 13608 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_flat (77) I1106 16:38:09.124059 13608 net.cpp:572] ctx_output3/relu_mbox_conf_flat <- ctx_output3/relu_mbox_conf_perm I1106 16:38:09.124063 13608 net.cpp:542] ctx_output3/relu_mbox_conf_flat -> ctx_output3/relu_mbox_conf_flat I1106 16:38:09.124099 13608 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_flat I1106 16:38:09.124104 13608 net.cpp:267] TEST Top 
shape for layer 77 'ctx_output3/relu_mbox_conf_flat' 10 720 (7200) I1106 16:38:09.124107 13608 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:09.124110 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.124114 13608 net.cpp:200] Created Layer ctx_output3/relu_mbox_priorbox (78) I1106 16:38:09.124116 13608 net.cpp:572] ctx_output3/relu_mbox_priorbox <- ctx_output3_ctx_output3/relu_0_split_2 I1106 16:38:09.124119 13608 net.cpp:572] ctx_output3/relu_mbox_priorbox <- data_data_0_split_3 I1106 16:38:09.124123 13608 net.cpp:542] ctx_output3/relu_mbox_priorbox -> ctx_output3/relu_mbox_priorbox I1106 16:38:09.124138 13608 net.cpp:260] Setting up ctx_output3/relu_mbox_priorbox I1106 16:38:09.124141 13608 net.cpp:267] TEST Top shape for layer 78 'ctx_output3/relu_mbox_priorbox' 1 2 1440 (2880) I1106 16:38:09.124145 13608 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc' of type 'Convolution' I1106 16:38:09.124147 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.124156 13608 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc (79) I1106 16:38:09.124166 13608 net.cpp:572] ctx_output4/relu_mbox_loc <- ctx_output4_ctx_output4/relu_0_split_0 I1106 16:38:09.124168 13608 net.cpp:542] ctx_output4/relu_mbox_loc -> ctx_output4/relu_mbox_loc I1106 16:38:09.124339 13608 net.cpp:260] Setting up ctx_output4/relu_mbox_loc I1106 16:38:09.124346 13608 net.cpp:267] TEST Top shape for layer 79 'ctx_output4/relu_mbox_loc' 10 16 3 6 (2880) I1106 16:38:09.124357 13608 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:09.124364 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.124373 13608 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_perm (80) I1106 16:38:09.124379 13608 net.cpp:572] 
ctx_output4/relu_mbox_loc_perm <- ctx_output4/relu_mbox_loc I1106 16:38:09.124387 13608 net.cpp:542] ctx_output4/relu_mbox_loc_perm -> ctx_output4/relu_mbox_loc_perm I1106 16:38:09.124444 13608 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_perm I1106 16:38:09.124454 13608 net.cpp:267] TEST Top shape for layer 80 'ctx_output4/relu_mbox_loc_perm' 10 3 6 16 (2880) I1106 16:38:09.124461 13608 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:09.124466 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.124472 13608 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_flat (81) I1106 16:38:09.124480 13608 net.cpp:572] ctx_output4/relu_mbox_loc_flat <- ctx_output4/relu_mbox_loc_perm I1106 16:38:09.124485 13608 net.cpp:542] ctx_output4/relu_mbox_loc_flat -> ctx_output4/relu_mbox_loc_flat I1106 16:38:09.124526 13608 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_flat I1106 16:38:09.124536 13608 net.cpp:267] TEST Top shape for layer 81 'ctx_output4/relu_mbox_loc_flat' 10 288 (2880) I1106 16:38:09.124541 13608 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf' of type 'Convolution' I1106 16:38:09.124547 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.124562 13608 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf (82) I1106 16:38:09.124567 13608 net.cpp:572] ctx_output4/relu_mbox_conf <- ctx_output4_ctx_output4/relu_0_split_1 I1106 16:38:09.124573 13608 net.cpp:542] ctx_output4/relu_mbox_conf -> ctx_output4/relu_mbox_conf I1106 16:38:09.124738 13608 net.cpp:260] Setting up ctx_output4/relu_mbox_conf I1106 16:38:09.124749 13608 net.cpp:267] TEST Top shape for layer 82 'ctx_output4/relu_mbox_conf' 10 8 3 6 (1440) I1106 16:38:09.124756 13608 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:09.124761 13608 layer_factory.hpp:184] 
Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.124771 13608 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf_perm (83) I1106 16:38:09.124778 13608 net.cpp:572] ctx_output4/relu_mbox_conf_perm <- ctx_output4/relu_mbox_conf I1106 16:38:09.124783 13608 net.cpp:542] ctx_output4/relu_mbox_conf_perm -> ctx_output4/relu_mbox_conf_perm I1106 16:38:09.124848 13608 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_perm I1106 16:38:09.124857 13608 net.cpp:267] TEST Top shape for layer 83 'ctx_output4/relu_mbox_conf_perm' 10 3 6 8 (1440) I1106 16:38:09.124863 13608 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:09.124869 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.124876 13608 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf_flat (84) I1106 16:38:09.124881 13608 net.cpp:572] ctx_output4/relu_mbox_conf_flat <- ctx_output4/relu_mbox_conf_perm I1106 16:38:09.124888 13608 net.cpp:542] ctx_output4/relu_mbox_conf_flat -> ctx_output4/relu_mbox_conf_flat I1106 16:38:09.124928 13608 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_flat I1106 16:38:09.124938 13608 net.cpp:267] TEST Top shape for layer 84 'ctx_output4/relu_mbox_conf_flat' 10 144 (1440) I1106 16:38:09.124953 13608 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:09.124958 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.124966 13608 net.cpp:200] Created Layer ctx_output4/relu_mbox_priorbox (85) I1106 16:38:09.124974 13608 net.cpp:572] ctx_output4/relu_mbox_priorbox <- ctx_output4_ctx_output4/relu_0_split_2 I1106 16:38:09.124980 13608 net.cpp:572] ctx_output4/relu_mbox_priorbox <- data_data_0_split_4 I1106 16:38:09.124986 13608 net.cpp:542] ctx_output4/relu_mbox_priorbox -> ctx_output4/relu_mbox_priorbox I1106 16:38:09.125005 13608 net.cpp:260] 
Setting up ctx_output4/relu_mbox_priorbox I1106 16:38:09.125013 13608 net.cpp:267] TEST Top shape for layer 85 'ctx_output4/relu_mbox_priorbox' 1 2 288 (576) I1106 16:38:09.125020 13608 layer_factory.hpp:172] Creating layer 'mbox_loc' of type 'Concat' I1106 16:38:09.125025 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.125035 13608 net.cpp:200] Created Layer mbox_loc (86) I1106 16:38:09.125041 13608 net.cpp:572] mbox_loc <- ctx_output1/relu_mbox_loc_flat I1106 16:38:09.125047 13608 net.cpp:572] mbox_loc <- ctx_output2/relu_mbox_loc_flat I1106 16:38:09.125054 13608 net.cpp:572] mbox_loc <- ctx_output3/relu_mbox_loc_flat I1106 16:38:09.125059 13608 net.cpp:572] mbox_loc <- ctx_output4/relu_mbox_loc_flat I1106 16:38:09.125066 13608 net.cpp:542] mbox_loc -> mbox_loc I1106 16:38:09.125084 13608 net.cpp:260] Setting up mbox_loc I1106 16:38:09.125092 13608 net.cpp:267] TEST Top shape for layer 86 'mbox_loc' 10 22848 (228480) I1106 16:38:09.125097 13608 layer_factory.hpp:172] Creating layer 'mbox_conf' of type 'Concat' I1106 16:38:09.125098 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.125102 13608 net.cpp:200] Created Layer mbox_conf (87) I1106 16:38:09.125109 13608 net.cpp:572] mbox_conf <- ctx_output1/relu_mbox_conf_flat I1106 16:38:09.125116 13608 net.cpp:572] mbox_conf <- ctx_output2/relu_mbox_conf_flat I1106 16:38:09.125123 13608 net.cpp:572] mbox_conf <- ctx_output3/relu_mbox_conf_flat I1106 16:38:09.125129 13608 net.cpp:572] mbox_conf <- ctx_output4/relu_mbox_conf_flat I1106 16:38:09.125134 13608 net.cpp:542] mbox_conf -> mbox_conf I1106 16:38:09.125149 13608 net.cpp:260] Setting up mbox_conf I1106 16:38:09.125152 13608 net.cpp:267] TEST Top shape for layer 87 'mbox_conf' 10 11424 (114240) I1106 16:38:09.125154 13608 layer_factory.hpp:172] Creating layer 'mbox_priorbox' of type 'Concat' I1106 16:38:09.125156 13608 layer_factory.hpp:184] 
Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.125161 13608 net.cpp:200] Created Layer mbox_priorbox (88) I1106 16:38:09.125164 13608 net.cpp:572] mbox_priorbox <- ctx_output1/relu_mbox_priorbox I1106 16:38:09.125166 13608 net.cpp:572] mbox_priorbox <- ctx_output2/relu_mbox_priorbox I1106 16:38:09.125174 13608 net.cpp:572] mbox_priorbox <- ctx_output3/relu_mbox_priorbox I1106 16:38:09.125180 13608 net.cpp:572] mbox_priorbox <- ctx_output4/relu_mbox_priorbox I1106 16:38:09.125186 13608 net.cpp:542] mbox_priorbox -> mbox_priorbox I1106 16:38:09.125205 13608 net.cpp:260] Setting up mbox_priorbox I1106 16:38:09.125212 13608 net.cpp:267] TEST Top shape for layer 88 'mbox_priorbox' 1 2 22848 (45696) I1106 16:38:09.125218 13608 layer_factory.hpp:172] Creating layer 'mbox_conf_reshape' of type 'Reshape' I1106 16:38:09.125224 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.125236 13608 net.cpp:200] Created Layer mbox_conf_reshape (89) I1106 16:38:09.125243 13608 net.cpp:572] mbox_conf_reshape <- mbox_conf I1106 16:38:09.125249 13608 net.cpp:542] mbox_conf_reshape -> mbox_conf_reshape I1106 16:38:09.125270 13608 net.cpp:260] Setting up mbox_conf_reshape I1106 16:38:09.125279 13608 net.cpp:267] TEST Top shape for layer 89 'mbox_conf_reshape' 10 5712 2 (114240) I1106 16:38:09.125288 13608 layer_factory.hpp:172] Creating layer 'mbox_conf_softmax' of type 'Softmax' I1106 16:38:09.125298 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:09.125314 13608 net.cpp:200] Created Layer mbox_conf_softmax (90) I1106 16:38:09.125321 13608 net.cpp:572] mbox_conf_softmax <- mbox_conf_reshape I1106 16:38:09.125329 13608 net.cpp:542] mbox_conf_softmax -> mbox_conf_softmax I1106 16:38:09.125370 13608 net.cpp:260] Setting up mbox_conf_softmax I1106 16:38:09.125377 13608 net.cpp:267] TEST Top shape for layer 90 'mbox_conf_softmax' 10 5712 2 
(114240)
I1106 16:38:09.125385 13608 layer_factory.hpp:172] Creating layer 'mbox_conf_flatten' of type 'Flatten'
I1106 16:38:09.125391 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.125397 13608 net.cpp:200] Created Layer mbox_conf_flatten (91)
I1106 16:38:09.125403 13608 net.cpp:572] mbox_conf_flatten <- mbox_conf_softmax
I1106 16:38:09.125411 13608 net.cpp:542] mbox_conf_flatten -> mbox_conf_flatten
I1106 16:38:09.125479 13608 net.cpp:260] Setting up mbox_conf_flatten
I1106 16:38:09.125489 13608 net.cpp:267] TEST Top shape for layer 91 'mbox_conf_flatten' 10 11424 (114240)
I1106 16:38:09.125495 13608 layer_factory.hpp:172] Creating layer 'detection_out' of type 'DetectionOutput'
I1106 16:38:09.125502 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.125521 13608 net.cpp:200] Created Layer detection_out (92)
I1106 16:38:09.125527 13608 net.cpp:572] detection_out <- mbox_loc
I1106 16:38:09.125535 13608 net.cpp:572] detection_out <- mbox_conf_flatten
I1106 16:38:09.125542 13608 net.cpp:572] detection_out <- mbox_priorbox
I1106 16:38:09.125548 13608 net.cpp:542] detection_out -> detection_out
I1106 16:38:09.125655 13608 net.cpp:260] Setting up detection_out
I1106 16:38:09.125667 13608 net.cpp:267] TEST Top shape for layer 92 'detection_out' 1 1 1 7 (7)
I1106 16:38:09.125674 13608 layer_factory.hpp:172] Creating layer 'detection_eval' of type 'DetectionEvaluate'
I1106 16:38:09.125679 13608 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.125685 13608 net.cpp:200] Created Layer detection_eval (93)
I1106 16:38:09.125692 13608 net.cpp:572] detection_eval <- detection_out
I1106 16:38:09.125699 13608 net.cpp:572] detection_eval <- label
I1106 16:38:09.125705 13608 net.cpp:542] detection_eval -> detection_eval
I1106 16:38:09.125746 13608 net.cpp:260] Setting up detection_eval
I1106 16:38:09.125756 13608 net.cpp:267] TEST Top shape for layer 93 'detection_eval' 1 1 2 5 (10)
I1106 16:38:09.125762 13608 net.cpp:338] detection_eval does not need backward computation.
I1106 16:38:09.125768 13608 net.cpp:338] detection_out does not need backward computation.
I1106 16:38:09.125774 13608 net.cpp:338] mbox_conf_flatten does not need backward computation.
I1106 16:38:09.125779 13608 net.cpp:338] mbox_conf_softmax does not need backward computation.
I1106 16:38:09.125785 13608 net.cpp:338] mbox_conf_reshape does not need backward computation.
I1106 16:38:09.125792 13608 net.cpp:338] mbox_priorbox does not need backward computation.
I1106 16:38:09.125797 13608 net.cpp:338] mbox_conf does not need backward computation.
I1106 16:38:09.125803 13608 net.cpp:338] mbox_loc does not need backward computation.
I1106 16:38:09.125810 13608 net.cpp:338] ctx_output4/relu_mbox_priorbox does not need backward computation.
I1106 16:38:09.125818 13608 net.cpp:338] ctx_output4/relu_mbox_conf_flat does not need backward computation.
I1106 16:38:09.125823 13608 net.cpp:338] ctx_output4/relu_mbox_conf_perm does not need backward computation.
I1106 16:38:09.125828 13608 net.cpp:338] ctx_output4/relu_mbox_conf does not need backward computation.
I1106 16:38:09.125834 13608 net.cpp:338] ctx_output4/relu_mbox_loc_flat does not need backward computation.
I1106 16:38:09.125840 13608 net.cpp:338] ctx_output4/relu_mbox_loc_perm does not need backward computation.
I1106 16:38:09.125849 13608 net.cpp:338] ctx_output4/relu_mbox_loc does not need backward computation.
I1106 16:38:09.125859 13608 net.cpp:338] ctx_output3/relu_mbox_priorbox does not need backward computation.
I1106 16:38:09.125865 13608 net.cpp:338] ctx_output3/relu_mbox_conf_flat does not need backward computation.
I1106 16:38:09.125870 13608 net.cpp:338] ctx_output3/relu_mbox_conf_perm does not need backward computation.
I1106 16:38:09.125874 13608 net.cpp:338] ctx_output3/relu_mbox_conf does not need backward computation.
I1106 16:38:09.125875 13608 net.cpp:338] ctx_output3/relu_mbox_loc_flat does not need backward computation.
I1106 16:38:09.125877 13608 net.cpp:338] ctx_output3/relu_mbox_loc_perm does not need backward computation.
I1106 16:38:09.125880 13608 net.cpp:338] ctx_output3/relu_mbox_loc does not need backward computation.
I1106 16:38:09.125887 13608 net.cpp:338] ctx_output2/relu_mbox_priorbox does not need backward computation.
I1106 16:38:09.125893 13608 net.cpp:338] ctx_output2/relu_mbox_conf_flat does not need backward computation.
I1106 16:38:09.125900 13608 net.cpp:338] ctx_output2/relu_mbox_conf_perm does not need backward computation.
I1106 16:38:09.125903 13608 net.cpp:338] ctx_output2/relu_mbox_conf does not need backward computation.
I1106 16:38:09.125908 13608 net.cpp:338] ctx_output2/relu_mbox_loc_flat does not need backward computation.
I1106 16:38:09.125913 13608 net.cpp:338] ctx_output2/relu_mbox_loc_perm does not need backward computation.
I1106 16:38:09.125919 13608 net.cpp:338] ctx_output2/relu_mbox_loc does not need backward computation.
I1106 16:38:09.125924 13608 net.cpp:338] ctx_output1/relu_mbox_priorbox does not need backward computation.
I1106 16:38:09.125931 13608 net.cpp:338] ctx_output1/relu_mbox_conf_flat does not need backward computation.
I1106 16:38:09.125936 13608 net.cpp:338] ctx_output1/relu_mbox_conf_perm does not need backward computation.
I1106 16:38:09.125941 13608 net.cpp:338] ctx_output1/relu_mbox_conf does not need backward computation.
I1106 16:38:09.125946 13608 net.cpp:338] ctx_output1/relu_mbox_loc_flat does not need backward computation.
I1106 16:38:09.125950 13608 net.cpp:338] ctx_output1/relu_mbox_loc_perm does not need backward computation.
I1106 16:38:09.125955 13608 net.cpp:338] ctx_output1/relu_mbox_loc does not need backward computation.
I1106 16:38:09.125960 13608 net.cpp:338] ctx_output5/relu does not need backward computation.
I1106 16:38:09.125967 13608 net.cpp:338] ctx_output5 does not need backward computation.
I1106 16:38:09.125969 13608 net.cpp:338] ctx_output4_ctx_output4/relu_0_split does not need backward computation.
I1106 16:38:09.125972 13608 net.cpp:338] ctx_output4/relu does not need backward computation.
I1106 16:38:09.125978 13608 net.cpp:338] ctx_output4 does not need backward computation.
I1106 16:38:09.125982 13608 net.cpp:338] ctx_output3_ctx_output3/relu_0_split does not need backward computation.
I1106 16:38:09.125988 13608 net.cpp:338] ctx_output3/relu does not need backward computation.
I1106 16:38:09.125994 13608 net.cpp:338] ctx_output3 does not need backward computation.
I1106 16:38:09.126000 13608 net.cpp:338] ctx_output2_ctx_output2/relu_0_split does not need backward computation.
I1106 16:38:09.126005 13608 net.cpp:338] ctx_output2/relu does not need backward computation.
I1106 16:38:09.126011 13608 net.cpp:338] ctx_output2 does not need backward computation.
I1106 16:38:09.126016 13608 net.cpp:338] ctx_output1_ctx_output1/relu_0_split does not need backward computation.
I1106 16:38:09.126022 13608 net.cpp:338] ctx_output1/relu does not need backward computation.
I1106 16:38:09.126027 13608 net.cpp:338] ctx_output1 does not need backward computation.
I1106 16:38:09.126034 13608 net.cpp:338] pool8 does not need backward computation.
I1106 16:38:09.126036 13608 net.cpp:338] pool7_pool7_0_split does not need backward computation.
I1106 16:38:09.126039 13608 net.cpp:338] pool7 does not need backward computation.
I1106 16:38:09.126045 13608 net.cpp:338] pool6_pool6_0_split does not need backward computation.
I1106 16:38:09.126049 13608 net.cpp:338] pool6 does not need backward computation.
I1106 16:38:09.126055 13608 net.cpp:338] res5a_branch2b_res5a_branch2b/relu_0_split does not need backward computation.
I1106 16:38:09.126062 13608 net.cpp:338] res5a_branch2b/relu does not need backward computation.
I1106 16:38:09.126070 13608 net.cpp:338] res5a_branch2b/bn does not need backward computation.
I1106 16:38:09.126071 13608 net.cpp:338] res5a_branch2b does not need backward computation.
I1106 16:38:09.126075 13608 net.cpp:338] res5a_branch2a/relu does not need backward computation.
I1106 16:38:09.126080 13608 net.cpp:338] res5a_branch2a/bn does not need backward computation.
I1106 16:38:09.126083 13608 net.cpp:338] res5a_branch2a does not need backward computation.
I1106 16:38:09.126085 13608 net.cpp:338] pool4 does not need backward computation.
I1106 16:38:09.126087 13608 net.cpp:338] res4a_branch2b_res4a_branch2b/relu_0_split does not need backward computation.
I1106 16:38:09.126096 13608 net.cpp:338] res4a_branch2b/relu does not need backward computation.
I1106 16:38:09.126098 13608 net.cpp:338] res4a_branch2b/bn does not need backward computation.
I1106 16:38:09.126101 13608 net.cpp:338] res4a_branch2b does not need backward computation.
I1106 16:38:09.126102 13608 net.cpp:338] res4a_branch2a/relu does not need backward computation.
I1106 16:38:09.126111 13608 net.cpp:338] res4a_branch2a/bn does not need backward computation.
I1106 16:38:09.126113 13608 net.cpp:338] res4a_branch2a does not need backward computation.
I1106 16:38:09.126116 13608 net.cpp:338] pool3 does not need backward computation.
I1106 16:38:09.126118 13608 net.cpp:338] res3a_branch2b/relu does not need backward computation.
I1106 16:38:09.126121 13608 net.cpp:338] res3a_branch2b/bn does not need backward computation.
I1106 16:38:09.126124 13608 net.cpp:338] res3a_branch2b does not need backward computation.
I1106 16:38:09.126127 13608 net.cpp:338] res3a_branch2a/relu does not need backward computation.
I1106 16:38:09.126128 13608 net.cpp:338] res3a_branch2a/bn does not need backward computation.
I1106 16:38:09.126132 13608 net.cpp:338] res3a_branch2a does not need backward computation.
I1106 16:38:09.126133 13608 net.cpp:338] pool2 does not need backward computation.
I1106 16:38:09.126137 13608 net.cpp:338] res2a_branch2b/relu does not need backward computation.
I1106 16:38:09.126138 13608 net.cpp:338] res2a_branch2b/bn does not need backward computation.
I1106 16:38:09.126142 13608 net.cpp:338] res2a_branch2b does not need backward computation.
I1106 16:38:09.126144 13608 net.cpp:338] res2a_branch2a/relu does not need backward computation.
I1106 16:38:09.126145 13608 net.cpp:338] res2a_branch2a/bn does not need backward computation.
I1106 16:38:09.126148 13608 net.cpp:338] res2a_branch2a does not need backward computation.
I1106 16:38:09.126152 13608 net.cpp:338] pool1 does not need backward computation.
I1106 16:38:09.126153 13608 net.cpp:338] conv1b/relu does not need backward computation.
I1106 16:38:09.126157 13608 net.cpp:338] conv1b/bn does not need backward computation.
I1106 16:38:09.126158 13608 net.cpp:338] conv1b does not need backward computation.
I1106 16:38:09.126161 13608 net.cpp:338] conv1a/relu does not need backward computation.
I1106 16:38:09.126164 13608 net.cpp:338] conv1a/bn does not need backward computation.
I1106 16:38:09.126166 13608 net.cpp:338] conv1a does not need backward computation.
I1106 16:38:09.126168 13608 net.cpp:338] data/bias does not need backward computation.
I1106 16:38:09.126173 13608 net.cpp:338] data_data_0_split does not need backward computation.
I1106 16:38:09.126175 13608 net.cpp:338] data does not need backward computation.
I1106 16:38:09.126176 13608 net.cpp:380] This network produces output ctx_output5
I1106 16:38:09.126179 13608 net.cpp:380] This network produces output detection_eval
I1106 16:38:09.126251 13608 net.cpp:403] Top memory (TEST) required for data: 1264712584 diff: 1264712584
I1106 16:38:09.126255 13608 net.cpp:406] Bottom memory (TEST) required for data: 1264651104 diff: 1264651104
I1106 16:38:09.126257 13608 net.cpp:409] Shared (in-place) memory (TEST) by data: 622632960 diff: 622632960
I1106 16:38:09.126264 13608 net.cpp:412] Parameters memory (TEST) required for data: 11946688 diff: 11946688
I1106 16:38:09.126267 13608 net.cpp:415] Parameters shared memory (TEST) by data: 0 diff: 0
I1106 16:38:09.126268 13608 net.cpp:421] Network initialization done.
F1106 16:38:09.126478 13608 io.cpp:55] Check failed: fd != -1 (-1 vs. -1) File not found: training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel
*** Check failure stack trace: ***
    @ 0x7f0e02cad5cd google::LogMessage::Fail()
    @ 0x7f0e02caf433 google::LogMessage::SendToLog()
    @ 0x7f0e02cad15b google::LogMessage::Flush()
    @ 0x7f0e02cafe1e google::LogMessageFatal::~LogMessageFatal()
    @ 0x7f0e03cbb6dc caffe::ReadProtoFromBinaryFile()
    @ 0x7f0e03d33f56 caffe::ReadNetParamsFromBinaryFileOrDie()
    @ 0x7f0e0386a88a caffe::Net::CopyTrainedLayersFromBinaryProto()
    @ 0x7f0e0386a92e caffe::Net::CopyTrainedLayersFrom()
    @ 0x41204c test_detection()
    @ 0x40d1f0 main
    @ 0x7f0e0142f830 __libc_start_main
    @ 0x40de89 _start
    @ (nil) (unknown)
I1106 16:38:09.589010 13633 caffe.cpp:902] This is NVCaffe 0.17.0 started at Wed Nov 6 16:38:09 2019
I1106 16:38:09.589139 13633 caffe.cpp:904] CuDNN version: 7601
I1106 16:38:09.589143 13633 caffe.cpp:905] CuBLAS version: 10201
I1106 16:38:09.589144 13633 caffe.cpp:906] CUDA version: 10010
I1106 16:38:09.589145 13633 caffe.cpp:907] CUDA driver version: 10010
I1106 16:38:09.589148 13633 caffe.cpp:908] Arguments: [0]:
/home/liuyuyuan/caffe-jacinto/build/tools/caffe.bin [1]: test_detection [2]: --model=training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/test_quantize/test.prototxt [3]: --iterations=3 [4]: --weights=training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel [5]: --gpu [6]: 0
I1106 16:38:09.632973 13633 gpu_memory.cpp:105] GPUMemory::Manager initialized
I1106 16:38:09.633428 13633 gpu_memory.cpp:107] Total memory: 6193479680, Free: 3152805888, dev_info[0]: total=6193479680 free=3152805888
I1106 16:38:09.633435 13633 caffe.cpp:406] Use GPU with device ID 0
I1106 16:38:09.633743 13633 caffe.cpp:409] GPU device name: GeForce GTX 1660 Ti
I1106 16:38:09.645289 13633 net.cpp:80] Initializing net from parameters:
name: "ssdJacintoNetV2_test"
state { phase: TEST level: 0 }
layer { name: "data" type: "AnnotatedData" top: "data" top: "label" include { phase: TEST } transform_param { mean_value: 0 mean_value: 0 mean_value: 0 force_color: false resize_param { prob: 1 resize_mode: WARP height: 320 width: 768 interp_mode: LINEAR } crop_h: 320 crop_w: 768 } data_param { source: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb" batch_size: 10 backend: LMDB threads: 4 parser_threads: 4 } annotated_data_param { batch_sampler { } label_map_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt" } }
layer { name: "data/bias" type: "Bias" bottom: "data" top: "data/bias" param { lr_mult: 0 decay_mult: 0 } bias_param { filler { type: "constant" value: -128 } } }
layer { name: "conv1a" type: "Convolution" bottom: "data/bias" top: "conv1a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 2 kernel_size: 5 group: 1 stride: 2 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "conv1a/bn" type: "BatchNorm" bottom: "conv1a" top: "conv1a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "conv1a/relu" type: "ReLU" bottom: "conv1a" top: "conv1a" }
layer { name: "conv1b" type: "Convolution" bottom: "conv1a" top: "conv1b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 32 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "conv1b/bn" type: "BatchNorm" bottom: "conv1b" top: "conv1b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "conv1b/relu" type: "ReLU" bottom: "conv1b" top: "conv1b" }
layer { name: "pool1" type: "Pooling" bottom: "conv1b" top: "pool1" pooling_param { pool: MAX kernel_size: 2 stride: 2 } }
layer { name: "res2a_branch2a" type: "Convolution" bottom: "pool1" top: "res2a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res2a_branch2a/bn" type: "BatchNorm" bottom: "res2a_branch2a" top: "res2a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res2a_branch2a/relu" type: "ReLU" bottom: "res2a_branch2a" top: "res2a_branch2a" }
layer { name: "res2a_branch2b" type: "Convolution" bottom: "res2a_branch2a" top: "res2a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 64 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res2a_branch2b/bn" type: "BatchNorm" bottom: "res2a_branch2b" top: "res2a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res2a_branch2b/relu" type: "ReLU" bottom: "res2a_branch2b" top: "res2a_branch2b" }
layer { name: "pool2" type: "Pooling" bottom: "res2a_branch2b" top: "pool2" pooling_param { pool: MAX kernel_size: 2 stride: 2 } }
layer { name: "res3a_branch2a" type: "Convolution" bottom: "pool2" top: "res3a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res3a_branch2a/bn" type: "BatchNorm" bottom: "res3a_branch2a" top: "res3a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res3a_branch2a/relu" type: "ReLU" bottom: "res3a_branch2a" top: "res3a_branch2a" }
layer { name: "res3a_branch2b" type: "Convolution" bottom: "res3a_branch2a" top: "res3a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res3a_branch2b/bn" type: "BatchNorm" bottom: "res3a_branch2b" top: "res3a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res3a_branch2b/relu" type: "ReLU" bottom: "res3a_branch2b" top: "res3a_branch2b" }
layer { name: "pool3" type: "Pooling" bottom: "res3a_branch2b" top: "pool3" pooling_param { pool: MAX kernel_size: 2 stride: 2 } }
layer { name: "res4a_branch2a" type: "Convolution" bottom: "pool3" top: "res4a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res4a_branch2a/bn" type: "BatchNorm" bottom: "res4a_branch2a" top: "res4a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res4a_branch2a/relu" type: "ReLU" bottom: "res4a_branch2a" top: "res4a_branch2a" }
layer { name: "res4a_branch2b" type: "Convolution" bottom: "res4a_branch2a" top: "res4a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res4a_branch2b/bn" type: "BatchNorm" bottom: "res4a_branch2b" top: "res4a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res4a_branch2b/relu" type: "ReLU" bottom: "res4a_branch2b" top: "res4a_branch2b" }
layer { name: "pool4" type: "Pooling" bottom: "res4a_branch2b" top: "pool4" pooling_param { pool: MAX kernel_size: 2 stride: 2 } }
layer { name: "res5a_branch2a" type: "Convolution" bottom: "pool4" top: "res5a_branch2a" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res5a_branch2a/bn" type: "BatchNorm" bottom: "res5a_branch2a" top: "res5a_branch2a" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res5a_branch2a/relu" type: "ReLU" bottom: "res5a_branch2a" top: "res5a_branch2a" }
layer { name: "res5a_branch2b" type: "Convolution" bottom: "res5a_branch2a" top: "res5a_branch2b" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 512 bias_term: true pad: 1 kernel_size: 3 group: 4 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "res5a_branch2b/bn" type: "BatchNorm" bottom: "res5a_branch2b" top: "res5a_branch2b" batch_norm_param { moving_average_fraction: 0.99 eps: 0.0001 scale_bias: true } }
layer { name: "res5a_branch2b/relu" type: "ReLU" bottom: "res5a_branch2b" top: "res5a_branch2b" }
layer { name: "pool6" type: "Pooling" bottom: "res5a_branch2b" top: "pool6" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } }
layer { name: "pool7" type: "Pooling" bottom: "pool6" top: "pool7" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } }
layer { name: "pool8" type: "Pooling" bottom: "pool7" top: "pool8" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad: 0 } }
layer { name: "ctx_output1" type: "Convolution" bottom: "res4a_branch2b" top: "ctx_output1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output1/relu" type: "ReLU" bottom: "ctx_output1" top: "ctx_output1" }
layer { name: "ctx_output2" type: "Convolution" bottom: "res5a_branch2b" top: "ctx_output2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output2/relu" type: "ReLU" bottom: "ctx_output2" top: "ctx_output2" }
layer { name: "ctx_output3" type: "Convolution" bottom: "pool6" top: "ctx_output3" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output3/relu" type: "ReLU" bottom: "ctx_output3" top: "ctx_output3" }
layer { name: "ctx_output4" type: "Convolution" bottom: "pool7" top: "ctx_output4" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output4/relu" type: "ReLU" bottom: "ctx_output4" top: "ctx_output4" }
layer { name: "ctx_output5" type: "Convolution" bottom: "pool8" top: "ctx_output5" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output5/relu" type: "ReLU" bottom: "ctx_output5" top: "ctx_output5" }
layer { name: "ctx_output1/relu_mbox_loc" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output1/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_loc" top: "ctx_output1/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output1/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_loc_perm" top: "ctx_output1/relu_mbox_loc_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output1/relu_mbox_conf" type: "Convolution" bottom: "ctx_output1" top: "ctx_output1/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output1/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output1/relu_mbox_conf" top: "ctx_output1/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output1/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output1/relu_mbox_conf_perm" top: "ctx_output1/relu_mbox_conf_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output1/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output1" bottom: "data" top: "ctx_output1/relu_mbox_priorbox" prior_box_param { min_size: 14.72 max_size: 36.8 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } }
layer { name: "ctx_output2/relu_mbox_loc" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output2/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_loc" top: "ctx_output2/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output2/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_loc_perm" top: "ctx_output2/relu_mbox_loc_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output2/relu_mbox_conf" type: "Convolution" bottom: "ctx_output2" top: "ctx_output2/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output2/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output2/relu_mbox_conf" top: "ctx_output2/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output2/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output2/relu_mbox_conf_perm" top: "ctx_output2/relu_mbox_conf_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output2/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output2" bottom: "data" top: "ctx_output2/relu_mbox_priorbox" prior_box_param { min_size: 36.8 max_size: 132.48 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } }
layer { name: "ctx_output3/relu_mbox_loc" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 24 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output3/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_loc" top: "ctx_output3/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output3/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_loc_perm" top: "ctx_output3/relu_mbox_loc_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output3/relu_mbox_conf" type: "Convolution" bottom: "ctx_output3" top: "ctx_output3/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 12 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output3/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output3/relu_mbox_conf" top: "ctx_output3/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output3/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output3/relu_mbox_conf_perm" top: "ctx_output3/relu_mbox_conf_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output3/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output3" bottom: "data" top: "ctx_output3/relu_mbox_priorbox" prior_box_param { min_size: 132.48 max_size: 228.16 aspect_ratio: 2 aspect_ratio: 3 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } }
layer { name: "ctx_output4/relu_mbox_loc" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_loc" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 16 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output4/relu_mbox_loc_perm" type: "Permute" bottom: "ctx_output4/relu_mbox_loc" top: "ctx_output4/relu_mbox_loc_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output4/relu_mbox_loc_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_loc_perm" top: "ctx_output4/relu_mbox_loc_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output4/relu_mbox_conf" type: "Convolution" bottom: "ctx_output4" top: "ctx_output4/relu_mbox_conf" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 8 bias_term: true pad: 0 kernel_size: 1 group: 1 stride: 1 weight_filler { type: "msra" } bias_filler { type: "constant" value: 0 } dilation: 1 } }
layer { name: "ctx_output4/relu_mbox_conf_perm" type: "Permute" bottom: "ctx_output4/relu_mbox_conf" top: "ctx_output4/relu_mbox_conf_perm" permute_param { order: 0 order: 2 order: 3 order: 1 } }
layer { name: "ctx_output4/relu_mbox_conf_flat" type: "Flatten" bottom: "ctx_output4/relu_mbox_conf_perm" top: "ctx_output4/relu_mbox_conf_flat" flatten_param { axis: 1 } }
layer { name: "ctx_output4/relu_mbox_priorbox" type: "PriorBox" bottom: "ctx_output4" bottom: "data" top: "ctx_output4/relu_mbox_priorbox" prior_box_param { min_size: 228.16 max_size: 323.84 aspect_ratio: 2 flip: true clip: false variance: 0.1 variance: 0.1 variance: 0.2 variance: 0.2 offset: 0.5 } }
layer { name: "mbox_loc" type: "Concat" bottom: "ctx_output1/relu_mbox_loc_flat" bottom: "ctx_output2/relu_mbox_loc_flat" bottom: "ctx_output3/relu_mbox_loc_flat" bottom: "ctx_output4/relu_mbox_loc_flat" top: "mbox_loc" concat_param { axis: 1 } }
layer { name: "mbox_conf" type: "Concat" bottom: "ctx_output1/relu_mbox_conf_flat" bottom: "ctx_output2/relu_mbox_conf_flat" bottom: "ctx_output3/relu_mbox_conf_flat" bottom: "ctx_output4/relu_mbox_conf_flat" top: "mbox_conf" concat_param { axis: 1 } }
layer { name: "mbox_priorbox" type: "Concat" bottom: "ctx_output1/relu_mbox_priorbox" bottom: "ctx_output2/relu_mbox_priorbox" bottom: "ctx_output3/relu_mbox_priorbox" bottom: "ctx_output4/relu_mbox_priorbox" top: "mbox_priorbox" concat_param { axis: 2 } }
layer { name: "mbox_conf_reshape" type: "Reshape" bottom: "mbox_conf" top: "mbox_conf_reshape" reshape_param { shape { dim: 0 dim: -1 dim: 2 } } }
layer { name: "mbox_conf_softmax" type: "Softmax" bottom: "mbox_conf_reshape" top: "mbox_conf_softmax" softmax_param { axis: 2 } }
layer { name: "mbox_conf_flatten" type: "Flatten" bottom: "mbox_conf_softmax" top: "mbox_conf_flatten" flatten_param { axis: 1 } }
layer { name: "detection_out" type: "DetectionOutput" bottom: "mbox_loc" bottom: "mbox_conf_flatten" bottom: "mbox_priorbox" top: "detection_out" include { phase: TEST } detection_output_param { num_classes: 2 share_location: true background_label_id: 0 nms_param { nms_threshold: 0.45 top_k: 400 } save_output_param { output_directory: "" output_name_prefix: "comp4_det_test_" output_format: "VOC" label_map_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/labelmap.prototxt" name_size_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt" num_test_image: 24 } code_type: CENTER_SIZE keep_top_k: 200 confidence_threshold: 0.01 } }
layer { name: "detection_eval" type: "DetectionEvaluate" bottom: "detection_out" bottom: "label" top: "detection_eval" include { phase: TEST } detection_evaluate_param { num_classes: 2 background_label_id: 0 overlap_threshold: 0.5 evaluate_difficult_gt: false name_size_file: "/home/liuyuyuan/caffe-jacinto/data/helmet_detection/test_name_size.txt" } }
quantize: true
I1106 16:38:09.645622 13633 net.cpp:110] Using FLOAT as default forward math type
I1106 16:38:09.645640 13633 net.cpp:116] Using FLOAT as default backward math type
I1106 16:38:09.645648 13633 layer_factory.hpp:172] Creating layer 'data' of type 'AnnotatedData'
I1106 16:38:09.645651 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.645745 13633 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0
I1106 16:38:09.646196 13646 blocking_queue.cpp:40] Data layer prefetch queue empty
I1106 16:38:09.646209 13633 net.cpp:200] Created Layer data (0)
I1106 16:38:09.646217 13633 net.cpp:542] data -> data
I1106 16:38:09.646232 13633 net.cpp:542] data -> label
I1106 16:38:09.646252 13633 data_reader.cpp:58] Data Reader threads: 1, out queues: 1, depth: 10
I1106 16:38:09.646270 13633 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0
I1106 16:38:09.646678 13647 db_lmdb.cpp:36] Opened lmdb /home/liuyuyuan/caffe-jacinto/data/helmet_detection/mydata/lmdb/mydata_test_lmdb
I1106 16:38:09.648103 13633 annotated_data_layer.cpp:105] output data size: 10,3,320,768
I1106 16:38:09.648174 13633 annotated_data_layer.cpp:150] (0) Output data size: 10, 3, 320, 768
I1106 16:38:09.648217 13633 internal_thread.cpp:19] Starting 1 internal thread(s) on device 0
I1106 16:38:09.648267 13633 net.cpp:260] Setting up data
I1106 16:38:09.648293 13633 net.cpp:267] TEST Top shape for layer 0 'data' 10 3 320 768 (7372800)
I1106 16:38:09.648308 13633 net.cpp:267] TEST Top shape for layer 0 'data' 1 1 2 8 (16)
I1106 16:38:09.648609 13633 layer_factory.hpp:172] Creating layer 'data_data_0_split' of type 'Split'
I1106 16:38:09.648614 13648 data_layer.cpp:105] (0) Parser threads: 1
I1106 16:38:09.648623 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.648633 13648 data_layer.cpp:107] (0) Transformer threads: 1
I1106 16:38:09.648665 13633 net.cpp:200] Created Layer data_data_0_split (1)
I1106 16:38:09.655884 13633 net.cpp:572] data_data_0_split <- data
I1106 16:38:09.655969 13633 net.cpp:542] data_data_0_split -> data_data_0_split_0
I1106 16:38:09.655992 13633 net.cpp:542] data_data_0_split -> data_data_0_split_1
I1106 16:38:09.656002 13633 net.cpp:542] data_data_0_split -> data_data_0_split_2
I1106 16:38:09.656010 13633 net.cpp:542] data_data_0_split -> data_data_0_split_3
I1106 16:38:09.656018 13633 net.cpp:542] data_data_0_split -> data_data_0_split_4
I1106 16:38:09.657176 13633 net.cpp:260] Setting up data_data_0_split
I1106 16:38:09.657215 13633 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 10 3 320 768 (7372800)
I1106 16:38:09.657224 13633 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 10 3 320 768 (7372800)
I1106 16:38:09.657232 13633 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 10 3 320 768 (7372800)
I1106 16:38:09.657238 13633 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 10 3 320 768 (7372800)
I1106 16:38:09.657244 13633 net.cpp:267] TEST Top shape for layer 1 'data_data_0_split' 10 3 320 768 (7372800)
I1106 16:38:09.657253 13633 layer_factory.hpp:172] Creating layer 'data/bias' of type 'Bias'
I1106 16:38:09.657263 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.657277 13633 net.cpp:200] Created Layer data/bias (2)
I1106 16:38:09.657284 13633 net.cpp:572] data/bias <- data_data_0_split_0
I1106 16:38:09.657294 13633 net.cpp:542] data/bias -> data/bias
I1106 16:38:09.657471 13633 net.cpp:260] Setting up data/bias
I1106 16:38:09.657481 13633 net.cpp:267] TEST Top shape for layer 2 'data/bias' 10 3 320 768 (7372800)
I1106 16:38:09.657505 13633 layer_factory.hpp:172] Creating layer 'conv1a' of type 'Convolution'
I1106 16:38:09.657513 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:09.657542 13633 net.cpp:200] Created Layer conv1a (3)
I1106 16:38:09.657550 13633 net.cpp:572] conv1a <- data/bias
I1106 16:38:09.657557 13633 net.cpp:542] conv1a -> conv1a
I1106 16:38:09.771536 13647 data_reader.cpp:320] Restarting data pre-fetching
I1106 16:38:10.891883 13633 net.cpp:260] Setting up conv1a
I1106 16:38:10.891907 13633 net.cpp:267] TEST Top shape for layer 3 'conv1a' 10 32 160 384 (19660800)
I1106 16:38:10.891918 13633 layer_factory.hpp:172] Creating layer 'conv1a/bn' of type 'BatchNorm'
I1106 16:38:10.891943 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:10.891959 13633 net.cpp:200] Created Layer conv1a/bn (4)
I1106 16:38:10.891964 13633 net.cpp:572] conv1a/bn <- conv1a
I1106 16:38:10.891968 13633 net.cpp:527] conv1a/bn -> conv1a (in-place)
I1106 16:38:10.892262 13633 net.cpp:260] Setting up conv1a/bn
I1106 16:38:10.892268 13633 net.cpp:267] TEST Top shape for layer 4 'conv1a/bn' 10 32 160 384 (19660800)
I1106 16:38:10.892292 13633 layer_factory.hpp:172] Creating layer 'conv1a/relu' of type 'ReLU'
I1106 16:38:10.892295 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:10.892299 13633 net.cpp:200] Created Layer conv1a/relu (5)
I1106 16:38:10.892302 13633 net.cpp:572] conv1a/relu <- conv1a
I1106 16:38:10.892304 13633 net.cpp:527] conv1a/relu -> conv1a (in-place)
I1106 16:38:10.892313 13633 net.cpp:260] Setting up conv1a/relu
I1106 16:38:10.892316 13633 net.cpp:267] TEST Top shape for layer 5 'conv1a/relu' 10 32 160 384 (19660800)
I1106 16:38:10.892319 13633 layer_factory.hpp:172] Creating layer 'conv1b' of type 'Convolution'
I1106 16:38:10.892321 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:10.892329 13633 net.cpp:200] Created Layer conv1b (6)
I1106 16:38:10.892333 13633 net.cpp:572] conv1b <- conv1a
I1106 16:38:10.892335 13633 net.cpp:542] conv1b -> conv1b
I1106 16:38:10.893082 13633 net.cpp:260] Setting up conv1b
I1106 16:38:10.893090 13633 net.cpp:267] TEST Top shape for layer 6 'conv1b' 10 32 160 384 (19660800)
I1106 16:38:10.893095 13633 layer_factory.hpp:172] Creating layer 'conv1b/bn' of type 'BatchNorm'
I1106 16:38:10.893097 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:10.893124 13633 net.cpp:200] Created Layer conv1b/bn (7)
I1106 16:38:10.893127 13633 net.cpp:572] conv1b/bn <- conv1b
I1106 16:38:10.893129 13633 net.cpp:527] conv1b/bn -> conv1b (in-place)
I1106 16:38:10.893364 13633 net.cpp:260] Setting up conv1b/bn
I1106 16:38:10.893369 13633 net.cpp:267] TEST Top shape for layer 7 'conv1b/bn' 10 32 160 384 (19660800)
I1106 16:38:10.893374 13633 layer_factory.hpp:172] Creating layer 'conv1b/relu' of type 'ReLU'
I1106 16:38:10.893376 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:10.893380 13633 net.cpp:200] Created Layer conv1b/relu (8)
I1106 16:38:10.893383 13633 net.cpp:572] conv1b/relu <- conv1b
I1106 16:38:10.893384 13633 net.cpp:527] conv1b/relu -> conv1b (in-place)
I1106 16:38:10.893388 13633 net.cpp:260] Setting up conv1b/relu
I1106 16:38:10.893390 13633 net.cpp:267] TEST Top shape for layer 8 'conv1b/relu' 10 32 160 384 (19660800)
I1106 16:38:10.893393 13633 layer_factory.hpp:172] Creating layer 'pool1' of type 'Pooling'
I1106 16:38:10.893394 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1106 16:38:10.893416 13633 net.cpp:200] Created Layer pool1 (9)
I1106 16:38:10.893419 13633 net.cpp:572] pool1 <- conv1b
I1106 16:38:10.893421 13633 net.cpp:542] pool1 -> pool1
I1106 16:38:10.893489 13633 net.cpp:260] Setting up pool1
I1106 16:38:10.893517 13633 net.cpp:267] TEST Top shape for layer 9 'pool1' 10 32 80 192 (4915200)
I1106 16:38:10.893519 13633
layer_factory.hpp:172] Creating layer 'res2a_branch2a' of type 'Convolution' I1106 16:38:10.893522 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.893541 13633 net.cpp:200] Created Layer res2a_branch2a (10) I1106 16:38:10.893544 13633 net.cpp:572] res2a_branch2a <- pool1 I1106 16:38:10.893548 13633 net.cpp:542] res2a_branch2a -> res2a_branch2a I1106 16:38:10.894287 13633 net.cpp:260] Setting up res2a_branch2a I1106 16:38:10.894295 13633 net.cpp:267] TEST Top shape for layer 10 'res2a_branch2a' 10 64 80 192 (9830400) I1106 16:38:10.894302 13633 layer_factory.hpp:172] Creating layer 'res2a_branch2a/bn' of type 'BatchNorm' I1106 16:38:10.894306 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.894322 13633 net.cpp:200] Created Layer res2a_branch2a/bn (11) I1106 16:38:10.894325 13633 net.cpp:572] res2a_branch2a/bn <- res2a_branch2a I1106 16:38:10.894328 13633 net.cpp:527] res2a_branch2a/bn -> res2a_branch2a (in-place) I1106 16:38:10.894531 13633 net.cpp:260] Setting up res2a_branch2a/bn I1106 16:38:10.894534 13633 net.cpp:267] TEST Top shape for layer 11 'res2a_branch2a/bn' 10 64 80 192 (9830400) I1106 16:38:10.894558 13633 layer_factory.hpp:172] Creating layer 'res2a_branch2a/relu' of type 'ReLU' I1106 16:38:10.894560 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.894565 13633 net.cpp:200] Created Layer res2a_branch2a/relu (12) I1106 16:38:10.894567 13633 net.cpp:572] res2a_branch2a/relu <- res2a_branch2a I1106 16:38:10.894569 13633 net.cpp:527] res2a_branch2a/relu -> res2a_branch2a (in-place) I1106 16:38:10.894572 13633 net.cpp:260] Setting up res2a_branch2a/relu I1106 16:38:10.894577 13633 net.cpp:267] TEST Top shape for layer 12 'res2a_branch2a/relu' 10 64 80 192 (9830400) I1106 16:38:10.894578 13633 layer_factory.hpp:172] Creating layer 'res2a_branch2b' of type 'Convolution' I1106 
16:38:10.894580 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.894587 13633 net.cpp:200] Created Layer res2a_branch2b (13) I1106 16:38:10.894590 13633 net.cpp:572] res2a_branch2b <- res2a_branch2a I1106 16:38:10.894593 13633 net.cpp:542] res2a_branch2b -> res2a_branch2b I1106 16:38:10.894791 13633 net.cpp:260] Setting up res2a_branch2b I1106 16:38:10.894798 13633 net.cpp:267] TEST Top shape for layer 13 'res2a_branch2b' 10 64 80 192 (9830400) I1106 16:38:10.894803 13633 layer_factory.hpp:172] Creating layer 'res2a_branch2b/bn' of type 'BatchNorm' I1106 16:38:10.894805 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.894809 13633 net.cpp:200] Created Layer res2a_branch2b/bn (14) I1106 16:38:10.894812 13633 net.cpp:572] res2a_branch2b/bn <- res2a_branch2b I1106 16:38:10.894816 13633 net.cpp:527] res2a_branch2b/bn -> res2a_branch2b (in-place) I1106 16:38:10.895004 13633 net.cpp:260] Setting up res2a_branch2b/bn I1106 16:38:10.895009 13633 net.cpp:267] TEST Top shape for layer 14 'res2a_branch2b/bn' 10 64 80 192 (9830400) I1106 16:38:10.895015 13633 layer_factory.hpp:172] Creating layer 'res2a_branch2b/relu' of type 'ReLU' I1106 16:38:10.895018 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.895021 13633 net.cpp:200] Created Layer res2a_branch2b/relu (15) I1106 16:38:10.895025 13633 net.cpp:572] res2a_branch2b/relu <- res2a_branch2b I1106 16:38:10.895026 13633 net.cpp:527] res2a_branch2b/relu -> res2a_branch2b (in-place) I1106 16:38:10.895030 13633 net.cpp:260] Setting up res2a_branch2b/relu I1106 16:38:10.895033 13633 net.cpp:267] TEST Top shape for layer 15 'res2a_branch2b/relu' 10 64 80 192 (9830400) I1106 16:38:10.895036 13633 layer_factory.hpp:172] Creating layer 'pool2' of type 'Pooling' I1106 16:38:10.895038 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT 
Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.895043 13633 net.cpp:200] Created Layer pool2 (16) I1106 16:38:10.895046 13633 net.cpp:572] pool2 <- res2a_branch2b I1106 16:38:10.895048 13633 net.cpp:542] pool2 -> pool2 I1106 16:38:10.895077 13633 net.cpp:260] Setting up pool2 I1106 16:38:10.895082 13633 net.cpp:267] TEST Top shape for layer 16 'pool2' 10 64 40 96 (2457600) I1106 16:38:10.895085 13633 layer_factory.hpp:172] Creating layer 'res3a_branch2a' of type 'Convolution' I1106 16:38:10.895087 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.895094 13633 net.cpp:200] Created Layer res3a_branch2a (17) I1106 16:38:10.895097 13633 net.cpp:572] res3a_branch2a <- pool2 I1106 16:38:10.895100 13633 net.cpp:542] res3a_branch2a -> res3a_branch2a I1106 16:38:10.895754 13633 net.cpp:260] Setting up res3a_branch2a I1106 16:38:10.895767 13633 net.cpp:267] TEST Top shape for layer 17 'res3a_branch2a' 10 128 40 96 (4915200) I1106 16:38:10.895787 13633 layer_factory.hpp:172] Creating layer 'res3a_branch2a/bn' of type 'BatchNorm' I1106 16:38:10.895790 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.895795 13633 net.cpp:200] Created Layer res3a_branch2a/bn (18) I1106 16:38:10.895798 13633 net.cpp:572] res3a_branch2a/bn <- res3a_branch2a I1106 16:38:10.895800 13633 net.cpp:527] res3a_branch2a/bn -> res3a_branch2a (in-place) I1106 16:38:10.895977 13633 net.cpp:260] Setting up res3a_branch2a/bn I1106 16:38:10.895982 13633 net.cpp:267] TEST Top shape for layer 18 'res3a_branch2a/bn' 10 128 40 96 (4915200) I1106 16:38:10.896004 13633 layer_factory.hpp:172] Creating layer 'res3a_branch2a/relu' of type 'ReLU' I1106 16:38:10.896008 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.896011 13633 net.cpp:200] Created Layer res3a_branch2a/relu (19) I1106 16:38:10.896013 13633 net.cpp:572] res3a_branch2a/relu <- 
res3a_branch2a I1106 16:38:10.896015 13633 net.cpp:527] res3a_branch2a/relu -> res3a_branch2a (in-place) I1106 16:38:10.896019 13633 net.cpp:260] Setting up res3a_branch2a/relu I1106 16:38:10.896023 13633 net.cpp:267] TEST Top shape for layer 19 'res3a_branch2a/relu' 10 128 40 96 (4915200) I1106 16:38:10.896025 13633 layer_factory.hpp:172] Creating layer 'res3a_branch2b' of type 'Convolution' I1106 16:38:10.896028 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.896035 13633 net.cpp:200] Created Layer res3a_branch2b (20) I1106 16:38:10.896037 13633 net.cpp:572] res3a_branch2b <- res3a_branch2a I1106 16:38:10.896039 13633 net.cpp:542] res3a_branch2b -> res3a_branch2b I1106 16:38:10.896411 13633 net.cpp:260] Setting up res3a_branch2b I1106 16:38:10.896418 13633 net.cpp:267] TEST Top shape for layer 20 'res3a_branch2b' 10 128 40 96 (4915200) I1106 16:38:10.896422 13633 layer_factory.hpp:172] Creating layer 'res3a_branch2b/bn' of type 'BatchNorm' I1106 16:38:10.896425 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.896430 13633 net.cpp:200] Created Layer res3a_branch2b/bn (21) I1106 16:38:10.896432 13633 net.cpp:572] res3a_branch2b/bn <- res3a_branch2b I1106 16:38:10.896435 13633 net.cpp:527] res3a_branch2b/bn -> res3a_branch2b (in-place) I1106 16:38:10.896597 13633 net.cpp:260] Setting up res3a_branch2b/bn I1106 16:38:10.896601 13633 net.cpp:267] TEST Top shape for layer 21 'res3a_branch2b/bn' 10 128 40 96 (4915200) I1106 16:38:10.896608 13633 layer_factory.hpp:172] Creating layer 'res3a_branch2b/relu' of type 'ReLU' I1106 16:38:10.896611 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.896613 13633 net.cpp:200] Created Layer res3a_branch2b/relu (22) I1106 16:38:10.896616 13633 net.cpp:572] res3a_branch2b/relu <- res3a_branch2b I1106 16:38:10.896620 13633 net.cpp:527] res3a_branch2b/relu 
-> res3a_branch2b (in-place) I1106 16:38:10.896622 13633 net.cpp:260] Setting up res3a_branch2b/relu I1106 16:38:10.896625 13633 net.cpp:267] TEST Top shape for layer 22 'res3a_branch2b/relu' 10 128 40 96 (4915200) I1106 16:38:10.896628 13633 layer_factory.hpp:172] Creating layer 'pool3' of type 'Pooling' I1106 16:38:10.896631 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.896634 13633 net.cpp:200] Created Layer pool3 (23) I1106 16:38:10.896637 13633 net.cpp:572] pool3 <- res3a_branch2b I1106 16:38:10.896639 13633 net.cpp:542] pool3 -> pool3 I1106 16:38:10.896667 13633 net.cpp:260] Setting up pool3 I1106 16:38:10.896670 13633 net.cpp:267] TEST Top shape for layer 23 'pool3' 10 128 20 48 (1228800) I1106 16:38:10.896673 13633 layer_factory.hpp:172] Creating layer 'res4a_branch2a' of type 'Convolution' I1106 16:38:10.896677 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.896689 13633 net.cpp:200] Created Layer res4a_branch2a (24) I1106 16:38:10.896692 13633 net.cpp:572] res4a_branch2a <- pool3 I1106 16:38:10.896694 13633 net.cpp:542] res4a_branch2a -> res4a_branch2a I1106 16:38:10.899283 13633 net.cpp:260] Setting up res4a_branch2a I1106 16:38:10.899293 13633 net.cpp:267] TEST Top shape for layer 24 'res4a_branch2a' 10 256 20 48 (2457600) I1106 16:38:10.899312 13633 layer_factory.hpp:172] Creating layer 'res4a_branch2a/bn' of type 'BatchNorm' I1106 16:38:10.899315 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.899322 13633 net.cpp:200] Created Layer res4a_branch2a/bn (25) I1106 16:38:10.899324 13633 net.cpp:572] res4a_branch2a/bn <- res4a_branch2a I1106 16:38:10.899327 13633 net.cpp:527] res4a_branch2a/bn -> res4a_branch2a (in-place) I1106 16:38:10.899500 13633 net.cpp:260] Setting up res4a_branch2a/bn I1106 16:38:10.899505 13633 net.cpp:267] TEST Top shape for layer 25 
'res4a_branch2a/bn' 10 256 20 48 (2457600) I1106 16:38:10.899511 13633 layer_factory.hpp:172] Creating layer 'res4a_branch2a/relu' of type 'ReLU' I1106 16:38:10.899514 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.899518 13633 net.cpp:200] Created Layer res4a_branch2a/relu (26) I1106 16:38:10.899520 13633 net.cpp:572] res4a_branch2a/relu <- res4a_branch2a I1106 16:38:10.899523 13633 net.cpp:527] res4a_branch2a/relu -> res4a_branch2a (in-place) I1106 16:38:10.899526 13633 net.cpp:260] Setting up res4a_branch2a/relu I1106 16:38:10.899529 13633 net.cpp:267] TEST Top shape for layer 26 'res4a_branch2a/relu' 10 256 20 48 (2457600) I1106 16:38:10.899533 13633 layer_factory.hpp:172] Creating layer 'res4a_branch2b' of type 'Convolution' I1106 16:38:10.899535 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.899541 13633 net.cpp:200] Created Layer res4a_branch2b (27) I1106 16:38:10.899544 13633 net.cpp:572] res4a_branch2b <- res4a_branch2a I1106 16:38:10.899546 13633 net.cpp:542] res4a_branch2b -> res4a_branch2b I1106 16:38:10.900684 13633 net.cpp:260] Setting up res4a_branch2b I1106 16:38:10.900691 13633 net.cpp:267] TEST Top shape for layer 27 'res4a_branch2b' 10 256 20 48 (2457600) I1106 16:38:10.900696 13633 layer_factory.hpp:172] Creating layer 'res4a_branch2b/bn' of type 'BatchNorm' I1106 16:38:10.900698 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.900703 13633 net.cpp:200] Created Layer res4a_branch2b/bn (28) I1106 16:38:10.900707 13633 net.cpp:572] res4a_branch2b/bn <- res4a_branch2b I1106 16:38:10.900708 13633 net.cpp:527] res4a_branch2b/bn -> res4a_branch2b (in-place) I1106 16:38:10.900876 13633 net.cpp:260] Setting up res4a_branch2b/bn I1106 16:38:10.900880 13633 net.cpp:267] TEST Top shape for layer 28 'res4a_branch2b/bn' 10 256 20 48 (2457600) I1106 16:38:10.900887 13633 
layer_factory.hpp:172] Creating layer 'res4a_branch2b/relu' of type 'ReLU' I1106 16:38:10.900889 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.900892 13633 net.cpp:200] Created Layer res4a_branch2b/relu (29) I1106 16:38:10.900895 13633 net.cpp:572] res4a_branch2b/relu <- res4a_branch2b I1106 16:38:10.900897 13633 net.cpp:527] res4a_branch2b/relu -> res4a_branch2b (in-place) I1106 16:38:10.900900 13633 net.cpp:260] Setting up res4a_branch2b/relu I1106 16:38:10.900902 13633 net.cpp:267] TEST Top shape for layer 29 'res4a_branch2b/relu' 10 256 20 48 (2457600) I1106 16:38:10.900905 13633 layer_factory.hpp:172] Creating layer 'res4a_branch2b_res4a_branch2b/relu_0_split' of type 'Split' I1106 16:38:10.900907 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.900910 13633 net.cpp:200] Created Layer res4a_branch2b_res4a_branch2b/relu_0_split (30) I1106 16:38:10.900913 13633 net.cpp:572] res4a_branch2b_res4a_branch2b/relu_0_split <- res4a_branch2b I1106 16:38:10.900924 13633 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_0 I1106 16:38:10.900928 13633 net.cpp:542] res4a_branch2b_res4a_branch2b/relu_0_split -> res4a_branch2b_res4a_branch2b/relu_0_split_1 I1106 16:38:10.900949 13633 net.cpp:260] Setting up res4a_branch2b_res4a_branch2b/relu_0_split I1106 16:38:10.900954 13633 net.cpp:267] TEST Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 10 256 20 48 (2457600) I1106 16:38:10.900956 13633 net.cpp:267] TEST Top shape for layer 30 'res4a_branch2b_res4a_branch2b/relu_0_split' 10 256 20 48 (2457600) I1106 16:38:10.900959 13633 layer_factory.hpp:172] Creating layer 'pool4' of type 'Pooling' I1106 16:38:10.900960 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.900965 13633 net.cpp:200] Created Layer pool4 (31) I1106 
16:38:10.900967 13633 net.cpp:572] pool4 <- res4a_branch2b_res4a_branch2b/relu_0_split_0 I1106 16:38:10.900970 13633 net.cpp:542] pool4 -> pool4 I1106 16:38:10.900995 13633 net.cpp:260] Setting up pool4 I1106 16:38:10.900998 13633 net.cpp:267] TEST Top shape for layer 31 'pool4' 10 256 10 24 (614400) I1106 16:38:10.901001 13633 layer_factory.hpp:172] Creating layer 'res5a_branch2a' of type 'Convolution' I1106 16:38:10.901003 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.901010 13633 net.cpp:200] Created Layer res5a_branch2a (32) I1106 16:38:10.901012 13633 net.cpp:572] res5a_branch2a <- pool4 I1106 16:38:10.901015 13633 net.cpp:542] res5a_branch2a -> res5a_branch2a I1106 16:38:10.910123 13633 net.cpp:260] Setting up res5a_branch2a I1106 16:38:10.910141 13633 net.cpp:267] TEST Top shape for layer 32 'res5a_branch2a' 10 512 10 24 (1228800) I1106 16:38:10.910151 13633 layer_factory.hpp:172] Creating layer 'res5a_branch2a/bn' of type 'BatchNorm' I1106 16:38:10.910156 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.910163 13633 net.cpp:200] Created Layer res5a_branch2a/bn (33) I1106 16:38:10.910166 13633 net.cpp:572] res5a_branch2a/bn <- res5a_branch2a I1106 16:38:10.910171 13633 net.cpp:527] res5a_branch2a/bn -> res5a_branch2a (in-place) I1106 16:38:10.910357 13633 net.cpp:260] Setting up res5a_branch2a/bn I1106 16:38:10.910362 13633 net.cpp:267] TEST Top shape for layer 33 'res5a_branch2a/bn' 10 512 10 24 (1228800) I1106 16:38:10.910368 13633 layer_factory.hpp:172] Creating layer 'res5a_branch2a/relu' of type 'ReLU' I1106 16:38:10.910372 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.910375 13633 net.cpp:200] Created Layer res5a_branch2a/relu (34) I1106 16:38:10.910377 13633 net.cpp:572] res5a_branch2a/relu <- res5a_branch2a I1106 16:38:10.910380 13633 net.cpp:527] 
res5a_branch2a/relu -> res5a_branch2a (in-place) I1106 16:38:10.910384 13633 net.cpp:260] Setting up res5a_branch2a/relu I1106 16:38:10.910387 13633 net.cpp:267] TEST Top shape for layer 34 'res5a_branch2a/relu' 10 512 10 24 (1228800) I1106 16:38:10.910390 13633 layer_factory.hpp:172] Creating layer 'res5a_branch2b' of type 'Convolution' I1106 16:38:10.910393 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.910400 13633 net.cpp:200] Created Layer res5a_branch2b (35) I1106 16:38:10.910403 13633 net.cpp:572] res5a_branch2b <- res5a_branch2a I1106 16:38:10.910406 13633 net.cpp:542] res5a_branch2b -> res5a_branch2b I1106 16:38:10.915112 13633 net.cpp:260] Setting up res5a_branch2b I1106 16:38:10.915125 13633 net.cpp:267] TEST Top shape for layer 35 'res5a_branch2b' 10 512 10 24 (1228800) I1106 16:38:10.915134 13633 layer_factory.hpp:172] Creating layer 'res5a_branch2b/bn' of type 'BatchNorm' I1106 16:38:10.915138 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.915144 13633 net.cpp:200] Created Layer res5a_branch2b/bn (36) I1106 16:38:10.915158 13633 net.cpp:572] res5a_branch2b/bn <- res5a_branch2b I1106 16:38:10.915163 13633 net.cpp:527] res5a_branch2b/bn -> res5a_branch2b (in-place) I1106 16:38:10.915369 13633 net.cpp:260] Setting up res5a_branch2b/bn I1106 16:38:10.915374 13633 net.cpp:267] TEST Top shape for layer 36 'res5a_branch2b/bn' 10 512 10 24 (1228800) I1106 16:38:10.915380 13633 layer_factory.hpp:172] Creating layer 'res5a_branch2b/relu' of type 'ReLU' I1106 16:38:10.915383 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.915387 13633 net.cpp:200] Created Layer res5a_branch2b/relu (37) I1106 16:38:10.915390 13633 net.cpp:572] res5a_branch2b/relu <- res5a_branch2b I1106 16:38:10.915392 13633 net.cpp:527] res5a_branch2b/relu -> res5a_branch2b (in-place) I1106 16:38:10.915398 
13633 net.cpp:260] Setting up res5a_branch2b/relu I1106 16:38:10.915401 13633 net.cpp:267] TEST Top shape for layer 37 'res5a_branch2b/relu' 10 512 10 24 (1228800) I1106 16:38:10.915403 13633 layer_factory.hpp:172] Creating layer 'res5a_branch2b_res5a_branch2b/relu_0_split' of type 'Split' I1106 16:38:10.915405 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.915410 13633 net.cpp:200] Created Layer res5a_branch2b_res5a_branch2b/relu_0_split (38) I1106 16:38:10.915412 13633 net.cpp:572] res5a_branch2b_res5a_branch2b/relu_0_split <- res5a_branch2b I1106 16:38:10.915416 13633 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> res5a_branch2b_res5a_branch2b/relu_0_split_0 I1106 16:38:10.915418 13633 net.cpp:542] res5a_branch2b_res5a_branch2b/relu_0_split -> res5a_branch2b_res5a_branch2b/relu_0_split_1 I1106 16:38:10.915442 13633 net.cpp:260] Setting up res5a_branch2b_res5a_branch2b/relu_0_split I1106 16:38:10.915446 13633 net.cpp:267] TEST Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 10 512 10 24 (1228800) I1106 16:38:10.915449 13633 net.cpp:267] TEST Top shape for layer 38 'res5a_branch2b_res5a_branch2b/relu_0_split' 10 512 10 24 (1228800) I1106 16:38:10.915452 13633 layer_factory.hpp:172] Creating layer 'pool6' of type 'Pooling' I1106 16:38:10.915454 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.915459 13633 net.cpp:200] Created Layer pool6 (39) I1106 16:38:10.915462 13633 net.cpp:572] pool6 <- res5a_branch2b_res5a_branch2b/relu_0_split_0 I1106 16:38:10.915465 13633 net.cpp:542] pool6 -> pool6 I1106 16:38:10.915495 13633 net.cpp:260] Setting up pool6 I1106 16:38:10.915499 13633 net.cpp:267] TEST Top shape for layer 39 'pool6' 10 512 5 12 (307200) I1106 16:38:10.915503 13633 layer_factory.hpp:172] Creating layer 'pool6_pool6_0_split' of type 'Split' I1106 16:38:10.915504 13633 layer_factory.hpp:184] Layer's types 
are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.915509 13633 net.cpp:200] Created Layer pool6_pool6_0_split (40) I1106 16:38:10.915511 13633 net.cpp:572] pool6_pool6_0_split <- pool6 I1106 16:38:10.915514 13633 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_0 I1106 16:38:10.915518 13633 net.cpp:542] pool6_pool6_0_split -> pool6_pool6_0_split_1 I1106 16:38:10.915539 13633 net.cpp:260] Setting up pool6_pool6_0_split I1106 16:38:10.915544 13633 net.cpp:267] TEST Top shape for layer 40 'pool6_pool6_0_split' 10 512 5 12 (307200) I1106 16:38:10.915546 13633 net.cpp:267] TEST Top shape for layer 40 'pool6_pool6_0_split' 10 512 5 12 (307200) I1106 16:38:10.915549 13633 layer_factory.hpp:172] Creating layer 'pool7' of type 'Pooling' I1106 16:38:10.915551 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.915555 13633 net.cpp:200] Created Layer pool7 (41) I1106 16:38:10.915558 13633 net.cpp:572] pool7 <- pool6_pool6_0_split_0 I1106 16:38:10.915560 13633 net.cpp:542] pool7 -> pool7 I1106 16:38:10.915586 13633 net.cpp:260] Setting up pool7 I1106 16:38:10.915591 13633 net.cpp:267] TEST Top shape for layer 41 'pool7' 10 512 3 6 (92160) I1106 16:38:10.915593 13633 layer_factory.hpp:172] Creating layer 'pool7_pool7_0_split' of type 'Split' I1106 16:38:10.915601 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.915606 13633 net.cpp:200] Created Layer pool7_pool7_0_split (42) I1106 16:38:10.915608 13633 net.cpp:572] pool7_pool7_0_split <- pool7 I1106 16:38:10.915611 13633 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_0 I1106 16:38:10.915614 13633 net.cpp:542] pool7_pool7_0_split -> pool7_pool7_0_split_1 I1106 16:38:10.915634 13633 net.cpp:260] Setting up pool7_pool7_0_split I1106 16:38:10.915638 13633 net.cpp:267] TEST Top shape for layer 42 'pool7_pool7_0_split' 10 512 3 6 (92160) I1106 16:38:10.915642 13633 net.cpp:267] TEST 
Top shape for layer 42 'pool7_pool7_0_split' 10 512 3 6 (92160) I1106 16:38:10.915643 13633 layer_factory.hpp:172] Creating layer 'pool8' of type 'Pooling' I1106 16:38:10.915647 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.915650 13633 net.cpp:200] Created Layer pool8 (43) I1106 16:38:10.915653 13633 net.cpp:572] pool8 <- pool7_pool7_0_split_0 I1106 16:38:10.915655 13633 net.cpp:542] pool8 -> pool8 I1106 16:38:10.915688 13633 net.cpp:260] Setting up pool8 I1106 16:38:10.915694 13633 net.cpp:267] TEST Top shape for layer 43 'pool8' 10 512 2 3 (30720) I1106 16:38:10.915696 13633 layer_factory.hpp:172] Creating layer 'ctx_output1' of type 'Convolution' I1106 16:38:10.915699 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.915709 13633 net.cpp:200] Created Layer ctx_output1 (44) I1106 16:38:10.915712 13633 net.cpp:572] ctx_output1 <- res4a_branch2b_res4a_branch2b/relu_0_split_1 I1106 16:38:10.915715 13633 net.cpp:542] ctx_output1 -> ctx_output1 I1106 16:38:10.916321 13633 net.cpp:260] Setting up ctx_output1 I1106 16:38:10.916326 13633 net.cpp:267] TEST Top shape for layer 44 'ctx_output1' 10 256 20 48 (2457600) I1106 16:38:10.916332 13633 layer_factory.hpp:172] Creating layer 'ctx_output1/relu' of type 'ReLU' I1106 16:38:10.916334 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.916338 13633 net.cpp:200] Created Layer ctx_output1/relu (45) I1106 16:38:10.916342 13633 net.cpp:572] ctx_output1/relu <- ctx_output1 I1106 16:38:10.916343 13633 net.cpp:527] ctx_output1/relu -> ctx_output1 (in-place) I1106 16:38:10.916348 13633 net.cpp:260] Setting up ctx_output1/relu I1106 16:38:10.916350 13633 net.cpp:267] TEST Top shape for layer 45 'ctx_output1/relu' 10 256 20 48 (2457600) I1106 16:38:10.916354 13633 layer_factory.hpp:172] Creating layer 'ctx_output1_ctx_output1/relu_0_split' of type 
'Split' I1106 16:38:10.916357 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.916360 13633 net.cpp:200] Created Layer ctx_output1_ctx_output1/relu_0_split (46) I1106 16:38:10.916363 13633 net.cpp:572] ctx_output1_ctx_output1/relu_0_split <- ctx_output1 I1106 16:38:10.916365 13633 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_0 I1106 16:38:10.916369 13633 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_1 I1106 16:38:10.916373 13633 net.cpp:542] ctx_output1_ctx_output1/relu_0_split -> ctx_output1_ctx_output1/relu_0_split_2 I1106 16:38:10.916407 13633 net.cpp:260] Setting up ctx_output1_ctx_output1/relu_0_split I1106 16:38:10.916411 13633 net.cpp:267] TEST Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 10 256 20 48 (2457600) I1106 16:38:10.916415 13633 net.cpp:267] TEST Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 10 256 20 48 (2457600) I1106 16:38:10.916419 13633 net.cpp:267] TEST Top shape for layer 46 'ctx_output1_ctx_output1/relu_0_split' 10 256 20 48 (2457600) I1106 16:38:10.916420 13633 layer_factory.hpp:172] Creating layer 'ctx_output2' of type 'Convolution' I1106 16:38:10.916424 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.916440 13633 net.cpp:200] Created Layer ctx_output2 (47) I1106 16:38:10.916442 13633 net.cpp:572] ctx_output2 <- res5a_branch2b_res5a_branch2b/relu_0_split_1 I1106 16:38:10.916445 13633 net.cpp:542] ctx_output2 -> ctx_output2 I1106 16:38:10.917480 13633 net.cpp:260] Setting up ctx_output2 I1106 16:38:10.917488 13633 net.cpp:267] TEST Top shape for layer 47 'ctx_output2' 10 256 10 24 (614400) I1106 16:38:10.917493 13633 layer_factory.hpp:172] Creating layer 'ctx_output2/relu' of type 'ReLU' I1106 16:38:10.917496 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT 
Bmath:FLOAT I1106 16:38:10.917500 13633 net.cpp:200] Created Layer ctx_output2/relu (48) I1106 16:38:10.917502 13633 net.cpp:572] ctx_output2/relu <- ctx_output2 I1106 16:38:10.917505 13633 net.cpp:527] ctx_output2/relu -> ctx_output2 (in-place) I1106 16:38:10.917508 13633 net.cpp:260] Setting up ctx_output2/relu I1106 16:38:10.917512 13633 net.cpp:267] TEST Top shape for layer 48 'ctx_output2/relu' 10 256 10 24 (614400) I1106 16:38:10.917515 13633 layer_factory.hpp:172] Creating layer 'ctx_output2_ctx_output2/relu_0_split' of type 'Split' I1106 16:38:10.917516 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.917520 13633 net.cpp:200] Created Layer ctx_output2_ctx_output2/relu_0_split (49) I1106 16:38:10.917523 13633 net.cpp:572] ctx_output2_ctx_output2/relu_0_split <- ctx_output2 I1106 16:38:10.917526 13633 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_0 I1106 16:38:10.917531 13633 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_1 I1106 16:38:10.917533 13633 net.cpp:542] ctx_output2_ctx_output2/relu_0_split -> ctx_output2_ctx_output2/relu_0_split_2 I1106 16:38:10.917563 13633 net.cpp:260] Setting up ctx_output2_ctx_output2/relu_0_split I1106 16:38:10.917567 13633 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 10 256 10 24 (614400) I1106 16:38:10.917570 13633 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 10 256 10 24 (614400) I1106 16:38:10.917572 13633 net.cpp:267] TEST Top shape for layer 49 'ctx_output2_ctx_output2/relu_0_split' 10 256 10 24 (614400) I1106 16:38:10.917575 13633 layer_factory.hpp:172] Creating layer 'ctx_output3' of type 'Convolution' I1106 16:38:10.917578 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.917587 13633 net.cpp:200] Created Layer ctx_output3 (50) I1106 
16:38:10.917589 13633 net.cpp:572] ctx_output3 <- pool6_pool6_0_split_1 I1106 16:38:10.917593 13633 net.cpp:542] ctx_output3 -> ctx_output3 I1106 16:38:10.919149 13633 net.cpp:260] Setting up ctx_output3 I1106 16:38:10.919160 13633 net.cpp:267] TEST Top shape for layer 50 'ctx_output3' 10 256 5 12 (153600) I1106 16:38:10.919165 13633 layer_factory.hpp:172] Creating layer 'ctx_output3/relu' of type 'ReLU' I1106 16:38:10.919168 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.919173 13633 net.cpp:200] Created Layer ctx_output3/relu (51) I1106 16:38:10.919176 13633 net.cpp:572] ctx_output3/relu <- ctx_output3 I1106 16:38:10.919179 13633 net.cpp:527] ctx_output3/relu -> ctx_output3 (in-place) I1106 16:38:10.919184 13633 net.cpp:260] Setting up ctx_output3/relu I1106 16:38:10.919188 13633 net.cpp:267] TEST Top shape for layer 51 'ctx_output3/relu' 10 256 5 12 (153600) I1106 16:38:10.919191 13633 layer_factory.hpp:172] Creating layer 'ctx_output3_ctx_output3/relu_0_split' of type 'Split' I1106 16:38:10.919193 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.919198 13633 net.cpp:200] Created Layer ctx_output3_ctx_output3/relu_0_split (52) I1106 16:38:10.919200 13633 net.cpp:572] ctx_output3_ctx_output3/relu_0_split <- ctx_output3 I1106 16:38:10.919203 13633 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_0 I1106 16:38:10.919219 13633 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_1 I1106 16:38:10.919222 13633 net.cpp:542] ctx_output3_ctx_output3/relu_0_split -> ctx_output3_ctx_output3/relu_0_split_2 I1106 16:38:10.919255 13633 net.cpp:260] Setting up ctx_output3_ctx_output3/relu_0_split I1106 16:38:10.919260 13633 net.cpp:267] TEST Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 10 256 5 12 (153600) I1106 16:38:10.919263 13633 net.cpp:267] TEST Top shape 
for layer 52 'ctx_output3_ctx_output3/relu_0_split' 10 256 5 12 (153600) I1106 16:38:10.919267 13633 net.cpp:267] TEST Top shape for layer 52 'ctx_output3_ctx_output3/relu_0_split' 10 256 5 12 (153600) I1106 16:38:10.919270 13633 layer_factory.hpp:172] Creating layer 'ctx_output4' of type 'Convolution' I1106 16:38:10.919272 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.919282 13633 net.cpp:200] Created Layer ctx_output4 (53) I1106 16:38:10.919286 13633 net.cpp:572] ctx_output4 <- pool7_pool7_0_split_1 I1106 16:38:10.919288 13633 net.cpp:542] ctx_output4 -> ctx_output4 I1106 16:38:10.920331 13633 net.cpp:260] Setting up ctx_output4 I1106 16:38:10.920338 13633 net.cpp:267] TEST Top shape for layer 53 'ctx_output4' 10 256 3 6 (46080) I1106 16:38:10.920343 13633 layer_factory.hpp:172] Creating layer 'ctx_output4/relu' of type 'ReLU' I1106 16:38:10.920346 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.920351 13633 net.cpp:200] Created Layer ctx_output4/relu (54) I1106 16:38:10.920353 13633 net.cpp:572] ctx_output4/relu <- ctx_output4 I1106 16:38:10.920356 13633 net.cpp:527] ctx_output4/relu -> ctx_output4 (in-place) I1106 16:38:10.920361 13633 net.cpp:260] Setting up ctx_output4/relu I1106 16:38:10.920363 13633 net.cpp:267] TEST Top shape for layer 54 'ctx_output4/relu' 10 256 3 6 (46080) I1106 16:38:10.920367 13633 layer_factory.hpp:172] Creating layer 'ctx_output4_ctx_output4/relu_0_split' of type 'Split' I1106 16:38:10.920369 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.920372 13633 net.cpp:200] Created Layer ctx_output4_ctx_output4/relu_0_split (55) I1106 16:38:10.920374 13633 net.cpp:572] ctx_output4_ctx_output4/relu_0_split <- ctx_output4 I1106 16:38:10.920377 13633 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_0 I1106 
16:38:10.920382 13633 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_1 I1106 16:38:10.920387 13633 net.cpp:542] ctx_output4_ctx_output4/relu_0_split -> ctx_output4_ctx_output4/relu_0_split_2 I1106 16:38:10.920418 13633 net.cpp:260] Setting up ctx_output4_ctx_output4/relu_0_split I1106 16:38:10.920421 13633 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 10 256 3 6 (46080) I1106 16:38:10.920425 13633 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 10 256 3 6 (46080) I1106 16:38:10.920428 13633 net.cpp:267] TEST Top shape for layer 55 'ctx_output4_ctx_output4/relu_0_split' 10 256 3 6 (46080) I1106 16:38:10.920430 13633 layer_factory.hpp:172] Creating layer 'ctx_output5' of type 'Convolution' I1106 16:38:10.920433 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.920444 13633 net.cpp:200] Created Layer ctx_output5 (56) I1106 16:38:10.920445 13633 net.cpp:572] ctx_output5 <- pool8 I1106 16:38:10.920449 13633 net.cpp:542] ctx_output5 -> ctx_output5 I1106 16:38:10.921486 13633 net.cpp:260] Setting up ctx_output5 I1106 16:38:10.921492 13633 net.cpp:267] TEST Top shape for layer 56 'ctx_output5' 10 256 2 3 (15360) I1106 16:38:10.921496 13633 layer_factory.hpp:172] Creating layer 'ctx_output5/relu' of type 'ReLU' I1106 16:38:10.921499 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.921504 13633 net.cpp:200] Created Layer ctx_output5/relu (57) I1106 16:38:10.921506 13633 net.cpp:572] ctx_output5/relu <- ctx_output5 I1106 16:38:10.921517 13633 net.cpp:527] ctx_output5/relu -> ctx_output5 (in-place) I1106 16:38:10.921521 13633 net.cpp:260] Setting up ctx_output5/relu I1106 16:38:10.921525 13633 net.cpp:267] TEST Top shape for layer 57 'ctx_output5/relu' 10 256 2 3 (15360) I1106 16:38:10.921528 13633 layer_factory.hpp:172] Creating layer 
'ctx_output1/relu_mbox_loc' of type 'Convolution' I1106 16:38:10.921530 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.921541 13633 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc (58) I1106 16:38:10.921545 13633 net.cpp:572] ctx_output1/relu_mbox_loc <- ctx_output1_ctx_output1/relu_0_split_0 I1106 16:38:10.921548 13633 net.cpp:542] ctx_output1/relu_mbox_loc -> ctx_output1/relu_mbox_loc I1106 16:38:10.921720 13633 net.cpp:260] Setting up ctx_output1/relu_mbox_loc I1106 16:38:10.921725 13633 net.cpp:267] TEST Top shape for layer 58 'ctx_output1/relu_mbox_loc' 10 16 20 48 (153600) I1106 16:38:10.921730 13633 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:10.921734 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.921747 13633 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_perm (59) I1106 16:38:10.921751 13633 net.cpp:572] ctx_output1/relu_mbox_loc_perm <- ctx_output1/relu_mbox_loc I1106 16:38:10.921753 13633 net.cpp:542] ctx_output1/relu_mbox_loc_perm -> ctx_output1/relu_mbox_loc_perm I1106 16:38:10.921820 13633 net.cpp:260] Setting up ctx_output1/relu_mbox_loc_perm I1106 16:38:10.921824 13633 net.cpp:267] TEST Top shape for layer 59 'ctx_output1/relu_mbox_loc_perm' 10 20 48 16 (153600) I1106 16:38:10.921828 13633 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:10.921830 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.921839 13633 net.cpp:200] Created Layer ctx_output1/relu_mbox_loc_flat (60) I1106 16:38:10.921842 13633 net.cpp:572] ctx_output1/relu_mbox_loc_flat <- ctx_output1/relu_mbox_loc_perm I1106 16:38:10.921844 13633 net.cpp:542] ctx_output1/relu_mbox_loc_flat -> ctx_output1/relu_mbox_loc_flat I1106 16:38:10.922643 13633 net.cpp:260] Setting up 
ctx_output1/relu_mbox_loc_flat I1106 16:38:10.922653 13633 net.cpp:267] TEST Top shape for layer 60 'ctx_output1/relu_mbox_loc_flat' 10 15360 (153600) I1106 16:38:10.922657 13633 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf' of type 'Convolution' I1106 16:38:10.922660 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.922672 13633 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf (61) I1106 16:38:10.922674 13633 net.cpp:572] ctx_output1/relu_mbox_conf <- ctx_output1_ctx_output1/relu_0_split_1 I1106 16:38:10.922679 13633 net.cpp:542] ctx_output1/relu_mbox_conf -> ctx_output1/relu_mbox_conf I1106 16:38:10.922842 13633 net.cpp:260] Setting up ctx_output1/relu_mbox_conf I1106 16:38:10.922848 13633 net.cpp:267] TEST Top shape for layer 61 'ctx_output1/relu_mbox_conf' 10 8 20 48 (76800) I1106 16:38:10.922853 13633 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:10.922857 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.922863 13633 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_perm (62) I1106 16:38:10.922866 13633 net.cpp:572] ctx_output1/relu_mbox_conf_perm <- ctx_output1/relu_mbox_conf I1106 16:38:10.922870 13633 net.cpp:542] ctx_output1/relu_mbox_conf_perm -> ctx_output1/relu_mbox_conf_perm I1106 16:38:10.922927 13633 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_perm I1106 16:38:10.922931 13633 net.cpp:267] TEST Top shape for layer 62 'ctx_output1/relu_mbox_conf_perm' 10 20 48 8 (76800) I1106 16:38:10.922935 13633 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:10.922945 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.922951 13633 net.cpp:200] Created Layer ctx_output1/relu_mbox_conf_flat (63) I1106 16:38:10.922955 13633 net.cpp:572] 
ctx_output1/relu_mbox_conf_flat <- ctx_output1/relu_mbox_conf_perm I1106 16:38:10.922956 13633 net.cpp:542] ctx_output1/relu_mbox_conf_flat -> ctx_output1/relu_mbox_conf_flat I1106 16:38:10.923015 13633 net.cpp:260] Setting up ctx_output1/relu_mbox_conf_flat I1106 16:38:10.923020 13633 net.cpp:267] TEST Top shape for layer 63 'ctx_output1/relu_mbox_conf_flat' 10 7680 (76800) I1106 16:38:10.923023 13633 layer_factory.hpp:172] Creating layer 'ctx_output1/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:10.923027 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.923038 13633 net.cpp:200] Created Layer ctx_output1/relu_mbox_priorbox (64) I1106 16:38:10.923041 13633 net.cpp:572] ctx_output1/relu_mbox_priorbox <- ctx_output1_ctx_output1/relu_0_split_2 I1106 16:38:10.923043 13633 net.cpp:572] ctx_output1/relu_mbox_priorbox <- data_data_0_split_1 I1106 16:38:10.923048 13633 net.cpp:542] ctx_output1/relu_mbox_priorbox -> ctx_output1/relu_mbox_priorbox I1106 16:38:10.923063 13633 net.cpp:260] Setting up ctx_output1/relu_mbox_priorbox I1106 16:38:10.923068 13633 net.cpp:267] TEST Top shape for layer 64 'ctx_output1/relu_mbox_priorbox' 1 2 15360 (30720) I1106 16:38:10.923070 13633 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc' of type 'Convolution' I1106 16:38:10.923072 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.923081 13633 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc (65) I1106 16:38:10.923084 13633 net.cpp:572] ctx_output2/relu_mbox_loc <- ctx_output2_ctx_output2/relu_0_split_0 I1106 16:38:10.923087 13633 net.cpp:542] ctx_output2/relu_mbox_loc -> ctx_output2/relu_mbox_loc I1106 16:38:10.923282 13633 net.cpp:260] Setting up ctx_output2/relu_mbox_loc I1106 16:38:10.923288 13633 net.cpp:267] TEST Top shape for layer 65 'ctx_output2/relu_mbox_loc' 10 24 10 24 (57600) I1106 16:38:10.923293 13633 layer_factory.hpp:172] 
Creating layer 'ctx_output2/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:10.923296 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.923301 13633 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_perm (66) I1106 16:38:10.923305 13633 net.cpp:572] ctx_output2/relu_mbox_loc_perm <- ctx_output2/relu_mbox_loc I1106 16:38:10.923308 13633 net.cpp:542] ctx_output2/relu_mbox_loc_perm -> ctx_output2/relu_mbox_loc_perm I1106 16:38:10.923362 13633 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_perm I1106 16:38:10.923367 13633 net.cpp:267] TEST Top shape for layer 66 'ctx_output2/relu_mbox_loc_perm' 10 10 24 24 (57600) I1106 16:38:10.923370 13633 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:10.923373 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.923377 13633 net.cpp:200] Created Layer ctx_output2/relu_mbox_loc_flat (67) I1106 16:38:10.923379 13633 net.cpp:572] ctx_output2/relu_mbox_loc_flat <- ctx_output2/relu_mbox_loc_perm I1106 16:38:10.923382 13633 net.cpp:542] ctx_output2/relu_mbox_loc_flat -> ctx_output2/relu_mbox_loc_flat I1106 16:38:10.923426 13633 net.cpp:260] Setting up ctx_output2/relu_mbox_loc_flat I1106 16:38:10.923431 13633 net.cpp:267] TEST Top shape for layer 67 'ctx_output2/relu_mbox_loc_flat' 10 5760 (57600) I1106 16:38:10.923434 13633 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf' of type 'Convolution' I1106 16:38:10.923437 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.923446 13633 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf (68) I1106 16:38:10.923449 13633 net.cpp:572] ctx_output2/relu_mbox_conf <- ctx_output2_ctx_output2/relu_0_split_1 I1106 16:38:10.923460 13633 net.cpp:542] ctx_output2/relu_mbox_conf -> ctx_output2/relu_mbox_conf I1106 16:38:10.923622 13633 net.cpp:260] 
Setting up ctx_output2/relu_mbox_conf I1106 16:38:10.923629 13633 net.cpp:267] TEST Top shape for layer 68 'ctx_output2/relu_mbox_conf' 10 12 10 24 (28800) I1106 16:38:10.923633 13633 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:10.923636 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.923642 13633 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_perm (69) I1106 16:38:10.923646 13633 net.cpp:572] ctx_output2/relu_mbox_conf_perm <- ctx_output2/relu_mbox_conf I1106 16:38:10.923650 13633 net.cpp:542] ctx_output2/relu_mbox_conf_perm -> ctx_output2/relu_mbox_conf_perm I1106 16:38:10.923717 13633 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_perm I1106 16:38:10.923722 13633 net.cpp:267] TEST Top shape for layer 69 'ctx_output2/relu_mbox_conf_perm' 10 10 24 12 (28800) I1106 16:38:10.923727 13633 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:10.923728 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.923732 13633 net.cpp:200] Created Layer ctx_output2/relu_mbox_conf_flat (70) I1106 16:38:10.923735 13633 net.cpp:572] ctx_output2/relu_mbox_conf_flat <- ctx_output2/relu_mbox_conf_perm I1106 16:38:10.923739 13633 net.cpp:542] ctx_output2/relu_mbox_conf_flat -> ctx_output2/relu_mbox_conf_flat I1106 16:38:10.923775 13633 net.cpp:260] Setting up ctx_output2/relu_mbox_conf_flat I1106 16:38:10.923780 13633 net.cpp:267] TEST Top shape for layer 70 'ctx_output2/relu_mbox_conf_flat' 10 2880 (28800) I1106 16:38:10.923784 13633 layer_factory.hpp:172] Creating layer 'ctx_output2/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:10.923786 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.923791 13633 net.cpp:200] Created Layer ctx_output2/relu_mbox_priorbox (71) I1106 16:38:10.923794 
13633 net.cpp:572] ctx_output2/relu_mbox_priorbox <- ctx_output2_ctx_output2/relu_0_split_2 I1106 16:38:10.923796 13633 net.cpp:572] ctx_output2/relu_mbox_priorbox <- data_data_0_split_2 I1106 16:38:10.923800 13633 net.cpp:542] ctx_output2/relu_mbox_priorbox -> ctx_output2/relu_mbox_priorbox I1106 16:38:10.923815 13633 net.cpp:260] Setting up ctx_output2/relu_mbox_priorbox I1106 16:38:10.923820 13633 net.cpp:267] TEST Top shape for layer 71 'ctx_output2/relu_mbox_priorbox' 1 2 5760 (11520) I1106 16:38:10.923821 13633 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc' of type 'Convolution' I1106 16:38:10.923825 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.923831 13633 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc (72) I1106 16:38:10.923835 13633 net.cpp:572] ctx_output3/relu_mbox_loc <- ctx_output3_ctx_output3/relu_0_split_0 I1106 16:38:10.923837 13633 net.cpp:542] ctx_output3/relu_mbox_loc -> ctx_output3/relu_mbox_loc I1106 16:38:10.924031 13633 net.cpp:260] Setting up ctx_output3/relu_mbox_loc I1106 16:38:10.924036 13633 net.cpp:267] TEST Top shape for layer 72 'ctx_output3/relu_mbox_loc' 10 24 5 12 (14400) I1106 16:38:10.924041 13633 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:10.924044 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.924051 13633 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_perm (73) I1106 16:38:10.924053 13633 net.cpp:572] ctx_output3/relu_mbox_loc_perm <- ctx_output3/relu_mbox_loc I1106 16:38:10.924055 13633 net.cpp:542] ctx_output3/relu_mbox_loc_perm -> ctx_output3/relu_mbox_loc_perm I1106 16:38:10.924113 13633 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_perm I1106 16:38:10.924119 13633 net.cpp:267] TEST Top shape for layer 73 'ctx_output3/relu_mbox_loc_perm' 10 5 12 24 (14400) I1106 16:38:10.924129 13633 layer_factory.hpp:172] 
Creating layer 'ctx_output3/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:10.924131 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.924134 13633 net.cpp:200] Created Layer ctx_output3/relu_mbox_loc_flat (74) I1106 16:38:10.924137 13633 net.cpp:572] ctx_output3/relu_mbox_loc_flat <- ctx_output3/relu_mbox_loc_perm I1106 16:38:10.924140 13633 net.cpp:542] ctx_output3/relu_mbox_loc_flat -> ctx_output3/relu_mbox_loc_flat I1106 16:38:10.924180 13633 net.cpp:260] Setting up ctx_output3/relu_mbox_loc_flat I1106 16:38:10.924185 13633 net.cpp:267] TEST Top shape for layer 74 'ctx_output3/relu_mbox_loc_flat' 10 1440 (14400) I1106 16:38:10.924187 13633 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf' of type 'Convolution' I1106 16:38:10.924190 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.924198 13633 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf (75) I1106 16:38:10.924201 13633 net.cpp:572] ctx_output3/relu_mbox_conf <- ctx_output3_ctx_output3/relu_0_split_1 I1106 16:38:10.924206 13633 net.cpp:542] ctx_output3/relu_mbox_conf -> ctx_output3/relu_mbox_conf I1106 16:38:10.924365 13633 net.cpp:260] Setting up ctx_output3/relu_mbox_conf I1106 16:38:10.924369 13633 net.cpp:267] TEST Top shape for layer 75 'ctx_output3/relu_mbox_conf' 10 12 5 12 (7200) I1106 16:38:10.924374 13633 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:10.924377 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.924384 13633 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_perm (76) I1106 16:38:10.924387 13633 net.cpp:572] ctx_output3/relu_mbox_conf_perm <- ctx_output3/relu_mbox_conf I1106 16:38:10.924391 13633 net.cpp:542] ctx_output3/relu_mbox_conf_perm -> ctx_output3/relu_mbox_conf_perm I1106 16:38:10.924446 13633 net.cpp:260] Setting 
up ctx_output3/relu_mbox_conf_perm I1106 16:38:10.924451 13633 net.cpp:267] TEST Top shape for layer 76 'ctx_output3/relu_mbox_conf_perm' 10 5 12 12 (7200) I1106 16:38:10.924454 13633 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:10.924458 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.924461 13633 net.cpp:200] Created Layer ctx_output3/relu_mbox_conf_flat (77) I1106 16:38:10.924464 13633 net.cpp:572] ctx_output3/relu_mbox_conf_flat <- ctx_output3/relu_mbox_conf_perm I1106 16:38:10.924468 13633 net.cpp:542] ctx_output3/relu_mbox_conf_flat -> ctx_output3/relu_mbox_conf_flat I1106 16:38:10.924505 13633 net.cpp:260] Setting up ctx_output3/relu_mbox_conf_flat I1106 16:38:10.924510 13633 net.cpp:267] TEST Top shape for layer 77 'ctx_output3/relu_mbox_conf_flat' 10 720 (7200) I1106 16:38:10.924512 13633 layer_factory.hpp:172] Creating layer 'ctx_output3/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:10.924515 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.924518 13633 net.cpp:200] Created Layer ctx_output3/relu_mbox_priorbox (78) I1106 16:38:10.924521 13633 net.cpp:572] ctx_output3/relu_mbox_priorbox <- ctx_output3_ctx_output3/relu_0_split_2 I1106 16:38:10.924525 13633 net.cpp:572] ctx_output3/relu_mbox_priorbox <- data_data_0_split_3 I1106 16:38:10.924527 13633 net.cpp:542] ctx_output3/relu_mbox_priorbox -> ctx_output3/relu_mbox_priorbox I1106 16:38:10.924541 13633 net.cpp:260] Setting up ctx_output3/relu_mbox_priorbox I1106 16:38:10.924546 13633 net.cpp:267] TEST Top shape for layer 78 'ctx_output3/relu_mbox_priorbox' 1 2 1440 (2880) I1106 16:38:10.924549 13633 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc' of type 'Convolution' I1106 16:38:10.924552 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.924559 
13633 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc (79) I1106 16:38:10.924569 13633 net.cpp:572] ctx_output4/relu_mbox_loc <- ctx_output4_ctx_output4/relu_0_split_0 I1106 16:38:10.924572 13633 net.cpp:542] ctx_output4/relu_mbox_loc -> ctx_output4/relu_mbox_loc I1106 16:38:10.924741 13633 net.cpp:260] Setting up ctx_output4/relu_mbox_loc I1106 16:38:10.924746 13633 net.cpp:267] TEST Top shape for layer 79 'ctx_output4/relu_mbox_loc' 10 16 3 6 (2880) I1106 16:38:10.924751 13633 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_perm' of type 'Permute' I1106 16:38:10.924754 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.924759 13633 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_perm (80) I1106 16:38:10.924762 13633 net.cpp:572] ctx_output4/relu_mbox_loc_perm <- ctx_output4/relu_mbox_loc I1106 16:38:10.924765 13633 net.cpp:542] ctx_output4/relu_mbox_loc_perm -> ctx_output4/relu_mbox_loc_perm I1106 16:38:10.924819 13633 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_perm I1106 16:38:10.924823 13633 net.cpp:267] TEST Top shape for layer 80 'ctx_output4/relu_mbox_loc_perm' 10 3 6 16 (2880) I1106 16:38:10.924826 13633 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_loc_flat' of type 'Flatten' I1106 16:38:10.924829 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.924834 13633 net.cpp:200] Created Layer ctx_output4/relu_mbox_loc_flat (81) I1106 16:38:10.924835 13633 net.cpp:572] ctx_output4/relu_mbox_loc_flat <- ctx_output4/relu_mbox_loc_perm I1106 16:38:10.924839 13633 net.cpp:542] ctx_output4/relu_mbox_loc_flat -> ctx_output4/relu_mbox_loc_flat I1106 16:38:10.924877 13633 net.cpp:260] Setting up ctx_output4/relu_mbox_loc_flat I1106 16:38:10.924881 13633 net.cpp:267] TEST Top shape for layer 81 'ctx_output4/relu_mbox_loc_flat' 10 288 (2880) I1106 16:38:10.924883 13633 layer_factory.hpp:172] Creating layer 
'ctx_output4/relu_mbox_conf' of type 'Convolution' I1106 16:38:10.924886 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.924896 13633 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf (82) I1106 16:38:10.924899 13633 net.cpp:572] ctx_output4/relu_mbox_conf <- ctx_output4_ctx_output4/relu_0_split_1 I1106 16:38:10.924901 13633 net.cpp:542] ctx_output4/relu_mbox_conf -> ctx_output4/relu_mbox_conf I1106 16:38:10.925060 13633 net.cpp:260] Setting up ctx_output4/relu_mbox_conf I1106 16:38:10.925065 13633 net.cpp:267] TEST Top shape for layer 82 'ctx_output4/relu_mbox_conf' 10 8 3 6 (1440) I1106 16:38:10.925070 13633 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_perm' of type 'Permute' I1106 16:38:10.925072 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.925078 13633 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf_perm (83) I1106 16:38:10.925081 13633 net.cpp:572] ctx_output4/relu_mbox_conf_perm <- ctx_output4/relu_mbox_conf I1106 16:38:10.925084 13633 net.cpp:542] ctx_output4/relu_mbox_conf_perm -> ctx_output4/relu_mbox_conf_perm I1106 16:38:10.925141 13633 net.cpp:260] Setting up ctx_output4/relu_mbox_conf_perm I1106 16:38:10.925146 13633 net.cpp:267] TEST Top shape for layer 83 'ctx_output4/relu_mbox_conf_perm' 10 3 6 8 (1440) I1106 16:38:10.925149 13633 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_conf_flat' of type 'Flatten' I1106 16:38:10.925151 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.925155 13633 net.cpp:200] Created Layer ctx_output4/relu_mbox_conf_flat (84) I1106 16:38:10.925158 13633 net.cpp:572] ctx_output4/relu_mbox_conf_flat <- ctx_output4/relu_mbox_conf_perm I1106 16:38:10.925161 13633 net.cpp:542] ctx_output4/relu_mbox_conf_flat -> ctx_output4/relu_mbox_conf_flat I1106 16:38:10.925199 13633 net.cpp:260] Setting up 
ctx_output4/relu_mbox_conf_flat I1106 16:38:10.925204 13633 net.cpp:267] TEST Top shape for layer 84 'ctx_output4/relu_mbox_conf_flat' 10 144 (1440) I1106 16:38:10.925206 13633 layer_factory.hpp:172] Creating layer 'ctx_output4/relu_mbox_priorbox' of type 'PriorBox' I1106 16:38:10.925216 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.925222 13633 net.cpp:200] Created Layer ctx_output4/relu_mbox_priorbox (85) I1106 16:38:10.925225 13633 net.cpp:572] ctx_output4/relu_mbox_priorbox <- ctx_output4_ctx_output4/relu_0_split_2 I1106 16:38:10.925227 13633 net.cpp:572] ctx_output4/relu_mbox_priorbox <- data_data_0_split_4 I1106 16:38:10.925231 13633 net.cpp:542] ctx_output4/relu_mbox_priorbox -> ctx_output4/relu_mbox_priorbox I1106 16:38:10.925246 13633 net.cpp:260] Setting up ctx_output4/relu_mbox_priorbox I1106 16:38:10.925249 13633 net.cpp:267] TEST Top shape for layer 85 'ctx_output4/relu_mbox_priorbox' 1 2 288 (576) I1106 16:38:10.925251 13633 layer_factory.hpp:172] Creating layer 'mbox_loc' of type 'Concat' I1106 16:38:10.925254 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.925261 13633 net.cpp:200] Created Layer mbox_loc (86) I1106 16:38:10.925263 13633 net.cpp:572] mbox_loc <- ctx_output1/relu_mbox_loc_flat I1106 16:38:10.925266 13633 net.cpp:572] mbox_loc <- ctx_output2/relu_mbox_loc_flat I1106 16:38:10.925271 13633 net.cpp:572] mbox_loc <- ctx_output3/relu_mbox_loc_flat I1106 16:38:10.925273 13633 net.cpp:572] mbox_loc <- ctx_output4/relu_mbox_loc_flat I1106 16:38:10.925276 13633 net.cpp:542] mbox_loc -> mbox_loc I1106 16:38:10.925289 13633 net.cpp:260] Setting up mbox_loc I1106 16:38:10.925293 13633 net.cpp:267] TEST Top shape for layer 86 'mbox_loc' 10 22848 (228480) I1106 16:38:10.925297 13633 layer_factory.hpp:172] Creating layer 'mbox_conf' of type 'Concat' I1106 16:38:10.925298 13633 layer_factory.hpp:184] Layer's types are 
Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.925302 13633 net.cpp:200] Created Layer mbox_conf (87) I1106 16:38:10.925305 13633 net.cpp:572] mbox_conf <- ctx_output1/relu_mbox_conf_flat I1106 16:38:10.925307 13633 net.cpp:572] mbox_conf <- ctx_output2/relu_mbox_conf_flat I1106 16:38:10.925310 13633 net.cpp:572] mbox_conf <- ctx_output3/relu_mbox_conf_flat I1106 16:38:10.925313 13633 net.cpp:572] mbox_conf <- ctx_output4/relu_mbox_conf_flat I1106 16:38:10.925316 13633 net.cpp:542] mbox_conf -> mbox_conf I1106 16:38:10.925328 13633 net.cpp:260] Setting up mbox_conf I1106 16:38:10.925333 13633 net.cpp:267] TEST Top shape for layer 87 'mbox_conf' 10 11424 (114240) I1106 16:38:10.925334 13633 layer_factory.hpp:172] Creating layer 'mbox_priorbox' of type 'Concat' I1106 16:38:10.925338 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.925340 13633 net.cpp:200] Created Layer mbox_priorbox (88) I1106 16:38:10.925343 13633 net.cpp:572] mbox_priorbox <- ctx_output1/relu_mbox_priorbox I1106 16:38:10.925345 13633 net.cpp:572] mbox_priorbox <- ctx_output2/relu_mbox_priorbox I1106 16:38:10.925348 13633 net.cpp:572] mbox_priorbox <- ctx_output3/relu_mbox_priorbox I1106 16:38:10.925350 13633 net.cpp:572] mbox_priorbox <- ctx_output4/relu_mbox_priorbox I1106 16:38:10.925354 13633 net.cpp:542] mbox_priorbox -> mbox_priorbox I1106 16:38:10.925366 13633 net.cpp:260] Setting up mbox_priorbox I1106 16:38:10.925370 13633 net.cpp:267] TEST Top shape for layer 88 'mbox_priorbox' 1 2 22848 (45696) I1106 16:38:10.925372 13633 layer_factory.hpp:172] Creating layer 'mbox_conf_reshape' of type 'Reshape' I1106 16:38:10.925375 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.925381 13633 net.cpp:200] Created Layer mbox_conf_reshape (89) I1106 16:38:10.925384 13633 net.cpp:572] mbox_conf_reshape <- mbox_conf I1106 16:38:10.925387 13633 net.cpp:542] 
mbox_conf_reshape -> mbox_conf_reshape I1106 16:38:10.925402 13633 net.cpp:260] Setting up mbox_conf_reshape I1106 16:38:10.925406 13633 net.cpp:267] TEST Top shape for layer 89 'mbox_conf_reshape' 10 5712 2 (114240) I1106 16:38:10.925410 13633 layer_factory.hpp:172] Creating layer 'mbox_conf_softmax' of type 'Softmax' I1106 16:38:10.925417 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.925431 13633 net.cpp:200] Created Layer mbox_conf_softmax (90) I1106 16:38:10.925433 13633 net.cpp:572] mbox_conf_softmax <- mbox_conf_reshape I1106 16:38:10.925436 13633 net.cpp:542] mbox_conf_softmax -> mbox_conf_softmax I1106 16:38:10.925478 13633 net.cpp:260] Setting up mbox_conf_softmax I1106 16:38:10.925482 13633 net.cpp:267] TEST Top shape for layer 90 'mbox_conf_softmax' 10 5712 2 (114240) I1106 16:38:10.925484 13633 layer_factory.hpp:172] Creating layer 'mbox_conf_flatten' of type 'Flatten' I1106 16:38:10.925487 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.925490 13633 net.cpp:200] Created Layer mbox_conf_flatten (91) I1106 16:38:10.925493 13633 net.cpp:572] mbox_conf_flatten <- mbox_conf_softmax I1106 16:38:10.925495 13633 net.cpp:542] mbox_conf_flatten -> mbox_conf_flatten I1106 16:38:10.925550 13633 net.cpp:260] Setting up mbox_conf_flatten I1106 16:38:10.925554 13633 net.cpp:267] TEST Top shape for layer 91 'mbox_conf_flatten' 10 11424 (114240) I1106 16:38:10.925557 13633 layer_factory.hpp:172] Creating layer 'detection_out' of type 'DetectionOutput' I1106 16:38:10.925560 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.925572 13633 net.cpp:200] Created Layer detection_out (92) I1106 16:38:10.925575 13633 net.cpp:572] detection_out <- mbox_loc I1106 16:38:10.925577 13633 net.cpp:572] detection_out <- mbox_conf_flatten I1106 16:38:10.925580 13633 net.cpp:572] detection_out <- 
mbox_priorbox I1106 16:38:10.925585 13633 net.cpp:542] detection_out -> detection_out I1106 16:38:10.925680 13633 net.cpp:260] Setting up detection_out I1106 16:38:10.925686 13633 net.cpp:267] TEST Top shape for layer 92 'detection_out' 1 1 1 7 (7) I1106 16:38:10.925689 13633 layer_factory.hpp:172] Creating layer 'detection_eval' of type 'DetectionEvaluate' I1106 16:38:10.925693 13633 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I1106 16:38:10.925698 13633 net.cpp:200] Created Layer detection_eval (93) I1106 16:38:10.925700 13633 net.cpp:572] detection_eval <- detection_out I1106 16:38:10.925704 13633 net.cpp:572] detection_eval <- label I1106 16:38:10.925706 13633 net.cpp:542] detection_eval -> detection_eval I1106 16:38:10.925741 13633 net.cpp:260] Setting up detection_eval I1106 16:38:10.925746 13633 net.cpp:267] TEST Top shape for layer 93 'detection_eval' 1 1 2 5 (10) I1106 16:38:10.925750 13633 net.cpp:338] detection_eval does not need backward computation. I1106 16:38:10.925752 13633 net.cpp:338] detection_out does not need backward computation. I1106 16:38:10.925755 13633 net.cpp:338] mbox_conf_flatten does not need backward computation. I1106 16:38:10.925756 13633 net.cpp:338] mbox_conf_softmax does not need backward computation. I1106 16:38:10.925760 13633 net.cpp:338] mbox_conf_reshape does not need backward computation. I1106 16:38:10.925762 13633 net.cpp:338] mbox_priorbox does not need backward computation. I1106 16:38:10.925765 13633 net.cpp:338] mbox_conf does not need backward computation. I1106 16:38:10.925767 13633 net.cpp:338] mbox_loc does not need backward computation. I1106 16:38:10.925770 13633 net.cpp:338] ctx_output4/relu_mbox_priorbox does not need backward computation. I1106 16:38:10.925773 13633 net.cpp:338] ctx_output4/relu_mbox_conf_flat does not need backward computation. I1106 16:38:10.925776 13633 net.cpp:338] ctx_output4/relu_mbox_conf_perm does not need backward computation. 
I1106 16:38:10.925777 13633 net.cpp:338] ctx_output4/relu_mbox_conf does not need backward computation. I1106 16:38:10.925781 13633 net.cpp:338] ctx_output4/relu_mbox_loc_flat does not need backward computation. I1106 16:38:10.925783 13633 net.cpp:338] ctx_output4/relu_mbox_loc_perm does not need backward computation. I1106 16:38:10.925786 13633 net.cpp:338] ctx_output4/relu_mbox_loc does not need backward computation. I1106 16:38:10.925794 13633 net.cpp:338] ctx_output3/relu_mbox_priorbox does not need backward computation. I1106 16:38:10.925796 13633 net.cpp:338] ctx_output3/relu_mbox_conf_flat does not need backward computation. I1106 16:38:10.925799 13633 net.cpp:338] ctx_output3/relu_mbox_conf_perm does not need backward computation. I1106 16:38:10.925801 13633 net.cpp:338] ctx_output3/relu_mbox_conf does not need backward computation. I1106 16:38:10.925804 13633 net.cpp:338] ctx_output3/relu_mbox_loc_flat does not need backward computation. I1106 16:38:10.925806 13633 net.cpp:338] ctx_output3/relu_mbox_loc_perm does not need backward computation. I1106 16:38:10.925809 13633 net.cpp:338] ctx_output3/relu_mbox_loc does not need backward computation. I1106 16:38:10.925812 13633 net.cpp:338] ctx_output2/relu_mbox_priorbox does not need backward computation. I1106 16:38:10.925813 13633 net.cpp:338] ctx_output2/relu_mbox_conf_flat does not need backward computation. I1106 16:38:10.925817 13633 net.cpp:338] ctx_output2/relu_mbox_conf_perm does not need backward computation. I1106 16:38:10.925818 13633 net.cpp:338] ctx_output2/relu_mbox_conf does not need backward computation. I1106 16:38:10.925820 13633 net.cpp:338] ctx_output2/relu_mbox_loc_flat does not need backward computation. I1106 16:38:10.925822 13633 net.cpp:338] ctx_output2/relu_mbox_loc_perm does not need backward computation. I1106 16:38:10.925824 13633 net.cpp:338] ctx_output2/relu_mbox_loc does not need backward computation. 
I1106 16:38:10.925827 13633 net.cpp:338] ctx_output1/relu_mbox_priorbox does not need backward computation.
I1106 16:38:10.925829 13633 net.cpp:338] ctx_output1/relu_mbox_conf_flat does not need backward computation.
I1106 16:38:10.925832 13633 net.cpp:338] ctx_output1/relu_mbox_conf_perm does not need backward computation.
I1106 16:38:10.925833 13633 net.cpp:338] ctx_output1/relu_mbox_conf does not need backward computation.
I1106 16:38:10.925837 13633 net.cpp:338] ctx_output1/relu_mbox_loc_flat does not need backward computation.
I1106 16:38:10.925839 13633 net.cpp:338] ctx_output1/relu_mbox_loc_perm does not need backward computation.
I1106 16:38:10.925840 13633 net.cpp:338] ctx_output1/relu_mbox_loc does not need backward computation.
I1106 16:38:10.925843 13633 net.cpp:338] ctx_output5/relu does not need backward computation.
I1106 16:38:10.925845 13633 net.cpp:338] ctx_output5 does not need backward computation.
I1106 16:38:10.925848 13633 net.cpp:338] ctx_output4_ctx_output4/relu_0_split does not need backward computation.
I1106 16:38:10.925850 13633 net.cpp:338] ctx_output4/relu does not need backward computation.
I1106 16:38:10.925853 13633 net.cpp:338] ctx_output4 does not need backward computation.
I1106 16:38:10.925855 13633 net.cpp:338] ctx_output3_ctx_output3/relu_0_split does not need backward computation.
I1106 16:38:10.925858 13633 net.cpp:338] ctx_output3/relu does not need backward computation.
I1106 16:38:10.925860 13633 net.cpp:338] ctx_output3 does not need backward computation.
I1106 16:38:10.925863 13633 net.cpp:338] ctx_output2_ctx_output2/relu_0_split does not need backward computation.
I1106 16:38:10.925866 13633 net.cpp:338] ctx_output2/relu does not need backward computation.
I1106 16:38:10.925868 13633 net.cpp:338] ctx_output2 does not need backward computation.
I1106 16:38:10.925871 13633 net.cpp:338] ctx_output1_ctx_output1/relu_0_split does not need backward computation.
I1106 16:38:10.925873 13633 net.cpp:338] ctx_output1/relu does not need backward computation.
I1106 16:38:10.925875 13633 net.cpp:338] ctx_output1 does not need backward computation.
I1106 16:38:10.925878 13633 net.cpp:338] pool8 does not need backward computation.
I1106 16:38:10.925880 13633 net.cpp:338] pool7_pool7_0_split does not need backward computation.
I1106 16:38:10.925884 13633 net.cpp:338] pool7 does not need backward computation.
I1106 16:38:10.925886 13633 net.cpp:338] pool6_pool6_0_split does not need backward computation.
I1106 16:38:10.925887 13633 net.cpp:338] pool6 does not need backward computation.
I1106 16:38:10.925890 13633 net.cpp:338] res5a_branch2b_res5a_branch2b/relu_0_split does not need backward computation.
I1106 16:38:10.925897 13633 net.cpp:338] res5a_branch2b/relu does not need backward computation.
I1106 16:38:10.925900 13633 net.cpp:338] res5a_branch2b/bn does not need backward computation.
I1106 16:38:10.925902 13633 net.cpp:338] res5a_branch2b does not need backward computation.
I1106 16:38:10.925904 13633 net.cpp:338] res5a_branch2a/relu does not need backward computation.
I1106 16:38:10.925906 13633 net.cpp:338] res5a_branch2a/bn does not need backward computation.
I1106 16:38:10.925909 13633 net.cpp:338] res5a_branch2a does not need backward computation.
I1106 16:38:10.925911 13633 net.cpp:338] pool4 does not need backward computation.
I1106 16:38:10.925913 13633 net.cpp:338] res4a_branch2b_res4a_branch2b/relu_0_split does not need backward computation.
I1106 16:38:10.925916 13633 net.cpp:338] res4a_branch2b/relu does not need backward computation.
I1106 16:38:10.925920 13633 net.cpp:338] res4a_branch2b/bn does not need backward computation.
I1106 16:38:10.925921 13633 net.cpp:338] res4a_branch2b does not need backward computation.
I1106 16:38:10.925923 13633 net.cpp:338] res4a_branch2a/relu does not need backward computation.
I1106 16:38:10.925925 13633 net.cpp:338] res4a_branch2a/bn does not need backward computation.
I1106 16:38:10.925928 13633 net.cpp:338] res4a_branch2a does not need backward computation.
I1106 16:38:10.925930 13633 net.cpp:338] pool3 does not need backward computation.
I1106 16:38:10.925932 13633 net.cpp:338] res3a_branch2b/relu does not need backward computation.
I1106 16:38:10.925935 13633 net.cpp:338] res3a_branch2b/bn does not need backward computation.
I1106 16:38:10.925937 13633 net.cpp:338] res3a_branch2b does not need backward computation.
I1106 16:38:10.925940 13633 net.cpp:338] res3a_branch2a/relu does not need backward computation.
I1106 16:38:10.925942 13633 net.cpp:338] res3a_branch2a/bn does not need backward computation.
I1106 16:38:10.925945 13633 net.cpp:338] res3a_branch2a does not need backward computation.
I1106 16:38:10.925947 13633 net.cpp:338] pool2 does not need backward computation.
I1106 16:38:10.925949 13633 net.cpp:338] res2a_branch2b/relu does not need backward computation.
I1106 16:38:10.925951 13633 net.cpp:338] res2a_branch2b/bn does not need backward computation.
I1106 16:38:10.925952 13633 net.cpp:338] res2a_branch2b does not need backward computation.
I1106 16:38:10.925956 13633 net.cpp:338] res2a_branch2a/relu does not need backward computation.
I1106 16:38:10.925957 13633 net.cpp:338] res2a_branch2a/bn does not need backward computation.
I1106 16:38:10.925961 13633 net.cpp:338] res2a_branch2a does not need backward computation.
I1106 16:38:10.925962 13633 net.cpp:338] pool1 does not need backward computation.
I1106 16:38:10.925964 13633 net.cpp:338] conv1b/relu does not need backward computation.
I1106 16:38:10.925966 13633 net.cpp:338] conv1b/bn does not need backward computation.
I1106 16:38:10.925968 13633 net.cpp:338] conv1b does not need backward computation.
I1106 16:38:10.925971 13633 net.cpp:338] conv1a/relu does not need backward computation.
I1106 16:38:10.925973 13633 net.cpp:338] conv1a/bn does not need backward computation.
I1106 16:38:10.925974 13633 net.cpp:338] conv1a does not need backward computation.
I1106 16:38:10.925976 13633 net.cpp:338] data/bias does not need backward computation.
I1106 16:38:10.925979 13633 net.cpp:338] data_data_0_split does not need backward computation.
I1106 16:38:10.925982 13633 net.cpp:338] data does not need backward computation.
I1106 16:38:10.925984 13633 net.cpp:380] This network produces output ctx_output5
I1106 16:38:10.925987 13633 net.cpp:380] This network produces output detection_eval
I1106 16:38:10.926048 13633 net.cpp:403] Top memory (TEST) required for data: 1264712584 diff: 1264712584
I1106 16:38:10.926053 13633 net.cpp:406] Bottom memory (TEST) required for data: 1264651104 diff: 1264651104
I1106 16:38:10.926054 13633 net.cpp:409] Shared (in-place) memory (TEST) by data: 622632960 diff: 622632960
I1106 16:38:10.926060 13633 net.cpp:412] Parameters memory (TEST) required for data: 11946688 diff: 11946688
I1106 16:38:10.926062 13633 net.cpp:415] Parameters shared memory (TEST) by data: 0 diff: 0
I1106 16:38:10.926064 13633 net.cpp:421] Network initialization done.
F1106 16:38:10.926213 13633 io.cpp:55] Check failed: fd != -1 (-1 vs. -1) File not found: training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel
*** Check failure stack trace: ***
    @     0x7f69c02ae5cd  google::LogMessage::Fail()
    @     0x7f69c02b0433  google::LogMessage::SendToLog()
    @     0x7f69c02ae15b  google::LogMessage::Flush()
    @     0x7f69c02b0e1e  google::LogMessageFatal::~LogMessageFatal()
    @     0x7f69c12bc6dc  caffe::ReadProtoFromBinaryFile()
    @     0x7f69c1334f56  caffe::ReadNetParamsFromBinaryFileOrDie()
    @     0x7f69c0e6b88a  caffe::Net::CopyTrainedLayersFromBinaryProto()
    @     0x7f69c0e6b92e  caffe::Net::CopyTrainedLayersFrom()
    @           0x41204c  test_detection()
    @           0x40d1f0  main
    @     0x7f69bea30830  __libc_start_main
    @           0x40de89  _start
    @               (nil)  (unknown)
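Editor's note on the abort above: the network builds and initializes cleanly; the fatal `Check failed: fd != -1` comes from `caffe::Net::CopyTrainedLayersFrom()` trying to load the `sparse` stage's snapshot, which was never written (most likely because that stage did not run, or did not reach iteration 120000). A minimal pre-flight sketch, assuming only the path taken verbatim from the log — the check itself is not part of the caffe-jacinto training scripts:

```shell
# Hypothetical pre-flight check for a multi-stage run (initial -> l1reg -> sparse):
# each stage loads the .caffemodel written by the stage before it, so verify the
# expected snapshot exists before the test phase tries to load it.
MODEL="training/ti-custom-cfg1/JDetNet/20191106_16-37_ds_PSP_dsFac_32_hdDS8_1/sparse/ti-custom-cfg1_ssdJacintoNetV2_iter_120000.caffemodel"
if [ -f "$MODEL" ]; then
    STATUS=ok
else
    STATUS=missing
fi
echo "pretrain model status: $STATUS ($MODEL)"
```

If the status is `missing`, rerun (or let finish) the stage that should have produced the snapshot before launching the next one.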