TDA2PXEVM: What does the data layer do with the Cityscapes dataset?

Part Number: TDA2PXEVM


Hi,

I am converting the Cityscapes dataset to LMDB for jsegnet21v2:

1. Download and unzip gtFine and leftImg8bit

2. Execute create_cityscapes_lists.sh

3. Execute create_cityscapes_segmentation_lmdb.sh

In step 1, I find that each image in leftImg8bit comes in 3 different sizes: 2048x1024, 1028x640, and 640x320.

In steps 2/3, image-list.txt contains images of all 3 sizes (2048x1024, 1028x640, 640x320), but label-list.txt only has 2048x1024 label maps.

Question:

a. Should I keep only the 2048x1024 images to match the label maps?

b. Should I modify the transform_param (crop_size: 640)?
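For reference, here is a minimal sketch of how I check the sizes (assuming Pillow is available and that image-list.txt holds one image path per line):

    from PIL import Image  # assumes Pillow is installed

    # report entries in image-list.txt that are not the original 2048x1024
    with open('image-list.txt') as f:
        for line in f:
            path = line.strip()
            if not path:
                continue
            w, h = Image.open(path).size
            if (w, h) != (2048, 1024):
                print('%s is %dx%d, not 2048x1024' % (path, w, h))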

  • Hi, Yordan,

    I have followed the documentation to complete the dataset preparation.

    But I meet an error when training/inferring jsegnet.

    I have trained/inferred jdnet/resnet/mobilenet successfully, so I don't understand why this happens.

    How do I solve this problem? 

  • It looks like you have these other resolutions also in the leftImg8bit/train or leftImg8bit/val folders. Please keep only the original images here and try again.
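
    If it helps, a minimal cleanup sketch (assuming the standard leftImg8bit/<split>/<city>/*.png layout and Pillow; it dry-runs by default, so double-check the output before deleting anything):

        import glob
        import os
        from PIL import Image  # assumes Pillow is installed

        # list (and optionally delete) every png under leftImg8bit/train and
        # leftImg8bit/val that is not the original 2048x1024 resolution
        DRY_RUN = True
        for split in ('train', 'val'):
            for path in glob.glob(os.path.join('leftImg8bit', split, '*', '*.png')):
                if Image.open(path).size != (2048, 1024):
                    print('%s %s' % ('would remove' if DRY_RUN else 'removing', path))
                    if not DRY_RUN:
                        os.remove(path)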

  • Hi, Manu

    Thanks for your help.

    I have done that to keep the same resolution, and I can now get the correct images and labels from the LMDB.

    Now I am looking into the new training error; the log is pasted in my reply to Yordan.

    I am trying to solve this error and train JSegNet well. Have you ever encountered this problem?

  • Hi, I am not able to exactly understand your scenario. Have you done one round of training exactly as given in the documentation (with only 2048x1024 images in the folder)? Others have used this training procedure, so I don't see a reason for it to fail.

    Best regards,

     

  • Hi,
    1. I have followed the documentation to finish the data preparation (with only 2048x1024 images).
    2. When I execute training, the program stops at "Creating layer res5a_branch2a":
    Check failed: status == CUDNN_STATUS_SUCCESS (9 vs. 0) CUDNN_STATUS_NOT_SUPPORTED

    So I just ran infer.sh with deploy.prototxt and the given caffemodel; the problem still exists.
    Though I have trained JDNET/ResNet/MobileNet on caffe-jacinto before.
    Now I think it may be caused by my CUDA/cuDNN versions:
    CUDA version 8.0.61
    cuDNN version 6.0.21
    Do those versions work for Jacinto?

    Best regards,
  • You may want to try after updating to cudnn7.
    Also, can you attach the full log file? (Did you attach it? I couldn't find it.)
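
    To double-check which cuDNN a process actually picks up, here is a minimal sketch via ctypes (cudnnGetVersion is a standard cuDNN API call; the library name may need adjusting, e.g. libcudnn.so.6 or libcudnn.so.7):

        import ctypes

        # prints e.g. 6021 for cuDNN 6.0.21 (major*1000 + minor*100 + patch)
        cudnn = ctypes.CDLL('libcudnn.so')
        cudnn.cudnnGetVersion.restype = ctypes.c_size_t
        print('cuDNN version: %d' % cudnn.cudnnGetVersion())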
  • Thanks so much for your help; I will have another shot after updating cuDNN.
    I attach the log file below: 

    Logging output to log/infer-log-2019-05-27_15-54-35.txt
    I0527 15:54:44.202505  3047 gpu_memory.cpp:105] GPUMemory::Manager initialized
    I0527 15:54:44.204417  3047 gpu_memory.cpp:107] Total memory: 12781551616, Free: 12614828032, dev_info[0]: total=12781551616 free=12614828032
    W0527 15:54:44.204493  3047 _caffe.cpp:172] DEPRECATION WARNING - deprecated use of Python interface
    W0527 15:54:44.204627  3047 _caffe.cpp:173] Use this instead (with the named "weights" parameter):
    W0527 15:54:44.204639  3047 _caffe.cpp:175] Net('deploy.prototxt', 1, weights='cityscapes5_jsegnet21v2_iter_120000.caffemodel')
    I0527 15:54:44.209640  3047 upgrade_proto.cpp:66] Attempting to upgrade input file specified using deprecated input fields: deploy.prototxt
    I0527 15:54:44.209677  3047 upgrade_proto.cpp:69] Successfully upgraded file specified using deprecated input fields.
    W0527 15:54:44.209686  3047 upgrade_proto.cpp:71] Note that future Caffe releases will only support input layers and not input fields.
    I0527 15:54:44.231184  3047 net.cpp:80] Initializing net from parameters: 
    name: "jsegnet21v2_deploy"
    state {
      phase: TEST
      level: 0
    }
    layer {
      name: "input"
      type: "Input"
      top: "data"
      input_param {
        shape {
          dim: 1
          dim: 3
          dim: 512
          dim: 1024
        }
      }
    }
    layer {
      name: "data/bias"
      type: "Bias"
      bottom: "data"
      top: "data/bias"
      param {
        lr_mult: 0
        decay_mult: 0
      }
      bias_param {
        filler {
          type: "constant"
          value: -128
        }
      }
    }
    layer {
      name: "conv1a"
      type: "Convolution"
      bottom: "data/bias"
      top: "conv1a"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 32
        bias_term: true
        pad: 2
        kernel_size: 5
        group: 1
        stride: 2
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 1
      }
    }
    layer {
      name: "conv1a/bn"
      type: "BatchNorm"
      bottom: "conv1a"
      top: "conv1a"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "conv1a/relu"
      type: "ReLU"
      bottom: "conv1a"
      top: "conv1a"
    }
    layer {
      name: "conv1b"
      type: "Convolution"
      bottom: "conv1a"
      top: "conv1b"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 32
        bias_term: true
        pad: 1
        kernel_size: 3
        group: 4
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 1
      }
    }
    layer {
      name: "conv1b/bn"
      type: "BatchNorm"
      bottom: "conv1b"
      top: "conv1b"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "conv1b/relu"
      type: "ReLU"
      bottom: "conv1b"
      top: "conv1b"
    }
    layer {
      name: "pool1"
      type: "Pooling"
      bottom: "conv1b"
      top: "pool1"
      pooling_param {
        pool: MAX
        kernel_size: 2
        stride: 2
      }
    }
    layer {
      name: "res2a_branch2a"
      type: "Convolution"
      bottom: "pool1"
      top: "res2a_branch2a"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 64
        bias_term: true
        pad: 1
        kernel_size: 3
        group: 1
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 1
      }
    }
    layer {
      name: "res2a_branch2a/bn"
      type: "BatchNorm"
      bottom: "res2a_branch2a"
      top: "res2a_branch2a"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "res2a_branch2a/relu"
      type: "ReLU"
      bottom: "res2a_branch2a"
      top: "res2a_branch2a"
    }
    layer {
      name: "res2a_branch2b"
      type: "Convolution"
      bottom: "res2a_branch2a"
      top: "res2a_branch2b"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 64
        bias_term: true
        pad: 1
        kernel_size: 3
        group: 4
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 1
      }
    }
    layer {
      name: "res2a_branch2b/bn"
      type: "BatchNorm"
      bottom: "res2a_branch2b"
      top: "res2a_branch2b"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "res2a_branch2b/relu"
      type: "ReLU"
      bottom: "res2a_branch2b"
      top: "res2a_branch2b"
    }
    layer {
      name: "pool2"
      type: "Pooling"
      bottom: "res2a_branch2b"
      top: "pool2"
      pooling_param {
        pool: MAX
        kernel_size: 2
        stride: 2
      }
    }
    layer {
      name: "res3a_branch2a"
      type: "Convolution"
      bottom: "pool2"
      top: "res3a_branch2a"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 128
        bias_term: true
        pad: 1
        kernel_size: 3
        group: 1
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 1
      }
    }
    layer {
      name: "res3a_branch2a/bn"
      type: "BatchNorm"
      bottom: "res3a_branch2a"
      top: "res3a_branch2a"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "res3a_branch2a/relu"
      type: "ReLU"
      bottom: "res3a_branch2a"
      top: "res3a_branch2a"
    }
    layer {
      name: "res3a_branch2b"
      type: "Convolution"
      bottom: "res3a_branch2a"
      top: "res3a_branch2b"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 128
        bias_term: true
        pad: 1
        kernel_size: 3
        group: 4
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 1
      }
    }
    layer {
      name: "res3a_branch2b/bn"
      type: "BatchNorm"
      bottom: "res3a_branch2b"
      top: "res3a_branch2b"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "res3a_branch2b/relu"
      type: "ReLU"
      bottom: "res3a_branch2b"
      top: "res3a_branch2b"
    }
    layer {
      name: "pool3"
      type: "Pooling"
      bottom: "res3a_branch2b"
      top: "pool3"
      pooling_param {
        pool: MAX
        kernel_size: 2
        stride: 2
      }
    }
    layer {
      name: "res4a_branch2a"
      type: "Convolution"
      bottom: "pool3"
      top: "res4a_branch2a"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 256
        bias_term: true
        pad: 1
        kernel_size: 3
        group: 1
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 1
      }
    }
    layer {
      name: "res4a_branch2a/bn"
      type: "BatchNorm"
      bottom: "res4a_branch2a"
      top: "res4a_branch2a"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "res4a_branch2a/relu"
      type: "ReLU"
      bottom: "res4a_branch2a"
      top: "res4a_branch2a"
    }
    layer {
      name: "res4a_branch2b"
      type: "Convolution"
      bottom: "res4a_branch2a"
      top: "res4a_branch2b"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 256
        bias_term: true
        pad: 1
        kernel_size: 3
        group: 4
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 1
      }
    }
    layer {
      name: "res4a_branch2b/bn"
      type: "BatchNorm"
      bottom: "res4a_branch2b"
      top: "res4a_branch2b"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "res4a_branch2b/relu"
      type: "ReLU"
      bottom: "res4a_branch2b"
      top: "res4a_branch2b"
    }
    layer {
      name: "pool4"
      type: "Pooling"
      bottom: "res4a_branch2b"
      top: "pool4"
      pooling_param {
        pool: MAX
        kernel_size: 1
        stride: 1
      }
    }
    layer {
      name: "res5a_branch2a"
      type: "Convolution"
      bottom: "pool4"
      top: "res5a_branch2a"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 512
        bias_term: true
        pad: 2
        kernel_size: 3
        group: 1
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 2
      }
    }
    layer {
      name: "res5a_branch2a/bn"
      type: "BatchNorm"
      bottom: "res5a_branch2a"
      top: "res5a_branch2a"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "res5a_branch2a/relu"
      type: "ReLU"
      bottom: "res5a_branch2a"
      top: "res5a_branch2a"
    }
    layer {
      name: "res5a_branch2b"
      type: "Convolution"
      bottom: "res5a_branch2a"
      top: "res5a_branch2b"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 512
        bias_term: true
        pad: 2
        kernel_size: 3
        group: 4
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 2
      }
    }
    layer {
      name: "res5a_branch2b/bn"
      type: "BatchNorm"
      bottom: "res5a_branch2b"
      top: "res5a_branch2b"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "res5a_branch2b/relu"
      type: "ReLU"
      bottom: "res5a_branch2b"
      top: "res5a_branch2b"
    }
    layer {
      name: "out5a"
      type: "Convolution"
      bottom: "res5a_branch2b"
      top: "out5a"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 64
        bias_term: true
        pad: 4
        kernel_size: 3
        group: 2
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 4
      }
    }
    layer {
      name: "out5a/bn"
      type: "BatchNorm"
      bottom: "out5a"
      top: "out5a"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "out5a/relu"
      type: "ReLU"
      bottom: "out5a"
      top: "out5a"
    }
    layer {
      name: "out5a_up2"
      type: "Deconvolution"
      bottom: "out5a"
      top: "out5a_up2"
      param {
        lr_mult: 0
        decay_mult: 0
      }
      convolution_param {
        num_output: 64
        bias_term: false
        pad: 1
        kernel_size: 4
        group: 64
        stride: 2
        weight_filler {
          type: "bilinear"
        }
      }
    }
    layer {
      name: "out3a"
      type: "Convolution"
      bottom: "res3a_branch2b"
      top: "out3a"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 64
        bias_term: true
        pad: 1
        kernel_size: 3
        group: 2
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 1
      }
    }
    layer {
      name: "out3a/bn"
      type: "BatchNorm"
      bottom: "out3a"
      top: "out3a"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "out3a/relu"
      type: "ReLU"
      bottom: "out3a"
      top: "out3a"
    }
    layer {
      name: "out3_out5_combined"
      type: "Eltwise"
      bottom: "out5a_up2"
      bottom: "out3a"
      top: "out3_out5_combined"
    }
    layer {
      name: "ctx_conv1"
      type: "Convolution"
      bottom: "out3_out5_combined"
      top: "ctx_conv1"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 64
        bias_term: true
        pad: 1
        kernel_size: 3
        group: 1
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 1
      }
    }
    layer {
      name: "ctx_conv1/bn"
      type: "BatchNorm"
      bottom: "ctx_conv1"
      top: "ctx_conv1"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "ctx_conv1/relu"
      type: "ReLU"
      bottom: "ctx_conv1"
      top: "ctx_conv1"
    }
    layer {
      name: "ctx_conv2"
      type: "Convolution"
      bottom: "ctx_conv1"
      top: "ctx_conv2"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 64
        bias_term: true
        pad: 4
        kernel_size: 3
        group: 1
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 4
      }
    }
    layer {
      name: "ctx_conv2/bn"
      type: "BatchNorm"
      bottom: "ctx_conv2"
      top: "ctx_conv2"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "ctx_conv2/relu"
      type: "ReLU"
      bottom: "ctx_conv2"
      top: "ctx_conv2"
    }
    layer {
      name: "ctx_conv3"
      type: "Convolution"
      bottom: "ctx_conv2"
      top: "ctx_conv3"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 64
        bias_term: true
        pad: 4
        kernel_size: 3
        group: 1
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 4
      }
    }
    layer {
      name: "ctx_conv3/bn"
      type: "BatchNorm"
      bottom: "ctx_conv3"
      top: "ctx_conv3"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "ctx_conv3/relu"
      type: "ReLU"
      bottom: "ctx_conv3"
      top: "ctx_conv3"
    }
    layer {
      name: "ctx_conv4"
      type: "Convolution"
      bottom: "ctx_conv3"
      top: "ctx_conv4"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 64
        bias_term: true
        pad: 4
        kernel_size: 3
        group: 1
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 4
      }
    }
    layer {
      name: "ctx_conv4/bn"
      type: "BatchNorm"
      bottom: "ctx_conv4"
      top: "ctx_conv4"
      batch_norm_param {
        moving_average_fraction: 0.99
        eps: 0.0001
        scale_bias: true
      }
    }
    layer {
      name: "ctx_conv4/relu"
      type: "ReLU"
      bottom: "ctx_conv4"
      top: "ctx_conv4"
    }
    layer {
      name: "ctx_final"
      type: "Convolution"
      bottom: "ctx_conv4"
      top: "ctx_final"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 8
        bias_term: true
        pad: 1
        kernel_size: 3
        group: 1
        stride: 1
        weight_filler {
          type: "msra"
        }
        bias_filler {
          type: "constant"
          value: 0
        }
        dilation: 1
      }
    }
    layer {
      name: "ctx_final/relu"
      type: "ReLU"
      bottom: "ctx_final"
      top: "ctx_final"
    }
    layer {
      name: "out_deconv_final_up2"
      type: "Deconvolution"
      bottom: "ctx_final"
      top: "out_deconv_final_up2"
      param {
        lr_mult: 0
        decay_mult: 0
      }
      convolution_param {
        num_output: 8
        bias_term: false
        pad: 1
        kernel_size: 4
        group: 8
        stride: 2
        weight_filler {
          type: "bilinear"
        }
      }
    }
    layer {
      name: "out_deconv_final_up4"
      type: "Deconvolution"
      bottom: "out_deconv_final_up2"
      top: "out_deconv_final_up4"
      param {
        lr_mult: 0
        decay_mult: 0
      }
      convolution_param {
        num_output: 8
        bias_term: false
        pad: 1
        kernel_size: 4
        group: 8
        stride: 2
        weight_filler {
          type: "bilinear"
        }
      }
    }
    layer {
      name: "out_deconv_final_up8"
      type: "Deconvolution"
      bottom: "out_deconv_final_up4"
      top: "out_deconv_final_up8"
      param {
        lr_mult: 0
        decay_mult: 0
      }
      convolution_param {
        num_output: 8
        bias_term: false
        pad: 1
        kernel_size: 4
        group: 8
        stride: 2
        weight_filler {
          type: "bilinear"
        }
      }
    }
    layer {
      name: "argMaxOut"
      type: "ArgMax"
      bottom: "out_deconv_final_up8"
      top: "argMaxOut"
      argmax_param {
        axis: 1
      }
    }
    I0527 15:54:44.231457  3047 net.cpp:110] Using FLOAT as default forward math type
    I0527 15:54:44.231468  3047 net.cpp:116] Using FLOAT as default backward math type
    I0527 15:54:44.231475  3047 layer_factory.hpp:172] Creating layer 'input' of type 'Input'
    I0527 15:54:44.231482  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.231492  3047 net.cpp:200] Created Layer input (0)
    I0527 15:54:44.231496  3047 net.cpp:542] input -> data
    I0527 15:54:44.232815  3047 net.cpp:260] Setting up input
    I0527 15:54:44.232833  3047 net.cpp:267] TEST Top shape for layer 0 'input' 1 3 512 1024 (1572864)
    I0527 15:54:44.232853  3047 layer_factory.hpp:172] Creating layer 'data/bias' of type 'Bias'
    I0527 15:54:44.232861  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.232879  3047 net.cpp:200] Created Layer data/bias (1)
    I0527 15:54:44.232885  3047 net.cpp:572] data/bias <- data
    I0527 15:54:44.232892  3047 net.cpp:542] data/bias -> data/bias
    I0527 15:54:44.234483  3047 net.cpp:260] Setting up data/bias
    I0527 15:54:44.234503  3047 net.cpp:267] TEST Top shape for layer 1 'data/bias' 1 3 512 1024 (1572864)
    I0527 15:54:44.234513  3047 layer_factory.hpp:172] Creating layer 'conv1a' of type 'Convolution'
    I0527 15:54:44.234519  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.234545  3047 net.cpp:200] Created Layer conv1a (2)
    I0527 15:54:44.234551  3047 net.cpp:572] conv1a <- data/bias
    I0527 15:54:44.234557  3047 net.cpp:542] conv1a -> conv1a
    I0527 15:54:44.635936  3047 net.cpp:260] Setting up conv1a
    I0527 15:54:44.635998  3047 net.cpp:267] TEST Top shape for layer 2 'conv1a' 1 32 256 512 (4194304)
    I0527 15:54:44.636023  3047 layer_factory.hpp:172] Creating layer 'conv1a/bn' of type 'BatchNorm'
    I0527 15:54:44.636044  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.636078  3047 net.cpp:200] Created Layer conv1a/bn (3)
    I0527 15:54:44.636085  3047 net.cpp:572] conv1a/bn <- conv1a
    I0527 15:54:44.636095  3047 net.cpp:527] conv1a/bn -> conv1a (in-place)
    I0527 15:54:44.636620  3047 net.cpp:260] Setting up conv1a/bn
    I0527 15:54:44.636636  3047 net.cpp:267] TEST Top shape for layer 3 'conv1a/bn' 1 32 256 512 (4194304)
    I0527 15:54:44.636651  3047 layer_factory.hpp:172] Creating layer 'conv1a/relu' of type 'ReLU'
    I0527 15:54:44.636657  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.636679  3047 net.cpp:200] Created Layer conv1a/relu (4)
    I0527 15:54:44.636685  3047 net.cpp:572] conv1a/relu <- conv1a
    I0527 15:54:44.636690  3047 net.cpp:527] conv1a/relu -> conv1a (in-place)
    I0527 15:54:44.636700  3047 net.cpp:260] Setting up conv1a/relu
    I0527 15:54:44.636705  3047 net.cpp:267] TEST Top shape for layer 4 'conv1a/relu' 1 32 256 512 (4194304)
    I0527 15:54:44.636710  3047 layer_factory.hpp:172] Creating layer 'conv1b' of type 'Convolution'
    I0527 15:54:44.636715  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.636734  3047 net.cpp:200] Created Layer conv1b (5)
    I0527 15:54:44.636739  3047 net.cpp:572] conv1b <- conv1a
    I0527 15:54:44.636744  3047 net.cpp:542] conv1b -> conv1b
    I0527 15:54:44.638417  3047 net.cpp:260] Setting up conv1b
    I0527 15:54:44.638435  3047 net.cpp:267] TEST Top shape for layer 5 'conv1b' 1 32 256 512 (4194304)
    I0527 15:54:44.638447  3047 layer_factory.hpp:172] Creating layer 'conv1b/bn' of type 'BatchNorm'
    I0527 15:54:44.638453  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.638461  3047 net.cpp:200] Created Layer conv1b/bn (6)
    I0527 15:54:44.638466  3047 net.cpp:572] conv1b/bn <- conv1b
    I0527 15:54:44.638471  3047 net.cpp:527] conv1b/bn -> conv1b (in-place)
    I0527 15:54:44.638972  3047 net.cpp:260] Setting up conv1b/bn
    I0527 15:54:44.638988  3047 net.cpp:267] TEST Top shape for layer 6 'conv1b/bn' 1 32 256 512 (4194304)
    I0527 15:54:44.638999  3047 layer_factory.hpp:172] Creating layer 'conv1b/relu' of type 'ReLU'
    I0527 15:54:44.639005  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.639025  3047 net.cpp:200] Created Layer conv1b/relu (7)
    I0527 15:54:44.639029  3047 net.cpp:572] conv1b/relu <- conv1b
    I0527 15:54:44.639035  3047 net.cpp:527] conv1b/relu -> conv1b (in-place)
    I0527 15:54:44.639042  3047 net.cpp:260] Setting up conv1b/relu
    I0527 15:54:44.639048  3047 net.cpp:267] TEST Top shape for layer 7 'conv1b/relu' 1 32 256 512 (4194304)
    I0527 15:54:44.639053  3047 layer_factory.hpp:172] Creating layer 'pool1' of type 'Pooling'
    I0527 15:54:44.639077  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.639089  3047 net.cpp:200] Created Layer pool1 (8)
    I0527 15:54:44.639094  3047 net.cpp:572] pool1 <- conv1b
    I0527 15:54:44.639099  3047 net.cpp:542] pool1 -> pool1
    I0527 15:54:44.639153  3047 net.cpp:260] Setting up pool1
    I0527 15:54:44.639179  3047 net.cpp:267] TEST Top shape for layer 8 'pool1' 1 32 128 256 (1048576)
    I0527 15:54:44.639185  3047 layer_factory.hpp:172] Creating layer 'res2a_branch2a' of type 'Convolution'
    I0527 15:54:44.639190  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.639204  3047 net.cpp:200] Created Layer res2a_branch2a (9)
    I0527 15:54:44.639209  3047 net.cpp:572] res2a_branch2a <- pool1
    I0527 15:54:44.639214  3047 net.cpp:542] res2a_branch2a -> res2a_branch2a
    I0527 15:54:44.643046  3047 net.cpp:260] Setting up res2a_branch2a
    I0527 15:54:44.643066  3047 net.cpp:267] TEST Top shape for layer 9 'res2a_branch2a' 1 64 128 256 (2097152)
    I0527 15:54:44.643079  3047 layer_factory.hpp:172] Creating layer 'res2a_branch2a/bn' of type 'BatchNorm'
    I0527 15:54:44.643085  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.643095  3047 net.cpp:200] Created Layer res2a_branch2a/bn (10)
    I0527 15:54:44.643100  3047 net.cpp:572] res2a_branch2a/bn <- res2a_branch2a
    I0527 15:54:44.643116  3047 net.cpp:527] res2a_branch2a/bn -> res2a_branch2a (in-place)
    I0527 15:54:44.643463  3047 net.cpp:260] Setting up res2a_branch2a/bn
    I0527 15:54:44.643491  3047 net.cpp:267] TEST Top shape for layer 10 'res2a_branch2a/bn' 1 64 128 256 (2097152)
    I0527 15:54:44.643503  3047 layer_factory.hpp:172] Creating layer 'res2a_branch2a/relu' of type 'ReLU'
    I0527 15:54:44.643522  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.643529  3047 net.cpp:200] Created Layer res2a_branch2a/relu (11)
    I0527 15:54:44.643534  3047 net.cpp:572] res2a_branch2a/relu <- res2a_branch2a
    I0527 15:54:44.643539  3047 net.cpp:527] res2a_branch2a/relu -> res2a_branch2a (in-place)
    I0527 15:54:44.643546  3047 net.cpp:260] Setting up res2a_branch2a/relu
    I0527 15:54:44.643553  3047 net.cpp:267] TEST Top shape for layer 11 'res2a_branch2a/relu' 1 64 128 256 (2097152)
    I0527 15:54:44.643558  3047 layer_factory.hpp:172] Creating layer 'res2a_branch2b' of type 'Convolution'
    I0527 15:54:44.643563  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.643575  3047 net.cpp:200] Created Layer res2a_branch2b (12)
    I0527 15:54:44.643580  3047 net.cpp:572] res2a_branch2b <- res2a_branch2a
    I0527 15:54:44.643585  3047 net.cpp:542] res2a_branch2b -> res2a_branch2b
    I0527 15:54:44.646502  3047 net.cpp:260] Setting up res2a_branch2b
    I0527 15:54:44.646522  3047 net.cpp:267] TEST Top shape for layer 12 'res2a_branch2b' 1 64 128 256 (2097152)
    I0527 15:54:44.646533  3047 layer_factory.hpp:172] Creating layer 'res2a_branch2b/bn' of type 'BatchNorm'
    I0527 15:54:44.646551  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.646561  3047 net.cpp:200] Created Layer res2a_branch2b/bn (13)
    I0527 15:54:44.646566  3047 net.cpp:572] res2a_branch2b/bn <- res2a_branch2b
    I0527 15:54:44.646572  3047 net.cpp:527] res2a_branch2b/bn -> res2a_branch2b (in-place)
    I0527 15:54:44.646932  3047 net.cpp:260] Setting up res2a_branch2b/bn
    I0527 15:54:44.646947  3047 net.cpp:267] TEST Top shape for layer 13 'res2a_branch2b/bn' 1 64 128 256 (2097152)
    I0527 15:54:44.646971  3047 layer_factory.hpp:172] Creating layer 'res2a_branch2b/relu' of type 'ReLU'
    I0527 15:54:44.646978  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.646986  3047 net.cpp:200] Created Layer res2a_branch2b/relu (14)
    I0527 15:54:44.646991  3047 net.cpp:572] res2a_branch2b/relu <- res2a_branch2b
    I0527 15:54:44.646996  3047 net.cpp:527] res2a_branch2b/relu -> res2a_branch2b (in-place)
    I0527 15:54:44.647017  3047 net.cpp:260] Setting up res2a_branch2b/relu
    I0527 15:54:44.647023  3047 net.cpp:267] TEST Top shape for layer 14 'res2a_branch2b/relu' 1 64 128 256 (2097152)
    I0527 15:54:44.647028  3047 layer_factory.hpp:172] Creating layer 'pool2' of type 'Pooling'
    I0527 15:54:44.647033  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.647042  3047 net.cpp:200] Created Layer pool2 (15)
    I0527 15:54:44.647047  3047 net.cpp:572] pool2 <- res2a_branch2b
    I0527 15:54:44.647051  3047 net.cpp:542] pool2 -> pool2
    I0527 15:54:44.647100  3047 net.cpp:260] Setting up pool2
    I0527 15:54:44.647109  3047 net.cpp:267] TEST Top shape for layer 15 'pool2' 1 64 64 128 (524288)
    I0527 15:54:44.647114  3047 layer_factory.hpp:172] Creating layer 'res3a_branch2a' of type 'Convolution'
    I0527 15:54:44.647119  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.647133  3047 net.cpp:200] Created Layer res3a_branch2a (16)
    I0527 15:54:44.647138  3047 net.cpp:572] res3a_branch2a <- pool2
    I0527 15:54:44.647143  3047 net.cpp:542] res3a_branch2a -> res3a_branch2a
    I0527 15:54:44.651742  3047 net.cpp:260] Setting up res3a_branch2a
    I0527 15:54:44.651782  3047 net.cpp:267] TEST Top shape for layer 16 'res3a_branch2a' 1 128 64 128 (1048576)
    I0527 15:54:44.651795  3047 layer_factory.hpp:172] Creating layer 'res3a_branch2a/bn' of type 'BatchNorm'
    I0527 15:54:44.651801  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.651827  3047 net.cpp:200] Created Layer res3a_branch2a/bn (17)
    I0527 15:54:44.651834  3047 net.cpp:572] res3a_branch2a/bn <- res3a_branch2a
    I0527 15:54:44.651852  3047 net.cpp:527] res3a_branch2a/bn -> res3a_branch2a (in-place)
    I0527 15:54:44.652199  3047 net.cpp:260] Setting up res3a_branch2a/bn
    I0527 15:54:44.652225  3047 net.cpp:267] TEST Top shape for layer 17 'res3a_branch2a/bn' 1 128 64 128 (1048576)
    I0527 15:54:44.652253  3047 layer_factory.hpp:172] Creating layer 'res3a_branch2a/relu' of type 'ReLU'
    I0527 15:54:44.652259  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.652266  3047 net.cpp:200] Created Layer res3a_branch2a/relu (18)
    I0527 15:54:44.652271  3047 net.cpp:572] res3a_branch2a/relu <- res3a_branch2a
    I0527 15:54:44.652276  3047 net.cpp:527] res3a_branch2a/relu -> res3a_branch2a (in-place)
    I0527 15:54:44.652284  3047 net.cpp:260] Setting up res3a_branch2a/relu
    I0527 15:54:44.652289  3047 net.cpp:267] TEST Top shape for layer 18 'res3a_branch2a/relu' 1 128 64 128 (1048576)
    I0527 15:54:44.652294  3047 layer_factory.hpp:172] Creating layer 'res3a_branch2b' of type 'Convolution'
    I0527 15:54:44.652299  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.652313  3047 net.cpp:200] Created Layer res3a_branch2b (19)
    I0527 15:54:44.652318  3047 net.cpp:572] res3a_branch2b <- res3a_branch2a
    I0527 15:54:44.652323  3047 net.cpp:542] res3a_branch2b -> res3a_branch2b
    I0527 15:54:44.654780  3047 net.cpp:260] Setting up res3a_branch2b
    I0527 15:54:44.654798  3047 net.cpp:267] TEST Top shape for layer 19 'res3a_branch2b' 1 128 64 128 (1048576)
    I0527 15:54:44.654808  3047 layer_factory.hpp:172] Creating layer 'res3a_branch2b/bn' of type 'BatchNorm'
    I0527 15:54:44.654826  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.654836  3047 net.cpp:200] Created Layer res3a_branch2b/bn (20)
    I0527 15:54:44.654841  3047 net.cpp:572] res3a_branch2b/bn <- res3a_branch2b
    I0527 15:54:44.654846  3047 net.cpp:527] res3a_branch2b/bn -> res3a_branch2b (in-place)
    I0527 15:54:44.655151  3047 net.cpp:260] Setting up res3a_branch2b/bn
    I0527 15:54:44.655165  3047 net.cpp:267] TEST Top shape for layer 20 'res3a_branch2b/bn' 1 128 64 128 (1048576)
    I0527 15:54:44.655179  3047 layer_factory.hpp:172] Creating layer 'res3a_branch2b/relu' of type 'ReLU'
    I0527 15:54:44.655184  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.655203  3047 net.cpp:200] Created Layer res3a_branch2b/relu (21)
    I0527 15:54:44.655210  3047 net.cpp:572] res3a_branch2b/relu <- res3a_branch2b
    I0527 15:54:44.655215  3047 net.cpp:527] res3a_branch2b/relu -> res3a_branch2b (in-place)
    I0527 15:54:44.655222  3047 net.cpp:260] Setting up res3a_branch2b/relu
    I0527 15:54:44.655228  3047 net.cpp:267] TEST Top shape for layer 21 'res3a_branch2b/relu' 1 128 64 128 (1048576)
    I0527 15:54:44.655234  3047 layer_factory.hpp:172] Creating layer 'res3a_branch2b_res3a_branch2b/relu_0_split' of type 'Split'
    I0527 15:54:44.655239  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.655246  3047 net.cpp:200] Created Layer res3a_branch2b_res3a_branch2b/relu_0_split (22)
    I0527 15:54:44.655251  3047 net.cpp:572] res3a_branch2b_res3a_branch2b/relu_0_split <- res3a_branch2b
    I0527 15:54:44.655254  3047 net.cpp:542] res3a_branch2b_res3a_branch2b/relu_0_split -> res3a_branch2b_res3a_branch2b/relu_0_split_0
    I0527 15:54:44.655261  3047 net.cpp:542] res3a_branch2b_res3a_branch2b/relu_0_split -> res3a_branch2b_res3a_branch2b/relu_0_split_1
    I0527 15:54:44.655300  3047 net.cpp:260] Setting up res3a_branch2b_res3a_branch2b/relu_0_split
    I0527 15:54:44.655308  3047 net.cpp:267] TEST Top shape for layer 22 'res3a_branch2b_res3a_branch2b/relu_0_split' 1 128 64 128 (1048576)
    I0527 15:54:44.655313  3047 net.cpp:267] TEST Top shape for layer 22 'res3a_branch2b_res3a_branch2b/relu_0_split' 1 128 64 128 (1048576)
    I0527 15:54:44.655318  3047 layer_factory.hpp:172] Creating layer 'pool3' of type 'Pooling'
    I0527 15:54:44.655323  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.655331  3047 net.cpp:200] Created Layer pool3 (23)
    I0527 15:54:44.655336  3047 net.cpp:572] pool3 <- res3a_branch2b_res3a_branch2b/relu_0_split_0
    I0527 15:54:44.655342  3047 net.cpp:542] pool3 -> pool3
    I0527 15:54:44.655387  3047 net.cpp:260] Setting up pool3
    I0527 15:54:44.655395  3047 net.cpp:267] TEST Top shape for layer 23 'pool3' 1 128 32 64 (262144)
    I0527 15:54:44.655400  3047 layer_factory.hpp:172] Creating layer 'res4a_branch2a' of type 'Convolution'
    I0527 15:54:44.655405  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.655418  3047 net.cpp:200] Created Layer res4a_branch2a (24)
    I0527 15:54:44.655423  3047 net.cpp:572] res4a_branch2a <- pool3
    I0527 15:54:44.655429  3047 net.cpp:542] res4a_branch2a -> res4a_branch2a
    I0527 15:54:44.665710  3047 net.cpp:260] Setting up res4a_branch2a
    I0527 15:54:44.665730  3047 net.cpp:267] TEST Top shape for layer 24 'res4a_branch2a' 1 256 32 64 (524288)
    I0527 15:54:44.665741  3047 layer_factory.hpp:172] Creating layer 'res4a_branch2a/bn' of type 'BatchNorm'
    I0527 15:54:44.665760  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.665769  3047 net.cpp:200] Created Layer res4a_branch2a/bn (25)
    I0527 15:54:44.665774  3047 net.cpp:572] res4a_branch2a/bn <- res4a_branch2a
    I0527 15:54:44.665781  3047 net.cpp:527] res4a_branch2a/bn -> res4a_branch2a (in-place)
    I0527 15:54:44.666087  3047 net.cpp:260] Setting up res4a_branch2a/bn
    I0527 15:54:44.666102  3047 net.cpp:267] TEST Top shape for layer 25 'res4a_branch2a/bn' 1 256 32 64 (524288)
    I0527 15:54:44.666115  3047 layer_factory.hpp:172] Creating layer 'res4a_branch2a/relu' of type 'ReLU'
    I0527 15:54:44.666121  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.666127  3047 net.cpp:200] Created Layer res4a_branch2a/relu (26)
    I0527 15:54:44.666131  3047 net.cpp:572] res4a_branch2a/relu <- res4a_branch2a
    I0527 15:54:44.666137  3047 net.cpp:527] res4a_branch2a/relu -> res4a_branch2a (in-place)
    I0527 15:54:44.666144  3047 net.cpp:260] Setting up res4a_branch2a/relu
    I0527 15:54:44.666149  3047 net.cpp:267] TEST Top shape for layer 26 'res4a_branch2a/relu' 1 256 32 64 (524288)
    I0527 15:54:44.666155  3047 layer_factory.hpp:172] Creating layer 'res4a_branch2b' of type 'Convolution'
    I0527 15:54:44.666172  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.666187  3047 net.cpp:200] Created Layer res4a_branch2b (27)
    I0527 15:54:44.666191  3047 net.cpp:572] res4a_branch2b <- res4a_branch2a
    I0527 15:54:44.666198  3047 net.cpp:542] res4a_branch2b -> res4a_branch2b
    I0527 15:54:44.670433  3047 net.cpp:260] Setting up res4a_branch2b
    I0527 15:54:44.670450  3047 net.cpp:267] TEST Top shape for layer 27 'res4a_branch2b' 1 256 32 64 (524288)
    I0527 15:54:44.670459  3047 layer_factory.hpp:172] Creating layer 'res4a_branch2b/bn' of type 'BatchNorm'
    I0527 15:54:44.670464  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.670473  3047 net.cpp:200] Created Layer res4a_branch2b/bn (28)
    I0527 15:54:44.670490  3047 net.cpp:572] res4a_branch2b/bn <- res4a_branch2b
    I0527 15:54:44.670496  3047 net.cpp:527] res4a_branch2b/bn -> res4a_branch2b (in-place)
    I0527 15:54:44.670792  3047 net.cpp:260] Setting up res4a_branch2b/bn
    I0527 15:54:44.670812  3047 net.cpp:267] TEST Top shape for layer 28 'res4a_branch2b/bn' 1 256 32 64 (524288)
    I0527 15:54:44.670825  3047 layer_factory.hpp:172] Creating layer 'res4a_branch2b/relu' of type 'ReLU'
    I0527 15:54:44.670830  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.670836  3047 net.cpp:200] Created Layer res4a_branch2b/relu (29)
    I0527 15:54:44.670841  3047 net.cpp:572] res4a_branch2b/relu <- res4a_branch2b
    I0527 15:54:44.670846  3047 net.cpp:527] res4a_branch2b/relu -> res4a_branch2b (in-place)
    I0527 15:54:44.670855  3047 net.cpp:260] Setting up res4a_branch2b/relu
    I0527 15:54:44.670859  3047 net.cpp:267] TEST Top shape for layer 29 'res4a_branch2b/relu' 1 256 32 64 (524288)
    I0527 15:54:44.670864  3047 layer_factory.hpp:172] Creating layer 'pool4' of type 'Pooling'
    I0527 15:54:44.670869  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.670876  3047 net.cpp:200] Created Layer pool4 (30)
    I0527 15:54:44.670881  3047 net.cpp:572] pool4 <- res4a_branch2b
    I0527 15:54:44.670886  3047 net.cpp:542] pool4 -> pool4
    I0527 15:54:44.670935  3047 net.cpp:260] Setting up pool4
    I0527 15:54:44.670944  3047 net.cpp:267] TEST Top shape for layer 30 'pool4' 1 256 32 64 (524288)
    I0527 15:54:44.670949  3047 layer_factory.hpp:172] Creating layer 'res5a_branch2a' of type 'Convolution'
    I0527 15:54:44.670954  3047 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
    I0527 15:54:44.670969  3047 net.cpp:200] Created Layer res5a_branch2a (31)
    I0527 15:54:44.670974  3047 net.cpp:572] res5a_branch2a <- pool4
    I0527 15:54:44.670979  3047 net.cpp:542] res5a_branch2a -> res5a_branch2a
    F0527 15:54:44.706336  3047 cudnn.hpp:143] Check failed: status == CUDNN_STATUS_SUCCESS (9 vs. 0)  CUDNN_STATUS_NOT_SUPPORTED, device 0
    *** Check failure stack trace: ***
        @     0x7f4d78a51daa  (unknown)
        @     0x7f4d78a51ce4  (unknown)
        @     0x7f4d78a516e6  (unknown)
        @     0x7f4d78a54687  (unknown)
        @     0x7f4d7945bca3  caffe::cudnn::setConvolutionDesc()
        @     0x7f4d7946adfa  caffe::CuDNNConvolutionLayer<>::Reshape()
        @     0x7f4d7975fcd8  caffe::Net::Init()
        @     0x7f4d7976281e  caffe::Net::Net()
        @     0x7f4d7ab110ef  caffe::Net_Init_Load()
        @     0x7f4d7ab315df  boost::python::objects::signature_py_function_impl<>::operator()()
        @     0x7f4d7851962a  (unknown)
        @     0x7f4d78519998  (unknown)
        @     0x7f4d78523c73  (unknown)
        @     0x7f4d785182a3  (unknown)
        @           0x4c2604  (unknown)
        @           0x4d1c5c  (unknown)
        @           0x55f6db  (unknown)
        @           0x5244dd  (unknown)
        @           0x555551  (unknown)
        @           0x525560  (unknown)
        @           0x567d14  (unknown)
        @           0x465bf4  (unknown)
        @           0x46612d  (unknown)
        @           0x466d92  (unknown)
        @     0x7f4ea9a9cf45  (unknown)
        @           0x577c2e  (unknown)
        @              (nil)  (unknown)
    Namespace(batch_size=1, blend=2, class_dict='', crop=['0'], input='test_img/test-img-list.txt', label=None, label_dict='', model='deploy.prototxt', num_classes=None, num_images=2, output='test_img/out', palette='[[0,0,0],[128,64,128],[220,20,60],[250,170,30],[0,0,142],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0],[0,0,0]]', resize=['1024', '512'], resize_back=True, resize_op_to_ip_size=False, search='*.png', weights='cityscapes5_jsegnet21v2_iter_120000.caffemodel')
    Creating palette
    test_img/out
    path test_img/out exists
    ./infer_cityscapes_segmentation.sh: line 34:  3047 Aborted                 (core dumped) ./infer_segmentation.py --crop $crop --resize $resize --model $model --weights $weights --input $input --output $output --num_images $num_images --resize_back --blend 2 --palette="$palette5"
    

  • Hi,

    Now the problem is solved:

    1. Keep only the original images and labels when making the LMDB.

    2. Update cudnn.so and rename it to cudnn.so.5 in /usr/local/lib/.

    I can train/infer JSegNet well now (a quick smoke test is sketched below).

    Thanks for your help!
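
    As a quick smoke test that the update took effect, a minimal sketch that builds the deploy net and runs one forward pass on a dummy input (names taken from the log above; the caffe-jacinto python module is assumed to be on PYTHONPATH):

        import numpy as np
        import caffe

        # build the net as suggested in the deprecation warning in the log,
        # then run a single forward pass on a random 1x3x512x1024 input
        caffe.set_mode_gpu()
        net = caffe.Net('deploy.prototxt', caffe.TEST,
                        weights='cityscapes5_jsegnet21v2_iter_120000.caffemodel')
        net.blobs['data'].data[...] = np.random.rand(1, 3, 512, 1024) * 255
        out = net.forward()
        print('argMaxOut shape: %s' % (out['argMaxOut'].shape,))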

  • Glad to know that it is working. I have two questions:

    1. Which branch of caffe-jacinto are you using? Is it caffe-0.17?

    2. Which version of cudnn worked for you?

    Thanks,

  • 1. caffe-jacinto-0.17
    2. CUDA 8.0.61, cuDNN 6.0.21