DNNDK Vehicle Feature Extraction


I. prototxt

car.prototxt: the original training file (A)

deploy.prototxt: the GoogLeNet Caffe example file (B)

Inception_v1_float.prototxt: the Inception v1 DNNDK example file (C)

 

1. The data layer (the three files differ here; a sketch of adapting A follows listing C):

A:
layer {
  name: "data"
  type: "MemoryData"
  top: "data"
  top: "label"
  memory_data_param {
    batch_size: 1
    channels: 3
    height: 224
    width: 224
  }
  transform_param {
    crop_size: 224
    mirror: false
  }
}

B:
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param { shape: { dim: 10 dim: 3 dim: 224 dim: 224 } }
}

C:
layer {
  name: "data"
  type: "ImageData"
  top: "data"
  top: "label"
  include {
    phase: TRAIN
  }
  transform_param {
    mirror: false
    mean_value: 104
    mean_value: 117
    mean_value: 123
  }
  image_data_param {
    source: "./data/imagenet_256/calibration.txt"
    root_folder: "./data/imagenet_256/calibration_images/"
    batch_size: 10
    shuffle: false
    new_height: 224
    new_width: 224
  }
}
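
To quantize the vehicle model with DNNDK, A's MemoryData input has to be replaced by a calibration ImageData layer in the style of C. A minimal sketch of such a layer, assuming a calibration list prepared for the vehicle data; the ./data/car/... paths are placeholders, and mean_value is omitted because A's transform_param defines none:

layer {
  name: "data"
  type: "ImageData"
  top: "data"
  top: "label"
  include {
    phase: TRAIN
  }
  transform_param {
    mirror: false          # no mean_value, matching A's transform_param
  }
  image_data_param {
    source: "./data/car/calibration.txt"            # placeholder calibration list
    root_folder: "./data/car/calibration_images/"   # placeholder image folder
    batch_size: 10
    shuffle: false
    new_height: 224                                 # same input size as A
    new_width: 224
  }
}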

 

2. The std attribute of weight_filler in convolution_param (absent in A, present in B):
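
For illustration, the difference looks like this (the std value shown is illustrative). Note that filler parameters only matter when weights are initialized for training, and Caffe's xavier filler ignores std anyway (only the gaussian filler reads it), so this difference does not affect inference:

# B (deploy.prototxt):
weight_filler {
  type: "xavier"
  std: 0.1        # illustrative value; ignored by the xavier filler
}

# A (car.prototxt):
weight_filler {
  type: "xavier"
}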

 

3. Layers after inception_4a/output (A has several more layers here than B):

layer {
  name: "loss1/ave_pool"
  type: "Pooling"
  bottom: "inception_4a/output"
  top: "loss1/ave_pool"
  pooling_param {
    pool: AVE
    kernel_size: 5
    stride: 3
  }
}
layer {
  name: "loss1/conv"
  type: "Convolution"
  bottom: "loss1/ave_pool"
  top: "loss1/conv"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 128
    kernel_size: 1
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
      value: 0.2
    }
  }
}
layer {
  name: "loss1/relu_conv"
  type: "ReLU"
  bottom: "loss1/conv"
  top: "loss1/conv"
}
layer {
  name: "loss1/fc"
  type: "InnerProduct"
  bottom: "loss1/conv"
  top: "loss1/fc"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  inner_product_param {
    num_output: 1024
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
      value: 0.2
    }
  }
}
layer {
  name: "loss1/relu_fc"
  type: "ReLU"
  bottom: "loss1/fc"
  top: "loss1/fc"
}
layer {
  name: "loss1/drop_fc"
  type: "Dropout"
  bottom: "loss1/fc"
  top: "loss1/fc"
  dropout_param {
    dropout_ratio: 0.7
  }
}
layer {
  name: "model_loss1/classifier"
  type: "InnerProduct"
  bottom: "loss1/fc"
  top: "model_loss1/classifier"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  inner_product_param {
    num_output: 1232
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
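
These are GoogLeNet's first auxiliary-classifier layers (loss1/*), which the standard deploy.prototxt drops; A keeps them and repurposes the branch as the vehicle head, ending in model_loss1/classifier with num_output: 1232 (presumably the number of vehicle classes). For deployment the branch would typically be capped with a plain Softmax; a minimal sketch (the layer name "prob" is an assumption, not taken from car.prototxt):

layer {
  name: "prob"
  type: "Softmax"
  bottom: "model_loss1/classifier"
  top: "prob"
}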

 

4.
