Caffe, setting custom weights in layer

Submitted by 混江龙づ霸主 on 2019-12-10 23:18:09

Question


I have a network in which I want to use a concat layer at one point, as in this picture.

Unfortunately, the network doesn't train. To understand why, I want to change the weights feeding into the concat so that, at the beginning of training, all values coming from FC4096 are multiplied by 1 and all values coming from FC16000 are multiplied by 0.

I know that FC4096 alone gives me 57% accuracy, so with a learning rate of 10^-6 I should be able to see why the layers after the concatenation didn't learn.

The question is: how can I set all values coming from FC4096 to 1 and all values coming from FC16000 to 0?


Answer 1:


You can add a "Scale" layer on top of FC16000 and initialize its scale to 0:

layer {
  name: "scale16000"
  type: "Scale"
  bottom: "fc16000"
  top: "fc16000"  # not 100% sure this layer can work in-place; worth trying though
  scale_param {
    bias_term: false
    filler { type: "constant" value: 0 }
  }
  param { lr_mult: 0 decay_mult: 0 }  # set the mults to non-zero if you want this scale to train
}
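The effect of this zero-initialized Scale layer can be sketched outside Caffe with NumPy (a hypothetical illustration; `fc4096` and `fc16000` are stand-in activation arrays, not taken from the actual net): multiplying the FC16000 branch by 0 before concatenation means the concat output initially carries only the FC4096 activations.

```python
import numpy as np

# Stand-in activations for the two branches (hypothetical shapes: batch of 1).
fc4096 = np.random.randn(1, 4096)
fc16000 = np.random.randn(1, 16000)

# The "Scale" layer with a constant-0 filler multiplies every channel by 0.
scale = np.zeros(fc16000.shape[1])
scaled16000 = fc16000 * scale

# Concatenate along the channel axis, as Caffe's "Concat" layer does by default.
concat = np.concatenate([fc4096, scaled16000], axis=1)

# At initialization, the FC16000 half of the concat output is all zeros,
# so downstream layers effectively see only the FC4096 activations.
print(np.allclose(concat[:, 4096:], 0))        # True: the FC16000 part is zeroed
print(np.allclose(concat[:, :4096], fc4096))   # True: the FC4096 part is untouched
```

If `lr_mult` on the Scale layer is later set to a non-zero value, the scale factors can move away from 0 during training, gradually letting the FC16000 branch contribute.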


Source: https://stackoverflow.com/questions/44761118/caffe-setting-custom-weights-in-layer
