Question
I have a network in which, at one point, I want to use a Concat layer, as shown in the picture. Unfortunately, the network doesn't train. To understand why, I want to change the weights going into the concat so that, at the start, all values from FC4096 get 1 and all values from FC16000 get 0.
I know that FC4096 alone gives me 57% accuracy, so with a learning rate of 10^-6 I should be able to see why the layers after the concatenation don't learn.
The question is: how can I set all values from FC4096 to 1 and all values from FC16000 to 0?
Answer 1:
You can add a "Scale" layer on top of fc16000 and initialize it to 0:
layer {
  name: "scale16000"
  type: "Scale"
  bottom: "fc16000"
  top: "fc16000"  # not 100% sure this layer can work in-place; worth trying, though
  scale_param {
    bias_term: false
    filler { type: "constant" value: 0 }
  }
  param { lr_mult: 0 decay_mult: 0 }  # set lr_mult to non-zero if you want this scale to be trained
}
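To see why this achieves what the question asks, here is a minimal numpy sketch of the effect (the blob names and shapes are assumptions for illustration, not taken from the actual network): scaling the fc16000 branch by a constant 0 before the concat means the concatenated blob initially carries only the fc4096 signal.

```python
import numpy as np

# Stand-ins for the two fully-connected outputs feeding the Concat layer
# (batch size 1; shapes are illustrative assumptions).
rng = np.random.default_rng(0)
fc4096 = rng.standard_normal((1, 4096))
fc16000 = rng.standard_normal((1, 16000))

# The Scale layer with a constant filler of 0 multiplies every value
# of the fc16000 branch by 0 before concatenation.
scaled_16000 = 0.0 * fc16000

# Concat along the channel axis, as Caffe's Concat layer does by default.
concat = np.concatenate([fc4096, scaled_16000], axis=1)

assert concat.shape == (1, 4096 + 16000)
assert np.all(concat[:, 4096:] == 0)           # fc16000 contributes nothing
assert np.array_equal(concat[:, :4096], fc4096) # fc4096 passes through unchanged
```

With `lr_mult: 0` the scale stays frozen at 0, so the downstream layers train on the fc4096 signal alone; setting `lr_mult` to a non-zero value later would let the network learn how much of the fc16000 branch to mix back in.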
Source: https://stackoverflow.com/questions/44761118/caffe-setting-custom-weights-in-layer