Question
I am reading Caffe's `Layer` source code and have the following questions:
- What does the `Layer::SetLossWeights` function do? I know that the `Layer` class has a `loss_` member, documented as "The vector that indicates whether each top blob has a non-zero weight in the objective function." Are the two related?
- In the caffe.proto file, the `LayerParameter` field `loss_weight` is only for loss layers; is that correct?
Thanks very much.
Answer 1:
- The purpose of a loss weight is to combine the losses from multiple layers. `Layer::SetLossWeights` assigns the loss weight to the `loss_` vector and writes it into the top blob's `diff`, which `Forward` then uses to compute the total weighted loss.
- By default, layers whose type ends with the suffix `Loss` have a loss weight of 1 and all other layers have 0. However, any layer that is able to backpropagate can be given a non-zero `loss_weight`.

For more information, see the Caffe loss tutorial.
Edit:
A non-default loss weight will only change behavior if the layer's output is fed into another layer that performs backpropagation, which is not what the authors intended; as they note for the `Accuracy` layer in this pull request, that would break. The purpose of the `diff` in a loss layer is to store the loss weight, not the gradient. For more detail, see this discussion in the caffe-users group.
Source: https://stackoverflow.com/questions/43094891/caffe-what-is-setlossweights