Is there a simple way to define different learning rates for weights and biases in TensorFlow using the Adam optimizer?


In Caffe, we can define two learning rates for each layer: one for the weights and one for the bias:

layer {
  name: "conv1"
  type: "Convolution"
  param { lr_mult: 1 }  # learning-rate multiplier for the weights
  param { lr_mult: 2 }  # learning-rate multiplier for the bias
}


        