Stopping gradient back-prop through a particular layer in Keras

Submitted by 元气小坏坏 on 2019-12-24 07:47:12

Question


from keras.layers import Conv2D

# base_layers, num_anchors and trainable are defined in the surrounding
# Faster R-CNN code (not shown here)
x = Conv2D(768, (3, 3), padding='same', activation='relu', kernel_initializer='normal',
           name='rpn_conv1', trainable=trainable)(base_layers)

x_class = Conv2D(num_anchors, (1, 1), activation='sigmoid', kernel_initializer='uniform',
                 name='rpn_out_class', trainable=trainable)(x)

# stop gradient backflow through regression layer
x_regr = Conv2D(num_anchors * 4, (1, 1), activation='linear', kernel_initializer='zero',
                name='rpn_out_regress', trainable=trainable)(x)

How can K.stop_gradient() be used to stop gradient back-prop through the regression layer (x_regr) alone?


Answer 1:


You need a Lambda layer to apply a custom function such as K.stop_gradient inside a Keras model.

from keras.layers import Lambda
from keras import backend as K

# Wrap K.stop_gradient in a Lambda layer; no gradient flows back
# through this layer's output. The output_shape argument is only
# required with the Theano backend; TensorFlow infers it.
x_regr_constant = Lambda(lambda t: K.stop_gradient(t))(x_regr)
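
Below is a minimal, self-contained sketch (written against tf.keras; the toy shapes and layer names such as shared_conv are invented for the demo) that checks the Lambda really blocks back-prop: because the stopped tensor is the model output, no trainable variable receives a gradient.

import tensorflow as tf
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Input, Conv2D, Lambda
from tensorflow.keras.models import Model

# Toy stand-in for the RPN head
inp = Input(shape=(8, 8, 3))
x = Conv2D(4, (3, 3), padding='same', name='shared_conv')(inp)
x_regr = Conv2D(2, (1, 1), name='rpn_out_regress')(x)
x_regr_constant = Lambda(lambda t: K.stop_gradient(t))(x_regr)
model = Model(inp, x_regr_constant)

with tf.GradientTape() as tape:
    loss = tf.reduce_sum(model(tf.random.normal((1, 8, 8, 3))))

# Every gradient is None: the stopped output blocks all backflow
grads = tape.gradient(loss, model.trainable_variables)
print([g is None for g in grads])   # [True, True, True, True]

Note that placing the Lambda on the output of x_regr also stops the gradient to the regression conv's own weights. If the intent is to keep training rpn_out_regress while only shielding the shared layers, one option is to apply the Lambda to x before the regression conv instead, e.g. x_regr = Conv2D(...)(Lambda(lambda t: K.stop_gradient(t))(x)).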


Source: https://stackoverflow.com/questions/47546100/stopping-gradient-back-prop-through-a-particular-layer-in-keras
