Keras BatchNormalization population parameters update while training in tensorflow

Submitted by 依然范特西╮ on 2019-12-11 07:05:43

Question


I am using Keras 2.0.8 with Tensorflow 1.3.0 in Ubuntu 16.04 with Cuda 8.0 and cuDNN 6.

I am using two BatchNormalization layers (Keras layers) in my model, and I am training it with a TensorFlow pipeline.

I am facing two problems here:

  1. The BatchNorm layers' population parameters (moving mean and variance) are not being updated while training, even after setting K.learning_phase to True. As a result, inference fails completely. I need some advice on how to update these parameters manually between training steps.
  2. Secondly, after saving the trained model using a TensorFlow saver op, the results cannot be reproduced when I load it again. It seems the weights are changing. Is there a way to keep the weights the same across a save/load cycle?
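To see why inference fails when the moving statistics are never updated: the layer then normalizes with its initial values (mean 0, variance 1), which is wrong for any input that is not already standardized. A minimal NumPy sketch of the failure mode (illustrative numbers only, not Keras code):

```python
import numpy as np

def batchnorm_inference(x, moving_mean, moving_var, eps=1e-3):
    """Normalize x with the stored population statistics."""
    return (x - moving_mean) / np.sqrt(moving_var + eps)

x = np.array([10.0, 12.0, 14.0])  # data far from zero mean / unit variance

# Stale statistics: the layer's initial values, never updated during training.
stale = batchnorm_inference(x, moving_mean=0.0, moving_var=1.0)

# Statistics that actually match the training data.
good = batchnorm_inference(x, moving_mean=12.0, moving_var=np.var(x))

print(stale)  # nowhere near zero-centered
print(good)   # roughly zero-centered, as the next layers expect
```

With stale statistics the "normalized" output is essentially the raw input, so every layer downstream of the BatchNorm sees activations far outside the range it was trained on.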

Answer 1:


I ran into the same problem a few weeks ago. Internally, Keras layers can add extra update operations to a model (e.g. batchnorm), and you need to run these ops explicitly. For batchnorm, the updates are just assign ops that blend the current batch mean/variance into the moving averages. If you do not create a Keras model, the following works; assuming x is the tensor you want to normalize:

bn = keras.layers.BatchNormalization()
x = bn(x)

# ...
# bn.updates is a list of ops, so concatenate it with the training op,
# and pass the learning phase through feed_dict.
sess.run([minimizer_op] + bn.updates,
         feed_dict={K.learning_phase(): 1})
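For reference, what those assign ops compute is a plain exponential moving average of the batch statistics. The sketch below mirrors Keras' momentum parameter, but it is standalone NumPy for illustration, not the layer's actual implementation:

```python
import numpy as np

def update_moving_stats(moving_mean, moving_var, batch, momentum=0.99):
    """One batchnorm update step: blend the current batch statistics
    into the moving averages, as the assign ops in bn.updates do."""
    new_mean = momentum * moving_mean + (1.0 - momentum) * batch.mean()
    new_var = momentum * moving_var + (1.0 - momentum) * batch.var()
    return new_mean, new_var

mean, var = 0.0, 1.0  # the layer's initial values
rng = np.random.default_rng(0)
for _ in range(1000):  # simulate many training steps
    batch = rng.normal(loc=5.0, scale=2.0, size=64)
    mean, var = update_moving_stats(mean, var, batch)

print(mean, var)  # drifts toward the data statistics (mean 5, variance 4)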

In my workflow, I create a Keras model (without compiling it) and then run the following:

model = keras.Model(inputs=inputs, outputs=prediction)
# model.updates collects the update ops of every layer in the model.
sess.run([minimizer_op] + model.updates,
         feed_dict={K.learning_phase(): 1})

where inputs can be something like

inputs = [keras.layers.Input(tensor=input_variables)]

and outputs is a list of TensorFlow tensors. The model seems to aggregate all additional update operations between inputs and outputs automatically.



Source: https://stackoverflow.com/questions/46598341/keras-batchnormalization-population-parameters-update-while-training-in-tensorfl
