Add Dropout after loading the weights in Keras

Submitted by 余生长醉 on 2021-02-07 04:37:50

Question


I am doing a kind of transfer learning. First I trained the model on a big dataset and saved the weights. Then I trained the model on my own dataset with the layers frozen, but I saw some overfitting. So I want to change the model's dropout rate and then load the saved weights, since the results change as the dropout rate changes. I am finding it difficult to change the dropout.

Directly my question is, Is it possible to change the model's dropout while loading the weights?

My scenario 1 is like this:

  1. Define the model.
  2. Train the model.
  3. Save the weights.
  4. ...
  5. Redefine the dropout; nothing else in the model changes.
  6. Load the weights. I get an error.
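The steps above can be sketched as follows (a minimal toy model, assuming tf.keras; the layer sizes are made up for illustration, and the two definitions differ only in the dropout rate). Since Dropout layers carry no weights, loading the saved weights into the redefined model should normally succeed:

```python
from tensorflow import keras

def build_model(rate):
    # Identical architecture each time; only the dropout rate differs.
    inputs = keras.Input(shape=(8,))
    x = keras.layers.Dense(16, activation="relu")(inputs)
    x = keras.layers.Dropout(rate)(x)
    outputs = keras.layers.Dense(4, activation="softmax")(x)
    return keras.Model(inputs, outputs)

model = build_model(rate=0.2)
model.save_weights("model.weights.h5")

# Redefine with a different dropout rate; Dropout holds no weights,
# so the saved weight file still matches the new model layer for layer.
model2 = build_model(rate=0.5)
model2.load_weights("model.weights.h5")
```

If this variant errors out, it usually means something other than the dropout rate differs between the two definitions.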

2nd scenario:

  1. Define model1.
  2. Train the model.
  3. Save the weights.
  4. Load the weights into model1.
  5. ...
  6. Define model2 with the dropouts changed.
  7. Try to set model1's weights on model2 in a for loop, skipping the dropout layers. I get an error.

This is the error I get:

 File "/home/sathiyakugan/PycharmProjects/internal-apps/apps/support-tools/EscalationApp/LSTM_Attention_IMDB_New_open.py", line 343, in <module>
    NewModel.layers[i].set_weights(layer.get_weights())
  File "/home/sathiyakugan/PycharmProjects/Python/venv/lib/python3.5/site-packages/keras/engine/base_layer.py", line 1062, in set_weights
    str(weights)[:50] + '...')
ValueError: You called `set_weights(weights)` on layer "lstm_5" with a  weight list of length 1, but the layer was expecting 3 weights. Provided weights: [array([[ 0.      ,  0.      ,  0.      , ...,  0....

What is the right way to do this? Since I am new to Keras, I am struggling to get further.


Answer 1:


I recommend loading the weights with model.load_weights("weights_file.h5") and then trying the following:

for layer in model.layers:
    if hasattr(layer, 'rate'):
        layer.rate = 0.5

Since only the Dropout layers have the attribute rate, when you find a layer with this attribute you can change it. Here I use 0.5 as the dropout probability; put whatever value you want.

Edit: if you are setting the weights layer by layer, you can add the above if statement inside your for loop over the layers.
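Setting the weights layer by layer, as in the question's second scenario, can be sketched like this (assuming tf.keras; copy_weights is a hypothetical helper, not a Keras API). Pairing up only the layers that actually carry weights keeps the positional indices of the two models aligned, which is the likely cause of the "expecting 3 weights" error above:

```python
from tensorflow import keras

def copy_weights(src_model, dst_model):
    # Pair up only the layers that carry weights (Dropout carries none),
    # so positional indices stay aligned between the two models.
    src_layers = [l for l in src_model.layers if l.weights]
    dst_layers = [l for l in dst_model.layers if l.weights]
    assert len(src_layers) == len(dst_layers)
    for src, dst in zip(src_layers, dst_layers):
        dst.set_weights(src.get_weights())

def make(rate):
    # Toy architecture for illustration; only the dropout rate differs.
    inputs = keras.Input(shape=(4,))
    x = keras.layers.Dense(8, activation="relu")(inputs)
    x = keras.layers.Dropout(rate)(x)
    return keras.Model(inputs, keras.layers.Dense(2)(x))

model1 = make(0.2)
model2 = make(0.5)
copy_weights(model1, model2)
```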

IMPORTANT: after this you have to compile the model again:

from keras.optimizers import SGD
model.compile(optimizer=SGD(lr=1e-3, momentum=0.9), loss='categorical_crossentropy', metrics=['accuracy'])

Again, the parameters passed here are just examples, so change them according to your problem.
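If changing layer.rate in place does not seem to take effect (on some Keras versions and execution modes the rate is baked into the already-built graph), an alternative is to rebuild the model from its config with the rates overridden and transfer the weights wholesale. A sketch, assuming tf.keras; with_new_dropout is a hypothetical helper, not a Keras API:

```python
from tensorflow import keras

def with_new_dropout(model, rate):
    # Rewrite every Dropout rate in the serialized config, then rebuild.
    config = model.get_config()
    for layer_conf in config["layers"]:
        if layer_conf["class_name"] == "Dropout":
            layer_conf["config"]["rate"] = rate
    new_model = model.__class__.from_config(config)
    # Dropout layers hold no weights, so the flat weight lists line up.
    new_model.set_weights(model.get_weights())
    return new_model

# Toy model for illustration.
inputs = keras.Input(shape=(8,))
x = keras.layers.Dropout(0.2)(keras.layers.Dense(16)(inputs))
model = keras.Model(inputs, keras.layers.Dense(4)(x))

model2 = with_new_dropout(model, 0.5)
```

After the rebuild, compile model2 as shown above before training or evaluating.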



Source: https://stackoverflow.com/questions/52200599/add-dropout-after-loading-the-weights-in-keras
