Can't change activations in existing Keras model

Submitted by 梦想与她 on 2021-02-20 05:44:10

Question


I have a normal VGG16 model with relu activations, i.e.

def VGG_16(weights_path=None):
    model = Sequential()
    model.add(ZeroPadding2D((1, 1),input_shape=(3, 224, 224)))
    model.add(Convolution2D(64, 3, 3, activation='relu'))
    model.add(ZeroPadding2D((1, 1)))
    model.add(Convolution2D(64, 3, 3, activation='relu'))
    model.add(MaxPooling2D((2, 2), strides=(2, 2)))
[...]
    model.add(Flatten())
    model.add(Dense(4096, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(4096, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(1000, activation='softmax'))

    if weights_path:
        model.load_weights(weights_path)

    return model

and I'm instantiating it with existing weights. Now I want to change all relu activations to softmax (not useful, I know):

model = VGG_16('vgg16_weights.h5')
sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)

softmax_act = keras.activations.softmax
for (n, layer) in enumerate(model.layers):
    if 'activation' in layer.get_config() and layer.get_config()['activation'] == 'relu':
        print('replacing #{}: {}, {}'.format(n, layer, layer.activation))
        layer.activation = softmax_act
        print('-> {}'.format(layer.activation))

model.compile(optimizer=sgd, loss='categorical_crossentropy')

Note: model.compile is called after the changes, so the model should still be modifiable, I would think.

However, even though the debug-prints correctly say

replacing #1: <keras.layers.convolutional.Convolution2D object at 0x7f7d7c497f50>, <function relu at 0x7f7dbe699a28>
-> <function softmax at 0x7f7d7c4972d0>
[...]

the actual results are identical to the model with relu activations.
Why doesn't Keras use the changed activation function?


Answer 1:


You might want to use apply_modifications from the third-party keras-vis package, which saves the model to disk and reloads it so the change takes effect:

from keras import activations
from vis.utils import utils

idx_of_layer_to_change = -1
model.layers[idx_of_layer_to_change].activation = activations.softmax
model = utils.apply_modifications(model)



Answer 2:


Setting the activation on a Keras layer alone does not actually change the computation graph, so we need to save the modified model and load it back:

from keras import activations
from keras.models import load_model

model.layers[-1].activation = activations.example  # placeholder: use e.g. activations.softmax
model.save(some_path)
model = load_model(some_path)
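A minimal end-to-end sketch of this save/reload trick, using a small stand-in model rather than the full VGG16 (assumes TensorFlow 2.x with its bundled tf.keras, not the standalone Keras 1 API the question used; the temp-file path is just for illustration):

```python
import os
import tempfile

from tensorflow.keras import Input, activations
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential, load_model

# Small stand-in model; the full VGG16 would work the same way.
model = Sequential([
    Input(shape=(4,)),
    Dense(8, activation='relu'),
    Dense(3, activation='softmax'),
])

# Swap every relu activation, mirroring the loop in the question.
for layer in model.layers:
    if layer.get_config().get('activation') == 'relu':
        layer.activation = activations.sigmoid

# In the original (graph-mode) Keras, mutating the attribute alone was not
# enough. A save/load round trip serializes the updated config and rebuilds
# the model, so the new activation reliably takes effect.
path = os.path.join(tempfile.mkdtemp(), 'model.h5')
model.save(path)
model = load_model(path)

print(model.layers[0].get_config()['activation'])  # -> sigmoid
```

Note that only layers whose serialized config says 'relu' are touched, so the final softmax layer is left alone.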



Answer 3:


The function utils.apply_modifications() did not work for me. It gave me a warning:

WARNING:tensorflow:No training configuration found in save file: the model was not compiled. Compile it manually.

I then recompiled the model and it worked. For illustration, I changed all activations to sigmoid; see the example below.

from tensorflow.keras.activations import relu, sigmoid, elu
from tensorflow.keras.applications.vgg16 import VGG16

base_model = VGG16(weights='imagenet', include_top=False, pooling='avg',
                   input_shape=(100, 100, 3))

# Before the change, base_model.get_config() shows every activation as relu.
for layer in base_model.layers:
    if hasattr(layer, 'activation'):
        layer.activation = sigmoid

# Without compiling you should not see any changes when calling
# base_model.get_config(); after compiling (it forced me to pass a loss),
# the changes show up:
base_model.compile(loss="categorical_crossentropy")
base_model.get_config()


Source: https://stackoverflow.com/questions/43030721/cant-change-activations-in-existing-keras-model
