Keras VGG16 fine-tuning

时光说笑 2021-01-18 01:19

There is an example of VGG16 fine-tuning on the Keras blog, but I can't reproduce it.

More precisely, here is the code used to initialize VGG16 without the top layer and to freeze a

3 Answers
  •  天命终不由人
    2021-01-18 02:04

    I think you can concatenate both by doing something like this:

    # imports needed for this snippet
    from keras import applications
    from keras.models import Sequential, Model
    from keras.layers import Flatten, Dense, Dropout

    # load the VGG16 convolutional base (no classifier head)
    vgg_model = applications.VGG16(weights='imagenet', include_top=False, input_shape=(150, 150, 3))
    print('Model loaded.')
    
    # initialise the top model
    top_model = Sequential()
    top_model.add(Flatten(input_shape=vgg_model.output_shape[1:]))
    top_model.add(Dense(256, activation='relu'))
    top_model.add(Dropout(0.5))
    top_model.add(Dense(1, activation='sigmoid'))
    
    # top_model_weights_path should point to classifier weights you trained beforehand
    top_model.load_weights(top_model_weights_path)
    
    # add the top model on top of the convolutional base
    # (Keras 2+ uses the plural keyword arguments `inputs`/`outputs`)
    model = Model(inputs=vgg_model.input, outputs=top_model(vgg_model.output))
    

    This solution refers to the example Fine-tuning the top layers of a pre-trained network. Full code can be found here.
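
    To actually fine-tune rather than merely stack the two models, the convolutional base (or most of it) should be frozen before compiling. A minimal sketch, assuming Keras 2+ and rebuilding the classifier head with the functional API; the layer index 15 (start of VGG16's last conv block) and the optimizer settings are assumptions following the Keras blog example, not part of the answer above:

    ```python
    from keras import applications, optimizers
    from keras.models import Model
    from keras.layers import Flatten, Dense, Dropout

    # convolutional base; weights=None here only to skip the ImageNet download
    vgg_model = applications.VGG16(weights=None, include_top=False,
                                   input_shape=(150, 150, 3))

    # rebuild the classifier head on top of the base (functional style)
    x = Flatten()(vgg_model.output)
    x = Dense(256, activation='relu')(x)
    x = Dropout(0.5)(x)
    out = Dense(1, activation='sigmoid')(x)
    model = Model(inputs=vgg_model.input, outputs=out)

    # freeze everything before the last conv block; index 15 is where
    # block5_conv1 sits in this VGG16 layout (an assumption, verify with
    # model.summary() for your version)
    for layer in vgg_model.layers[:15]:
        layer.trainable = False

    # compile with a small learning rate so fine-tuning stays gentle
    model.compile(loss='binary_crossentropy',
                  optimizer=optimizers.SGD(learning_rate=1e-4, momentum=0.9),
                  metrics=['accuracy'])
    ```

    After this, only the last conv block and the new head receive gradient updates, which is what keeps the pre-trained features from being destroyed early in training.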
