Transfer learning from the pre-trained NASnet network. How to know the number of layers to freeze?

Submitted by 此生再无相见时 on 2019-12-01 12:39:32

Considering where the auxiliary branch begins, I'd try freezing the layers before activation_166. Something like this:

# Build NASNetLarge with the auxiliary branch, then load the pre-trained
# weights by layer name, skipping any layer whose shape does not match
# (e.g. the new classification head).
model = NASNetLarge((img_rows, img_cols, img_channels), dropout=0.5,
                    use_auxiliary_branch=True, include_top=True,
                    weights=None, classes=nb_classes)

model.load_weights('weights/NASNet-large.h5', by_name=True, skip_mismatch=True)

# Freeze the original layers up to activation_166; everything from that
# layer onward stays trainable.
model.trainable = True
set_trainable = False
for layer in model.layers:
    if layer.name == 'activation_166':
        set_trainable = True
    layer.trainable = set_trainable
    print("layer {} is {}".format(layer.name, '+++trainable' if layer.trainable else '---frozen'))
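The freeze-up-to-a-boundary-layer pattern above is independent of NASNet itself. Here is a minimal, self-contained sketch of the same loop using stand-in layer objects so it runs without Keras; the layer names are illustrative, not taken from the real NASNet graph. With a real model you would iterate `model.layers` the same way.

```python
# Stand-in for a Keras layer: just a name and a trainable flag.
class Layer:
    def __init__(self, name):
        self.name = name
        self.trainable = True

# A toy "model" whose layers bracket the chosen boundary layer.
model_layers = [Layer('activation_{}'.format(i)) for i in (164, 165, 166, 167)]

# Same pattern as in the answer: freeze everything until the boundary
# layer is reached, then leave the rest trainable.
set_trainable = False
for layer in model_layers:
    if layer.name == 'activation_166':   # boundary layer
        set_trainable = True
    layer.trainable = set_trainable

frozen = [l.name for l in model_layers if not l.trainable]
print(frozen)  # layers before the boundary stay frozen
```

Remember that changes to `layer.trainable` only take effect on the next `model.compile(...)`, so compile (or re-compile) after this loop before training.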