How to load a model from an HDF5 file in Keras?


load_weights only sets the weights of your network. You still need to define its architecture before calling load_weights:

from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, BatchNormalization, LeakyReLU
from keras.optimizers import SGD
from keras.callbacks import ModelCheckpoint

def create_model():
    model = Sequential()
    model.add(Dense(64, input_dim=14, kernel_initializer='uniform'))
    model.add(LeakyReLU(alpha=0.3))
    model.add(BatchNormalization(epsilon=1e-06, momentum=0.9))
    model.add(Dropout(0.5))
    model.add(Dense(64, kernel_initializer='uniform'))
    model.add(LeakyReLU(alpha=0.3))
    model.add(BatchNormalization(epsilon=1e-06, momentum=0.9))
    model.add(Dropout(0.5))
    model.add(Dense(2, kernel_initializer='uniform'))
    model.add(Activation('softmax'))
    return model

def train():
    model = create_model()
    sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
    model.compile(loss='binary_crossentropy', optimizer=sgd, metrics=['accuracy'])

    # save the best weights seen during training to an HDF5 file
    checkpointer = ModelCheckpoint(filepath="/tmp/weights.hdf5", verbose=1, save_best_only=True)
    model.fit(X_train, y_train, epochs=20, batch_size=16, validation_split=0.2, verbose=2, callbacks=[checkpointer])

def load_trained_model(weights_path):
    model = create_model()
    model.load_weights(weights_path)
    return model
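
With the functions above, a minimal usage sketch might look like the following (X_test is a placeholder for your own data; recompiling is only needed if you want loss/metrics, prediction alone works without it):

# hypothetical usage of the helpers above; the checkpoint path matches train()
model = load_trained_model("/tmp/weights.hdf5")

# recompile if you want to call evaluate(); predict() works without compiling
sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='binary_crossentropy', optimizer=sgd, metrics=['accuracy'])

predictions = model.predict(X_test)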

If you stored the complete model, not only the weights, in the HDF5 file, then loading it is as simple as:

from keras.models import load_model
model = load_model('model.h5')
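
Because load_model restores the architecture, the weights, and the compile configuration in one step, the returned model can be used immediately. A minimal sketch, assuming X_test and Y_test are placeholder arrays you have already defined:

# no extra compile step is needed; X_test / Y_test are placeholders
predictions = model.predict(X_test)
score = model.evaluate(X_test, Y_test, verbose=0)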

See the following sample code on how to build a basic Keras neural net model, save the model architecture (JSON) and weights (HDF5), and load them back:

import json
from keras.models import Sequential, model_from_json
from keras.layers import Dense

# create model
model = Sequential()
model.add(Dense(X.shape[1], input_dim=X.shape[1], activation='relu')) #Input Layer
model.add(Dense(X.shape[1], activation='relu')) #Hidden Layer
model.add(Dense(output_dim, activation='softmax')) #Output Layer

# Compile & Fit model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X, Y, epochs=5, batch_size=100, verbose=1)

# serialize model to JSON
model_json = model.to_json()
with open("Data/model.json", "w") as json_file:
    json_file.write(json.dumps(json.loads(model_json), indent=4))

# serialize weights to HDF5
model.save_weights("Data/model.h5")
print("Saved model to disk")

# load json and create model
json_file = open('Data/model.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)

# load weights into new model
loaded_model.load_weights("Data/model.h5")
print("Loaded model from disk")

# evaluate loaded model on test data 
# Define X_test & Y_test data first
loaded_model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
score = loaded_model.evaluate(X_test, Y_test, verbose=0)
print("%s: %.2f%%" % (loaded_model.metrics_names[1], score[1]*100))

According to the official documentation (https://keras.io/getting-started/faq/#how-can-i-install-hdf5-or-h5py-to-save-my-models-in-keras), you can do the following:

First, test whether you have h5py installed by running:

import h5py

If importing h5py does not raise an error, you are ready to save:

from keras.models import load_model

model.save('my_model.h5')  # creates an HDF5 file 'my_model.h5'
del model  # deletes the existing model

# returns a compiled model
# identical to the previous one
model = load_model('my_model.h5')
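
If you want to confirm the round trip worked, one way (a sketch; x is a placeholder NumPy batch shaped like the model's input) is to compare predictions made before saving with predictions from the reloaded model:

import numpy as np
from keras.models import load_model

# x is a hypothetical input batch with the shape the model expects
predictions_before = model.predict(x)

model.save('my_model.h5')
reloaded_model = load_model('my_model.h5')

# the reloaded model should reproduce the original predictions
np.testing.assert_allclose(predictions_before, reloaded_model.predict(x), atol=1e-6)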

If you need to install h5py, see http://docs.h5py.org/en/latest/build.html
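
As a quick sanity check (a sketch; installing with pip is the usual route), you can guard the import and report whether HDF5 saving will work:

try:
    import h5py
    print("h5py is installed, version:", h5py.__version__)
except ImportError:
    # without h5py, model.save() / save_weights() to .h5 files will fail
    print("h5py is missing; install it with `pip install h5py`")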

I did it this way:

from keras.models import Sequential, load_model
from keras_contrib.layers import CRF
from keras_contrib.losses import crf_loss
from keras_contrib.metrics import crf_viterbi_accuracy

# To save model
model.save('my_model_01.hdf5')

# Map the names stored in the file to the custom objects they refer to
custom_objects = {'CRF': CRF, 'crf_loss': crf_loss, 'crf_viterbi_accuracy': crf_viterbi_accuracy}

# To load a persisted model that uses the CRF layer 
model1 = load_model("/home/abc/my_model_01.hdf5", custom_objects = custom_objects)
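
The same custom_objects mechanism applies to any custom layer, loss, metric, or activation: load_model only knows built-in Keras objects by name, so anything else must be mapped explicitly. A minimal sketch with a hypothetical custom activation (the function name and file name are made up for illustration):

from keras import backend as K
from keras.models import load_model

# hypothetical custom activation that was used when the model was built
def swish(x):
    return x * K.sigmoid(x)

# map the name stored in the HDF5 file back to the Python object
model = load_model('model_with_custom_activation.h5', custom_objects={'swish': swish})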