Look at this strange load/save model situation. I saved a variational autoencoder model together with its encoder and decoder:
autoencoder.save("autoencoder_save", overwrite=True)
encoder.save("encoder_save", overwrite=True)
decoder.save("decoder_save", overwrite=True)
After that I loaded all three of them back from disk:
autoencoder_disk = load_model("autoencoder_save", custom_objects={'KLDivergenceLayer': KLDivergenceLayer, 'nll': nll})
encoder_disk = load_model("encoder_save", custom_objects={'KLDivergenceLayer': KLDivergenceLayer, 'nll': nll})
decoder_disk = load_model("decoder_save", custom_objects={'KLDivergenceLayer': KLDivergenceLayer, 'nll': nll})
If I try
x_test_encoded = encoder_disk.predict(x_test, batch_size=batch_size)
x_test_decoded = decoder_disk.predict(x_test_encoded)
print(np.round(x_test_decoded[3]))
everything works just fine, exactly as with the encoder/decoder still in memory. But if I try
vae = autoencoder_disk.predict(x_test_encoded)
I get
ValueError: Error when checking model : the list of Numpy arrays that you are passing to your model is not the size the model expected. Expected to see 2 array(s) but instead got the following list of 1 arrays:...
although I can predict with the in-memory variational autoencoder without any problem. Why does the autoencoder stop working once it is loaded from disk?
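For context, the model follows the usual KLDivergenceLayer/nll VAE pattern, where the full autoencoder is a two-input model (the data x plus a noise input eps for the reparameterisation trick). The sketch below is only an approximation of that setup; the layer sizes (original_dim, intermediate_dim, latent_dim) and the exact decoder architecture are placeholders, not my real values:

# Rough sketch of the assumed VAE construction (KLDivergenceLayer / nll pattern).
# Sizes below are illustrative placeholders.
import numpy as np
from keras import backend as K
from keras.layers import Input, Dense, Lambda, Layer, Add, Multiply
from keras.models import Model, Sequential

original_dim, intermediate_dim, latent_dim = 784, 256, 2  # assumed dimensions

def nll(y_true, y_pred):
    # Bernoulli negative log-likelihood used as the reconstruction loss
    return K.sum(K.binary_crossentropy(y_true, y_pred), axis=-1)

class KLDivergenceLayer(Layer):
    # Identity layer that adds the KL divergence term to the model loss
    def __init__(self, *args, **kwargs):
        self.is_placeholder = True
        super(KLDivergenceLayer, self).__init__(*args, **kwargs)

    def call(self, inputs):
        mu, log_var = inputs
        kl = -0.5 * K.sum(1 + log_var - K.square(mu) - K.exp(log_var), axis=-1)
        self.add_loss(K.mean(kl), inputs=inputs)
        return inputs

# Encoder
x = Input(shape=(original_dim,))
h = Dense(intermediate_dim, activation='relu')(x)
z_mu = Dense(latent_dim)(h)
z_log_var = Dense(latent_dim)(h)
z_mu, z_log_var = KLDivergenceLayer()([z_mu, z_log_var])
z_sigma = Lambda(lambda t: K.exp(0.5 * t))(z_log_var)

# Reparameterisation trick: the noise enters through a second Input
eps = Input(tensor=K.random_normal(shape=(K.shape(x)[0], latent_dim)))
z = Add()([z_mu, Multiply()([z_sigma, eps])])

# Decoder
decoder = Sequential([
    Dense(intermediate_dim, input_dim=latent_dim, activation='relu'),
    Dense(original_dim, activation='sigmoid'),
])
x_pred = decoder(z)

# The full autoencoder has two inputs, [x, eps] -- this is what the
# "Expected to see 2 array(s)" part of the error refers to.
autoencoder = Model(inputs=[x, eps], outputs=x_pred)
autoencoder.compile(optimizer='rmsprop', loss=nll)
encoder = Model(x, z_mu)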