I'm getting this error:
ValueError: Tensor Tensor("Placeholder:0", shape=(1, 1), dtype=int32) is not an element of this graph.
Inside
def LoadPredictor(save):
Just after loading the model, add model._make_predict_function()
So the function becomes:
def LoadPredictor(save):
    with open(os.path.join(save, 'config.pkl'), 'rb') as f:
        saved_args = cPickle.load(f)
    with open(os.path.join(save, 'words_vocab.pkl'), 'rb') as f:
        words, vocab = cPickle.load(f)
    model = Model(saved_args, True)
    model._make_predict_function()
    return model, words, vocab
Use this line before creating your models:
keras.backend.clear_session()
This destroys the old graph and creates a fresh default graph for the new models to use.
When you create a Model, the session hasn't been restored yet. All placeholders, variables and ops defined in Model.__init__ are placed in a new graph, which makes itself the default graph inside the with block. This is the key line:
with tf.Graph().as_default():
    ...
This means that this instance of tf.Graph() equals the tf.get_default_graph() instance inside the with block, but not before or after it. From this moment on, there exist two different graphs.
When you later create a session and restore a graph into it, you can't access the previous instance of tf.Graph() in that session. Here's a short example:
with tf.Graph().as_default() as graph:
    var = tf.get_variable("var", shape=[3], initializer=tf.zeros_initializer)

# This works
with tf.Session(graph=graph) as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(var))  # ok because `sess.graph == graph`

# This fails
saver = tf.train.import_meta_graph('/tmp/model.ckpt.meta')
with tf.Session() as sess:
    saver.restore(sess, "/tmp/model.ckpt")
    print(sess.run(var))  # var is from `graph`, not `sess.graph`!
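The as_default() scoping above can be illustrated with a minimal, hypothetical sketch of how a default-graph stack behaves, in plain Python (TensorFlow's real implementation is thread-local and far more involved; the names here are illustrative):

```python
from contextlib import contextmanager

# A stack of "graphs"; the last element is the current default.
_default_stack = ['outer_graph']

def get_default_graph():
    """Return whichever graph is currently the default."""
    return _default_stack[-1]

@contextmanager
def as_default(graph):
    """Make `graph` the default only for the duration of the with block."""
    _default_stack.append(graph)
    try:
        yield graph
    finally:
        _default_stack.pop()

with as_default('inner_graph'):
    print(get_default_graph())  # inner_graph
print(get_default_graph())      # outer_graph
```

The point is that the default is restored on exit from the with block, so code running afterwards (like a freshly created session) sees a different graph than the one your ops were defined in.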
The best way to deal with this is to give names to all nodes, e.g. 'input', 'target', etc., save the model, and then look up the nodes in the restored graph by name, something like this:
saver = tf.train.import_meta_graph('/tmp/model.ckpt.meta')
with tf.Session() as sess:
    saver.restore(sess, "/tmp/model.ckpt")
    # Note the ':0' suffix: get_tensor_by_name() expects a tensor name,
    # i.e. the operation name plus an output index.
    input_data = sess.graph.get_tensor_by_name('input:0')
    target = sess.graph.get_tensor_by_name('target:0')
This method guarantees that all nodes come from the graph that is actually attached to the session.
For me, this issue was resolved by using Keras' APIs to save and load the model. I had more than one model being trained in my code, and I had to use a particular one for prediction under a certain condition.
So I saved the entire model to an HDF5 file after training:
# The '.h5' extension indicates that the model should be saved to HDF5.
model.save('my_model.h5')
and then recreated/reloaded the saved model at prediction time:
my_model = tf.keras.models.load_model('my_model.h5')
This got rid of the *Tensor not an element of this graph* error for me.
If you are calling a Python function that uses TensorFlow from an external module, make sure the model isn't being loaded as a global variable, or else it may not be loaded in time for usage. This happened to me when calling a TensorFlow model from a Flask server.
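A hedged sketch of that lazy-loading pattern, in plain Python: instead of loading at import time, cache the model and load it on first use. `load_model` here is only a stand-in for your actual loading code (e.g. tf.keras.models.load_model), not a real API:

```python
_model = None  # module-level cache, deliberately left empty at import time

def load_model():
    # Stand-in for the real loading call, e.g. tf.keras.models.load_model(...)
    return object()

def get_model():
    """Load the model on first use, then reuse the cached instance."""
    global _model
    if _model is None:
        _model = load_model()
    return _model
```

Each request handler then calls get_model() instead of touching a module-level model, so loading happens inside the serving process at a well-defined point.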
Try first:
import tensorflow as tf
graph = tf.get_default_graph()
Then, when you need to use predict:
with graph.as_default():
    y = model.predict(X)