I have exported a SavedModel and now I wish to load it back in and make a prediction. It was trained with the following features and labels:
F1
For anyone who needs a working example of saving a trained canned model and serving it without TensorFlow Serving, I have documented it here: https://github.com/tettusud/tensorflow-examples/tree/master/estimators
predict_fn = tf.contrib.predictor.from_saved_model(exported_model_path)

Prepare the input:
example = tf.train.Example(
    features=tf.train.Features(
        feature={
            'x': tf.train.Feature(
                float_list=tf.train.FloatList(value=[6.4, 3.2, 4.5, 1.5])
            )
        }
    )
)
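The Example proto has to be serialized to a string before it can be fed to the predictor, because the serving input receiver expects serialized tf.train.Example protos. A minimal sketch of that step (the `exported_model_path` and the commented-out predictor call assume a model already exported as described in this answer):

```python
import tensorflow as tf

# Build the same Example proto as above and serialize it to a byte string.
example = tf.train.Example(
    features=tf.train.Features(
        feature={
            'x': tf.train.Feature(
                float_list=tf.train.FloatList(value=[6.4, 3.2, 4.5, 1.5])
            )
        }
    )
)
serialized = example.SerializeToString()

# With an exported model on disk, the serialized proto is passed under the
# 'inputs' key -- the receiver_tensors key from serving_input_receiver_fn:
# predict_fn = tf.contrib.predictor.from_saved_model(exported_model_path)
# predictions = predict_fn({'inputs': [serialized]})
```

Note that the key in the dict passed to `predict_fn` is `'inputs'` (the receiver tensor name), not `'x'` (the feature name inside the proto).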
Here 'x' is the name of the feature that was given in the serving_input_receiver_fn at the time of exporting.
For example:

feature_spec = {'x': tf.FixedLenFeature([4], tf.float32)}

def serving_input_receiver_fn():
    serialized_tf_example = tf.placeholder(dtype=tf.string,
                                           shape=[None],
                                           name='input_tensors')
    receiver_tensors = {'inputs': serialized_tf_example}
    features = tf.parse_example(serialized_tf_example, feature_spec)
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)
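To see what the receiver function does internally, the parsing step can be run on its own: a batch of serialized Example protos is decoded with the same feature_spec into a dense float tensor. This sketch uses the `tf.io.*` aliases of the same ops (`tf.io.parse_example` for `tf.parse_example`, `tf.io.FixedLenFeature` for `tf.FixedLenFeature`) so it also runs on newer TensorFlow versions:

```python
import tensorflow as tf

# Same spec as in serving_input_receiver_fn: each Example carries a
# length-4 float vector under the key 'x'.
feature_spec = {'x': tf.io.FixedLenFeature([4], tf.float32)}

example = tf.train.Example(
    features=tf.train.Features(
        feature={'x': tf.train.Feature(
            float_list=tf.train.FloatList(value=[6.4, 3.2, 4.5, 1.5]))}))

# A batch of one serialized proto, matching the shape=[None] string
# placeholder in the receiver function.
features = tf.io.parse_example([example.SerializeToString()], feature_spec)
# features['x'] is a float32 tensor of shape [1, 4].
```

This is exactly the tensor dict that the estimator's model_fn receives at serving time, which is why the prediction input must be a serialized Example rather than a raw float list.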