Locally load saved tensorflow model .pb from google cloud machine learning engine

Submitted by 岁酱吖の on 2019-12-04 20:08:59

The model you deployed to the CloudML Engine service is in the SavedModel format. Loading a SavedModel in Python is fairly straightforward using the loader module:

import tensorflow as tf

# path_to_model is the directory that contains the saved_model.pb file.
with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        path_to_model)
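If you are unsure which tensor names to feed and fetch, the load call returns the MetaGraphDef, whose signature map records them. A minimal sketch, assuming the same TF 1.x API as above (`show_signature` is a hypothetical helper name):

```python
import tensorflow as tf

def show_signature(path_to_model):
    """Print the input/output tensor names recorded in a SavedModel."""
    with tf.Session(graph=tf.Graph()) as sess:
        # load returns the MetaGraphDef for the requested tags.
        meta_graph_def = tf.saved_model.loader.load(
            sess,
            [tf.saved_model.tag_constants.SERVING],
            path_to_model)

        # The serving signature maps logical names to actual tensor names.
        signature = meta_graph_def.signature_def[
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
        print('inputs:', {k: v.name for k, v in signature.inputs.items()})
        print('outputs:', {k: v.name for k, v in signature.outputs.items()})
```

The printed tensor names are what you pass to `sess.graph.get_tensor_by_name` and to the feed dict below.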

To perform inference, your code is almost correct; you will need to make sure that you are feeding a batch to session.run, so just wrap image_data in a list:

# Feed image_data to the graph and fetch the output tensor.
# The tensor names below depend on your graph; substitute the actual
# input and output tensor names from your model's serving signature.
softmax_tensor = sess.graph.get_tensor_by_name('conv1/weights:0')

predictions = sess.run(softmax_tensor,
                       {'DecodeJpeg/contents:0': [image_data]})

# Sort to show labels of first prediction in order of confidence
top_k = predictions[0].argsort()[-len(predictions[0]):][::-1]

for node_id in top_k:
    human_string = label_lines[node_id]
    score = predictions[0][node_id]
    print('%s (score = %.5f)' % (human_string, score))

(Note that, depending on your graph, wrapping image_data in a list may increase the rank of your predictions tensor, and you would need to adjust the code accordingly.)
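For example, feeding a batch of one image typically yields predictions of shape (1, num_classes) rather than (num_classes,), so you index (or squeeze) the batch dimension before sorting. A minimal sketch using NumPy, where the array stands in for the fetched predictions:

```python
import numpy as np

# Stand-in for the result of sess.run with a single-image batch:
# shape (1, num_classes) instead of (num_classes,).
predictions = np.array([[0.1, 0.7, 0.2]])

# Drop the batch dimension before sorting.
scores = predictions[0]            # shape (num_classes,)

# Class indices sorted from highest to lowest score.
top_k = scores.argsort()[::-1]

print(top_k.tolist())  # class 1 has the highest score
```

The same indexing is what the `predictions[0]` in the loop above relies on; if your graph returns an extra dimension, one more index or an `np.squeeze` fixes it.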
