TensorFlow: How to predict from a SavedModel?

南方客 · 2020-12-25 15:41

I have exported a SavedModel and now I wish to load it back in and make a prediction. It was trained with the following features and labels:

F1          


        
4 Answers

盖世英雄少女心 · 2020-12-25 15:45

    For anyone who needs a working example of saving a trained canned model and serving it without TensorFlow Serving, I have documented it here: https://github.com/tettusud/tensorflow-examples/tree/master/estimators

    1. Create a predictor with tf.contrib.predictor.from_saved_model(exported_model_path).
    2. Prepare the input as a tf.train.Example (a combined load-and-predict sketch follows this list):

      tf.train.Example(
          features=tf.train.Features(
              feature={
                  'x': tf.train.Feature(
                      float_list=tf.train.FloatList(value=[6.4, 3.2, 4.5, 1.5])
                  )
              }
          )
      )
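
    Putting the two steps together, here is a minimal load-and-predict sketch. It assumes TensorFlow 1.x; exported_model_path is a placeholder for the timestamped export directory produced by export_savedmodel(), and the keys of the returned predictions dict depend on the head of the estimator you exported.

    import tensorflow as tf
    from tensorflow.contrib import predictor

    # Load the SavedModel back as a callable predictor.
    # exported_model_path is a placeholder for your export directory.
    predict_fn = predictor.from_saved_model(exported_model_path)

    # Build a tf.train.Example using the same feature name ('x') that the
    # serving_input_receiver_fn below expects.
    example = tf.train.Example(
        features=tf.train.Features(
            feature={
                'x': tf.train.Feature(
                    float_list=tf.train.FloatList(value=[6.4, 3.2, 4.5, 1.5])
                )
            }
        )
    )

    # The receiver takes a batch of serialized Example protos under the key
    # 'inputs' (the key defined in receiver_tensors below).
    predictions = predict_fn({'inputs': [example.SerializeToString()]})
    print(predictions)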
      

    Here, 'x' is the name of the feature that was defined in the serving_input_receiver_fn at the time of exporting. For example:

    # Each serialized Example carries one feature 'x' with four float values.
    feature_spec = {'x': tf.FixedLenFeature([4], tf.float32)}

    def serving_input_receiver_fn():
        # Placeholder for a batch of serialized tf.train.Example protos.
        serialized_tf_example = tf.placeholder(dtype=tf.string,
                                               shape=[None],
                                               name='input_tensors')
        # 'inputs' is the key the predictor must use when feeding examples.
        receiver_tensors = {'inputs': serialized_tf_example}
        # Parse the serialized protos into the dense feature tensor 'x'.
        features = tf.parse_example(serialized_tf_example, feature_spec)
        return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)
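
    For completeness, a sketch of how this receiver function is typically passed at export time. Here estimator is assumed to be an already-trained tf.estimator estimator (for example a canned DNNClassifier), and 'export' is a placeholder base directory; export_savedmodel() returns the timestamped SavedModel directory that from_saved_model() is then pointed at.

    # 'estimator' is an already-trained tf.estimator.Estimator (assumption);
    # export_savedmodel returns the timestamped export directory.
    exported_model_path = estimator.export_savedmodel(
        export_dir_base='export',
        serving_input_receiver_fn=serving_input_receiver_fn)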
    
