TensorFlow classifier.export_savedmodel (Beginner)

再見小時候 2020-12-05 01:55

I know about the "Serving a Tensorflow Model" page

https://www.tensorflow.org/serving/serving_basic

but those functions assume you're using tf.Session() w

3 Answers
  •  既然无缘
    2020-12-05 02:04

    There are two possible readings of your question. First, you encounter a missing session because DNNClassifier uses the higher-level Estimators API (as opposed to the lower-level APIs where you manipulate the ops yourself). The nice thing about TensorFlow is that the high- and low-level APIs are more or less interoperable, so if you want a session and want to do something with it, it is as simple as adding:

    sess = tf.get_default_session()
    

    Then you can start hooking in the remainder of the tutorial.

    The second interpretation of your question is: what about export_savedmodel? In fact, export_savedmodel and the sample code from the serving tutorial try to achieve the same goal. When you are training your graph, you set up some infrastructure to feed input to the graph (typically batches from a training dataset). When you switch to serving, however, you will often read your input from somewhere else, so you need separate infrastructure that replaces the training input of the graph. The bottom line is that the serving_input_fn() which you filled with a print should in essence return an input op. This is also stated in the documentation:

    serving_input_fn: A function that takes no argument and returns an InputFnOps.

    Hence instead of print("asdf") it should do something similar to adding an input chain (which should be similar to what builder.add_meta_graph_and_variables also adds).

    Examples of serving_input_fn()'s can for example be found in [the cloudml samples](https://github.com/GoogleCloudPlatform/cloudml-samples/blob/master/census/customestimator/trainer/model.py#L240), such as the following, which serves input from JSON:

    def json_serving_input_fn():
      """Build the serving inputs."""
      inputs = {}
      for feat in INPUT_COLUMNS:
        inputs[feat.name] = tf.placeholder(shape=[None], dtype=feat.dtype)
      return tf.estimator.export.ServingInputReceiver(inputs, inputs)
    
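    To tie the two halves together, here is a minimal end-to-end sketch of how such a serving_input_fn is passed to an Estimator's export call. This is an assumption-laden illustration, not code from the question: the feature name "x", the model directories, and the tiny random training set are all invented for the example, and it assumes the TF 1.x-era Estimator API (still available via tf.compat.v1 in TF 2.x up to 2.15).

    ```python
    import tempfile
    import numpy as np
    import tensorflow as tf

    # One numeric feature named "x" -- an assumption for this sketch,
    # not something from the original question.
    feature_columns = [tf.feature_column.numeric_column("x")]

    estimator = tf.estimator.DNNClassifier(
        hidden_units=[4],
        feature_columns=feature_columns,
        n_classes=2,
        model_dir=tempfile.mkdtemp())

    def train_input_fn():
        # Tiny random dataset, just enough to produce a checkpoint to export.
        features = {"x": np.random.rand(32).astype(np.float32)}
        labels = np.random.randint(0, 2, size=32).astype(np.int32)
        ds = tf.data.Dataset.from_tensor_slices((features, labels))
        return ds.batch(8)

    estimator.train(train_input_fn, steps=1)

    def serving_input_fn():
        # Same shape as the cloudml sample above: placeholders replace the
        # training input pipeline when the model is served.
        inputs = {"x": tf.compat.v1.placeholder(shape=[None], dtype=tf.float32)}
        return tf.estimator.export.ServingInputReceiver(inputs, inputs)

    # export_saved_model writes a timestamped SavedModel directory and
    # returns its path (export_savedmodel is the older spelling of the same call).
    export_dir = estimator.export_saved_model(tempfile.mkdtemp(), serving_input_fn)
    print(export_dir)
    ```

    The exported directory is what you would then point TensorFlow Serving at, in place of the builder.add_meta_graph_and_variables steps from the tutorial.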
