Create SavedModel for BERT

Submitted by 怎甘沉沦 on 2021-01-29 18:20:39

Question


I'm using this Colab for BERT model.

In the last cells, in order to make predictions, we have:

def getPrediction(in_sentences):
  labels = ["Negative", "Positive"]
  input_examples = [run_classifier.InputExample(guid="", text_a = x, text_b = None, label = 0) for x in in_sentences] # here, "" is just a dummy label
  input_features = run_classifier.convert_examples_to_features(input_examples, label_list, MAX_SEQ_LENGTH, tokenizer)
  predict_input_fn = run_classifier.input_fn_builder(features=input_features, seq_length=MAX_SEQ_LENGTH, is_training=False, drop_remainder=False)
  predictions = estimator.predict(predict_input_fn)
  return [(sentence, prediction['probabilities'], labels[prediction['labels']]) for sentence, prediction in zip(in_sentences, predictions)]

pred_sentences = [
  "That movie was absolutely awful",
  "The acting was a bit lacking",
  "The film was creative and surprising",
  "Absolutely fantastic!"
]

predictions = getPrediction(pred_sentences)

I want to create a SavedModel to be used with TF Serving. How can I create a SavedModel for this model?

Normally I would define the following:

def serving_input_fn():
    """Create serving input function to be able to serve predictions later
    using provided inputs
    :return:
    """
    feature_placeholders = {
        'sentence': tf.placeholder(tf.string, [None]),     
    }
    return tf.estimator.export.ServingInputReceiver(feature_placeholders,
                                                    feature_placeholders)


latest_ckpt = tf.train.latest_checkpoint(OUTPUT_DIR)

last_eval = estimator.evaluate(input_fn=test_input_fn, steps=None, checkpoint_path=latest_ckpt)

# Export the model to GCS for serving.
exporter = tf.estimator.LatestExporter('exporter', serving_input_fn, exports_to_keep=None)
exporter.export(estimator, OUTPUT_DIR, latest_ckpt, last_eval, is_the_final_export=True)      

I'm not sure how to define my tf.estimator.export.ServingInputReceiver for BERT's inputs.


Answer 1:


If you look at the create_model function in the notebook, you'll see that it takes several arguments (input_ids, input_mask, segment_ids, and the labels). These are the features that will be passed to the model.

You need to update your serving_input_fn function to include them:

def serving_input_fn():
  # The features create_model expects, each a fixed-length int64 vector.
  feature_spec = {
      "input_ids": tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
      "input_mask": tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
      "segment_ids": tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
      "label_ids": tf.FixedLenFeature([], tf.int64),
  }
  # At serving time the model receives serialized tf.train.Example protos.
  serialized_tf_example = tf.placeholder(dtype=tf.string,
                                         shape=[None],
                                         name='input_example_tensor')
  receiver_tensors = {'example': serialized_tf_example}
  # Parse each serialized Example into the feature tensors defined above.
  features = tf.parse_example(serialized_tf_example, feature_spec)
  return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)
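This signature expects a serialized tf.train.Example carrying the four features above, not a raw sentence, so the client must run the same tokenization and padding that convert_examples_to_features performs during training. As a minimal sketch in plain Python (the token ids below are made-up placeholders, not real WordPiece vocabulary ids), the padded feature layout looks like this:

```python
# Sketch of the feature layout the serving signature expects.
# The token ids are hypothetical; a real client would obtain them
# from BERT's WordPiece tokenizer.
MAX_SEQ_LENGTH = 8

def make_features(token_ids, max_seq_length=MAX_SEQ_LENGTH):
    """Pad/truncate token ids and build the parallel input vectors."""
    token_ids = token_ids[:max_seq_length]             # truncate long inputs
    pad_len = max_seq_length - len(token_ids)
    input_ids = token_ids + [0] * pad_len              # 0 is the [PAD] id
    input_mask = [1] * len(token_ids) + [0] * pad_len  # 1 marks real tokens
    segment_ids = [0] * max_seq_length                 # single-sentence task
    return {"input_ids": input_ids,
            "input_mask": input_mask,
            "segment_ids": segment_ids,
            "label_ids": 0}                            # dummy label at serving time

features = make_features([101, 2023, 3185, 102])       # e.g. [CLS] ... [SEP]
print(features["input_mask"])                          # [1, 1, 1, 1, 0, 0, 0, 0]
```

At request time these vectors are packed into a tf.train.Example, serialized to a string, and sent as the 'example' receiver tensor defined above; the model itself can be exported with something like estimator.export_saved_model(EXPORT_DIR, serving_input_fn) (or export_savedmodel on older TF 1.x versions).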


Source: https://stackoverflow.com/questions/56552018/create-savedmodel-for-bert
