How do I convert a TensorFlow 2.0 Estimator model to TensorFlow Lite?

Submitted by 喜你入骨 on 2020-02-02 03:06:42

Question


The code below produces a regular TensorFlow model, but when I try to convert it to TensorFlow Lite the conversion fails. I followed this documentation:

https://www.tensorflow.org/tutorials/estimator/linear1
https://www.tensorflow.org/lite/guide/get_started

export_dir = "tmp"
serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
  tf.feature_column.make_parse_example_spec(feat_cols))

estimator.export_saved_model(export_dir, serving_input_fn)

# Convert the model.
converter = tf.lite.TFLiteConverter.from_saved_model("tmp/1571728920/saved_model.pb")
tflite_model = converter.convert()

Error Message

Traceback (most recent call last):
  File "C:/Users/Dacorie Smith/PycharmProjects/JamaicaClassOneNotifableModels/ClassOneModels.py", line 208, in <module>
    tflite_model = converter.convert()
  File "C:\Users\Dacorie Smith\PycharmProjects\JamaicaClassOneNotifableModels\venv\lib\site-packages\tensorflow_core\lite\python\lite.py", line 400, in convert
    raise ValueError("This converter can only convert a single "
ValueError: This converter can only convert a single ConcreteFunction. Converting multiple functions is under development.

Extract from Documentation

TensorFlow Lite converter

The TensorFlow Lite converter is a tool available as a Python API that converts trained TensorFlow models into the TensorFlow Lite format. It can also introduce optimizations, which are covered in section 4, Optimize your model.

The following example shows a TensorFlow SavedModel being converted into the TensorFlow Lite format:

import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
tflite_model = converter.convert()
open("converted_model.tflite", "wb").write(tflite_model)


Answer 1:


Try using a concrete function:

export_dir = "tmp"
serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
  tf.feature_column.make_parse_example_spec(feat_cols))

estimator.export_saved_model(export_dir, serving_input_fn)

# Convert the model.
saved_model_obj = tf.saved_model.load(export_dir="tmp/1571728920/")
concrete_func = saved_model_obj.signatures['serving_default']

converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])

# print(saved_model_obj.signatures.keys())
# converter.optimizations = [tf.lite.Optimize.DEFAULT]
# converter.experimental_new_converter = True

tflite_model = converter.convert()

serving_default is the default signature key in a SavedModel.

If that does not work, try uncommenting converter.experimental_new_converter = True and the two lines above it.
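
Once convert() succeeds, you will typically write the result to a .tflite file, just as in the documentation extract above. A minimal sketch (the output filename is only an example):

# Write the converted flatbuffer to disk so it can be loaded by the
# TFLite interpreter on-device. The filename is arbitrary.
with open("converted_model.tflite", "wb") as f:
    f.write(tflite_model)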

Short explanation

Based on the Concrete functions guide:

Eager execution in TensorFlow 2 evaluates operations immediately, without building graphs. To save the model you need graphs, which are wrapped in Python callables: concrete functions.
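
As an aside, a concrete function is what you get when you trace a tf.function with a fixed input signature. The following minimal sketch uses a toy model (not the Estimator from the question) to show the same from_concrete_functions call on such a traced function:

import tensorflow as tf

# Toy example only: wrap a computation in tf.function and trace it with a
# fixed input signature to obtain a concrete function (a graph wrapped in a
# Python callable).
@tf.function(input_signature=[tf.TensorSpec(shape=[None, 3], dtype=tf.float32)])
def model_fn(x):
    return tf.reduce_sum(x, axis=1)

concrete_func = model_fn.get_concrete_function()

# The converter accepts this traced graph the same way it accepts the
# 'serving_default' signature loaded from the SavedModel above.
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])
tflite_model = converter.convert()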



Source: https://stackoverflow.com/questions/58499146/how-do-i-convert-tensorflow-2-0-estimator-model-to-tensorflow-lite
