How can I convert an ONNX model to a TensorFlow SavedModel?

Submitted by 耗尽温柔 on 2021-02-07 06:56:21

Question


I am trying to use TensorFlow Serving to deploy my PyTorch model. I have already exported the model to ONNX. How can I generate the SavedModel (.pb) that TF Serving expects?
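For context, the export step mentioned above usually looks something like the following minimal sketch; the stand-in model and input shape are placeholders for your own network:

import torch
import torch.nn as nn

# A stand-in model; replace with your own trained network
model = nn.Sequential(nn.Linear(10, 5), nn.ReLU(), nn.Linear(5, 2))
model.eval()

dummy_input = torch.randn(1, 10)                     # an example input with the shape the model expects
torch.onnx.export(model, dummy_input, "model.onnx")  # writes the ONNX file the answer below converts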


Answer 1:


Use the onnx/onnx-tensorflow converter, which acts as a TensorFlow backend for ONNX.

  1. Install onnx-tensorflow: pip install onnx-tf

  2. Convert using the command-line tool: onnx-tf convert -t tf -i /path/to/input.onnx -o /path/to/output.pb (the exact flags can differ between onnx-tf versions; run onnx-tf convert --help to check).

Alternatively, you can convert through the Python API:

import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("/path/to/input.onnx")  # load the ONNX model
tf_rep = prepare(onnx_model)                   # prepare the TensorFlow representation
tf_rep.export_graph("/path/to/output")         # export the TensorFlow model
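Once exported, you can sanity-check the result before pointing TF Serving at it. A minimal sketch, assuming a recent onnx-tf release that writes a SavedModel directory (older releases emit a single frozen .pb file instead):

import tensorflow as tf

# Load the exported SavedModel and list its serving signatures
loaded = tf.saved_model.load("/path/to/output")
print(list(loaded.signatures.keys()))  # typically ['serving_default']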


Source: https://stackoverflow.com/questions/58834684/how-could-i-convert-onnx-model-to-tensorflow-saved-model
