Loading SavedModel is a lot slower than loading a tf.train.Saver checkpoint


I am by no means an expert in TensorFlow, but if I had to take a guess as to why this is happening, I would say that:

  • tf.train.Saver() saves a complete meta-graph. Therefore, all the information needed to perform any operation contained in your graph is already there. All TensorFlow needs to do to load the model is insert the meta-graph into the default/current graph, and you're good to go.
  • The SavedModelBuilder(), on the other hand, creates a language-agnostic representation of your operations and variables behind the scenes. This means that the loading method has to extract all that information, recreate all the operations and variables from your previous graph, and insert them into the default/current graph.

Depending on the size of your graph, recreating everything it contained might take some time.
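A minimal sketch of the two export/load paths, using the TensorFlow 1.x APIs mentioned above. The paths (/tmp/ckpt/model, /tmp/saved_model) and the toy graph are placeholders for illustration, not anything from the original question.

```python
import tensorflow as tf

# Build a tiny graph and export it both ways (placeholder paths).
with tf.Session(graph=tf.Graph()) as sess:
    x = tf.placeholder(tf.float32, [None, 4], name="x")
    w = tf.Variable(tf.ones([4, 2]), name="w")
    y = tf.matmul(x, w, name="y")
    sess.run(tf.global_variables_initializer())

    # Saver route: writes the meta-graph (.meta) plus the variable values.
    tf.train.Saver().save(sess, "/tmp/ckpt/model")

    # SavedModel route: writes a language-agnostic protobuf plus variables.
    builder = tf.saved_model.builder.SavedModelBuilder("/tmp/saved_model")
    builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.SERVING])
    builder.save()

# Loading via the checkpoint: import the stored meta-graph into the current
# graph, then restore the variable values.
with tf.Session(graph=tf.Graph()) as sess:
    saver = tf.train.import_meta_graph("/tmp/ckpt/model.meta")
    saver.restore(sess, "/tmp/ckpt/model")

# Loading the SavedModel: the loader reads the tagged graph definition from
# the export directory and restores the variables into the current graph.
with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], "/tmp/saved_model")
```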

Concerning the second question, as @J H said, if there is no reason for you to use one strategy over the other and time is of the essence, just go with the faster one.

What can I do to load the model faster?

Switch back to tf.train.Saver, as your question shows no motivation for using SavedModelBuilder and makes it clear that elapsed time matters to you. Alternatively, an MCVE that reproduces the timing issue would allow others to collaborate with you on profiling, diagnosing, and fixing any perceived performance issue.
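A rough harness one could grow into such an MCVE, assuming the model has already been exported both as a checkpoint (/tmp/ckpt/model) and as a SavedModel directory (/tmp/saved_model); both paths are placeholders.

```python
import time
import tensorflow as tf

def time_checkpoint_load():
    # Import the meta-graph and restore weights, measuring wall-clock time.
    start = time.time()
    with tf.Session(graph=tf.Graph()) as sess:
        saver = tf.train.import_meta_graph("/tmp/ckpt/model.meta")
        saver.restore(sess, "/tmp/ckpt/model")
    return time.time() - start

def time_saved_model_load():
    # Load the SavedModel with the serving tag, measuring wall-clock time.
    start = time.time()
    with tf.Session(graph=tf.Graph()) as sess:
        tf.saved_model.loader.load(
            sess, [tf.saved_model.tag_constants.SERVING], "/tmp/saved_model")
    return time.time() - start

print("Saver checkpoint load: %.2fs" % time_checkpoint_load())
print("SavedModel load:       %.2fs" % time_saved_model_load())
```

Running each load a few times and averaging would give numbers others could compare against when profiling.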
