Error using Model after using optimize_for_inference.py on frozen graph


I saw a similar error "Input 0 of node ExpandDims_6 was passed float from input_feed:0 incompatible with expected int64" when loading the model file generated by the optimize_for_inference tool.
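
For anyone hitting the same thing, here is a rough way to see where that error comes from (a TF 1.x Python sketch; the file paths are only examples, and ckpt4_optimized.pb stands for whatever optimize_for_inference produced in your case): dump the dtype recorded on each Placeholder in the frozen graph and in the optimized graph. If input_feed shows int64 before optimization but float afterwards, that is exactly the mismatch the ExpandDims_6 error complains about.

import tensorflow as tf  # TF 1.x APIs

def show_placeholder_dtypes(pb_path):
    # Print the dtype recorded on every Placeholder node in a frozen GraphDef.
    graph_def = tf.GraphDef()
    with open(pb_path, 'rb') as f:
        graph_def.ParseFromString(f.read())
    for node in graph_def.node:
        if node.op == 'Placeholder':
            print(node.name, tf.as_dtype(node.attr['dtype'].type).name)

show_placeholder_dtypes('/tmp/ckpt4_frozen.pb')     # expect input_feed -> int64
show_placeholder_dtypes('/tmp/ckpt4_optimized.pb')  # hypothetical optimize_for_inference output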

Pete's comment "the new graph transform approach to removing unused nodes might be more robust? https://github.com/tensorflow/tensorflow/tree/master/tensorflow/tools/graph_transforms/#optimizing-for-deployment" on https://github.com/tensorflow/tensorflow/issues/8242 seems to suggest we should use the new transform_graph tool.

The fact that the updated TensorFlow Mobile documentation https://www.tensorflow.org/mobile no longer mentions the optimize_for_inference tool also points toward using the transform_graph tool instead.

Just when I started to suspect the culprit was the optimization tool, I saw your question. Thanks for that. I just tried the transform_graph tool, and it worked: both the transformed model and the memmapped model built from it afterwards load and run fine. Below are the three commands (freeze, transform, memmap) I used:

python tensorflow/python/tools/freeze_graph.py  \
--input_meta_graph=/tmp/ckpt4.meta \
--input_checkpoint=/tmp/ckpt4 \
--output_graph=/tmp/ckpt4_frozen.pb \
--output_node_names="softmax,lstm/initial_state,lstm/state" \
--input_binary=true

bazel-bin/tensorflow/tools/graph_transforms/transform_graph \
--in_graph=/tmp/ckpt4_frozen.pb \
--out_graph=/tmp/ckpt4_frozen_transformed.pb \
--inputs="convert_image/Cast,input_feed,lstm/state_feed" \
--outputs="softmax,lstm/initial_state,lstm/state" \
--transforms='
      strip_unused_nodes(type=float, shape="1,299,299,3")
      fold_constants(ignore_errors=true) 
      fold_batch_norms
      fold_old_batch_norms'  
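
After the transform step, one quick sanity check (again a TF 1.x Python sketch; the tensor names are just the --inputs/--outputs above with the usual ':0' suffix) is to import the transformed graph and look up those tensors:

import tensorflow as tf  # TF 1.x APIs

graph_def = tf.GraphDef()
with open('/tmp/ckpt4_frozen_transformed.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name='')

# Tensor names are the --inputs/--outputs passed to transform_graph, plus ':0'.
for name in ['convert_image/Cast:0', 'input_feed:0', 'lstm/state_feed:0',
             'softmax:0', 'lstm/initial_state:0', 'lstm/state:0']:
    tensor = graph.get_tensor_by_name(name)
    print(name, tensor.dtype.name, tensor.shape)

If input_feed reports int64 here, the dtype problem seen with the optimize_for_inference output is gone.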


bazel-bin/tensorflow/contrib/util/convert_graphdef_memmapped_format \
--in_graph=/tmp/ckpt4_frozen_transformed.pb \
--out_graph=/tmp/ckpt4_frozen_transformed_memmapped.pb
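
Note that the memmapped file is no longer a plain GraphDef proto, so it can't be read back with the ParseFromString approach sketched above; as far as I know it has to be loaded through TensorFlow's memmapped file system support (the MemmappedEnv path used in the TensorFlow mobile/C++ examples).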