Error using Model after using optimize_for_inference.py on frozen graph


Question


I was using TensorFlow's script optimize_for_inference.py on the ssd_mobilenet_v1_coco model with the following command:

python -m tensorflow.python.tools.optimize_for_inference \
    --input /path/to/frozen_inference_graph.pb \
    --output /path/to/optimized_inference_graph.pb \
    --input_names=image_tensor \
    --output_names=detection_boxes,detection_scores,num_detections,detection_classes

It worked without errors, but when I try to use the created model .pb file with TensorBoard or for inference, it gives me the following error:

ValueError: graph_def is invalid at node u'ToFloat': Input tensor 'image_tensor:0' Cannot convert a tensor of type float32 to an input of type uint8.

In the original graph, as visualized by TensorBoard, the node ToFloat comes directly after the image_tensor input.

So something apparently went wrong with the optimization. But what?
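One way to narrow down what the optimization changed is to compare the dtype attributes of the nodes around the input in the original and the optimized .pb files. A minimal TF 1.x sketch, assuming the two paths from the command above (the show_input_nodes helper is just illustrative):

import tensorflow as tf  # TensorFlow 1.x, same era as these tools

def show_input_nodes(pb_path):
    # Load the GraphDef and print the type attributes of the placeholder and the ToFloat (Cast) node.
    graph_def = tf.GraphDef()
    with tf.gfile.GFile(pb_path, "rb") as f:
        graph_def.ParseFromString(f.read())
    for node in graph_def.node:
        if node.op == "Placeholder" or node.name == "ToFloat":
            attrs = {k: tf.as_dtype(v.type).name for k, v in node.attr.items() if v.type}
            print(node.name, node.op, attrs)

show_input_nodes("/path/to/frozen_inference_graph.pb")      # image_tensor should report uint8 here
show_input_nodes("/path/to/optimized_inference_graph.pb")   # check whether the placeholder dtype changed

If the optimized graph's placeholder reports float32 while the ToFloat (Cast) node still expects a uint8 source, that matches the ValueError above.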


Answer 1:


I saw a similar error, "Input 0 of node ExpandDims_6 was passed float from input_feed:0 incompatible with expected int64", when loading the model file generated by the optimize_for_inference tool.

Pete's comment "the new graph transform approach to removing unused nodes might be more robust? https://github.com/tensorflow/tensorflow/tree/master/tensorflow/tools/graph_transforms/#optimizing-for-deployment" on https://github.com/tensorflow/tensorflow/issues/8242 seems to suggest we should use the new transform_graph tool.

The fact that the optimize_for_inference tool is no longer mentioned in the updated TensorFlow Mobile documentation (https://www.tensorflow.org/mobile) also points to the transform_graph tool.

Just when I suspected the culprit was the optimization tool, I saw your question. Thanks for that. I just tried the transform_graph tool, and it worked, both with the transformed model and with the further memmapped model after transformation. Below are the three commands (freeze, transform, memmap) I used:

python tensorflow/python/tools/freeze_graph.py  \
--input_meta_graph=/tmp/ckpt4.meta \
--input_checkpoint=/tmp/ckpt4 \
--output_graph=/tmp/ckpt4_frozen.pb \
--output_node_names="softmax,lstm/initial_state,lstm/state" \
--input_binary=true

bazel-bin/tensorflow/tools/graph_transforms/transform_graph \
--in_graph=/tmp/ckpt4_frozen.pb \
--out_graph=/tmp/ckpt4_frozen_transformed.pb \
--inputs="convert_image/Cast,input_feed,lstm/state_feed" \
--outputs="softmax,lstm/initial_state,lstm/state" \
--transforms='
      strip_unused_nodes(type=float, shape="1,299,299,3")
      fold_constants(ignore_errors=true) 
      fold_batch_norms
      fold_old_batch_norms'  
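Applied to the frozen ssd_mobilenet_v1_coco graph from the question, that transform step would use the detection graph's own input and output names. A hedged adaptation, assuming the paths from the question; the uint8 type and the 1,300,300,3 shape are guesses that should be checked against the actual image_tensor placeholder:

bazel-bin/tensorflow/tools/graph_transforms/transform_graph \
--in_graph=/path/to/frozen_inference_graph.pb \
--out_graph=/path/to/transformed_inference_graph.pb \
--inputs="image_tensor" \
--outputs="detection_boxes,detection_scores,num_detections,detection_classes" \
--transforms='
      strip_unused_nodes(type=uint8, shape="1,300,300,3")
      fold_constants(ignore_errors=true)
      fold_batch_norms
      fold_old_batch_norms'

The memmapping step below is optional and would only need the in/out file names changed.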


bazel-bin/tensorflow/contrib/util/convert_graphdef_memmapped_format \
--in_graph=/tmp/ckpt4_frozen_transformed.pb \
--out_graph=/tmp/ckpt4_frozen_transformed_memmapped.pb
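To verify that the transformed graph imports and runs (the ValueError in the question was raised at import time), here is a hedged TF 1.x check for the detection graph produced by the adapted transform step above; the path and the 300x300 dummy image are assumptions. Note that the memmapped file from the last step is not a plain GraphDef and cannot be parsed this way.

import numpy as np
import tensorflow as tf  # TensorFlow 1.x

# Parse the transformed (non-memmapped) GraphDef from disk.
graph_def = tf.GraphDef()
with tf.gfile.GFile("/path/to/transformed_inference_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

# Importing is where the original ValueError was raised, so getting past this line is the first check.
with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")

# Run one dummy uint8 image through the detection outputs.
with tf.Session(graph=graph) as sess:
    image = np.zeros((1, 300, 300, 3), dtype=np.uint8)
    boxes, scores, num, classes = sess.run(
        ["detection_boxes:0", "detection_scores:0", "num_detections:0", "detection_classes:0"],
        feed_dict={"image_tensor:0": image})
    print(boxes.shape, scores.shape, int(num[0]))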


Source: https://stackoverflow.com/questions/48212068/error-using-model-after-using-optimize-for-inference-py-on-frozen-graph
