Error using model after running optimize_for_inference.py on frozen graph
Question

I was using TensorFlow's script optimize_for_inference.py on the ssd_mobilenet_v1_coco model with the following command:

    python -m tensorflow.python.tools.optimize_for_inference \
        --input /path/to/frozen_inference_graph.pb \
        --output /path/to/optimized_inference_graph.pb \
        --input_names=image_tensor \
        --output_names=detection_boxes,detection_scores,num_detections,detection_classes

It ran without errors, but when I try to use the created model .pb file for TensorBoard or for inference, it gives me
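For context, here is a minimal sketch of how an optimized frozen graph like this is typically loaded and run for inference with the TF 1.x-style graph API (the dummy input shape and the use of tf.compat.v1 are assumptions; the path and tensor names are taken from the command above):

    import numpy as np
    import tensorflow as tf

    # Load the serialized GraphDef produced by optimize_for_inference.
    graph_def = tf.compat.v1.GraphDef()
    with tf.io.gfile.GFile("/path/to/optimized_inference_graph.pb", "rb") as f:
        graph_def.ParseFromString(f.read())

    # Import it into a fresh graph (no name prefix, so tensor names match).
    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name="")

    # Run the detection outputs on a dummy image (shape is illustrative).
    with tf.compat.v1.Session(graph=graph) as sess:
        image = np.zeros((1, 300, 300, 3), dtype=np.uint8)
        boxes, scores, num, classes = sess.run(
            ["detection_boxes:0", "detection_scores:0",
             "num_detections:0", "detection_classes:0"],
            feed_dict={"image_tensor:0": image})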