TensorFlow Dataset API doubles graph protobuf file size

Submitted by 被刻印的时光 ゝ on 2019-12-01 06:19:53

I found a solution to my problem using tf.train.SessionRunHook. I create a SessionRunHook object that initialises the iterator after the session is created:

import tensorflow as tf

class IteratorInitializerHook(tf.train.SessionRunHook):
    """Hook that runs the dataset iterator's initializer as soon as the session exists."""

    def __init__(self):
        super(IteratorInitializerHook, self).__init__()
        self.iterator_initializer_func = None  # set later, once the iterator is built

    def after_create_session(self, session, coord):
        # Runs once, right after the session is created, before any train/eval steps.
        self.iterator_initializer_func(session)

The initializer function is set when creating the Dataset Iterator:

iterator_initializer_hook.iterator_initializer_func = \
    lambda sess: sess.run(
        iterator.initializer,
        feed_dict={images_placeholder: images,
                   labels_placeholder: labels})
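
For context, here is a minimal sketch of how the placeholders, Dataset, and iterator could be wired together inside an input function. The helper name make_input_fn, the shuffle buffer, and the batch size are illustrative assumptions, not part of the original answer; only the placeholder/initializer pattern itself is what the answer describes:

def make_input_fn(images, labels, iterator_initializer_hook, batch_size=128):
    def input_fn():
        # Feed the arrays through placeholders so they are not embedded in the
        # graph as constants (which is what bloats graph.pbtxt).
        images_placeholder = tf.placeholder(images.dtype, images.shape)
        labels_placeholder = tf.placeholder(labels.dtype, labels.shape)

        dataset = tf.data.Dataset.from_tensor_slices(
            (images_placeholder, labels_placeholder))
        dataset = dataset.shuffle(buffer_size=10000).repeat().batch(batch_size)

        iterator = dataset.make_initializable_iterator()

        # The hook will run this initializer right after the session is created.
        iterator_initializer_hook.iterator_initializer_func = \
            lambda sess: sess.run(
                iterator.initializer,
                feed_dict={images_placeholder: images,
                           labels_placeholder: labels})

        return iterator.get_next()
    return input_fn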

I then pass the hook objects to the train_monitors and eval_hooks parameters of tf.contrib.learn.Experiment, as sketched below.
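
A rough sketch of that wiring, assuming the make_input_fn helper above and an estimator built elsewhere (the variable names and train_steps value are placeholders):

train_iterator_hook = IteratorInitializerHook()
eval_iterator_hook = IteratorInitializerHook()

experiment = tf.contrib.learn.Experiment(
    estimator=estimator,  # an Estimator constructed elsewhere
    train_input_fn=make_input_fn(train_images, train_labels, train_iterator_hook),
    eval_input_fn=make_input_fn(eval_images, eval_labels, eval_iterator_hook),
    train_steps=10000,
    # Each hook runs its iterator's initializer once the corresponding session exists.
    train_monitors=[train_iterator_hook],
    eval_hooks=[eval_iterator_hook])

experiment.train_and_evaluate()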

The resulting graph.pbtxt file is now only 500K while the .meta files are only 244K.

Full example here.
