Dynamic batch size in TensorFlow


Question


I have built a classifier using TensorFlow. I generate proposal regions from images, and those proposals are classified individually by my classifier.

My problem is that I do not have a constant batch size when evaluating my model: every image produces a different number of proposals, so the number of examples to classify varies from image to image.

Right now I have set the batch size to 1, but this is inefficient and limits the processing speed of my classifier.

Below is the placeholder for the input to the model

self.image_op = tf.placeholder(tf.float32, shape=[batch_size, 48, 48, 3], name='input_image')

And this is how I feed the input to the model

def predict(self, image):
    cls_prob = self.sess.run([self.cls_prob], feed_dict={self.image_op: image})
    return cls_prob

Is there any way of setting the batch size to a dynamic value without having to restore the model for every image?


Answer 1:


You can create your variable with tf.Variable(validate_shape=False).

This disables shape validation, so the variable's static shape is not fixed when the graph is built and you can work with dynamic batch sizes.
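A minimal sketch of how that might look, assuming the 48x48x3 proposal crops from the question (the variable name and shapes below are only illustrative):

import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

# validate_shape=False leaves the variable's static shape unspecified,
# so it is not pinned to the shape of the initial value.
images = tf.Variable(tf.zeros([1, 48, 48, 3]), validate_shape=False)

# Later assignments may use a different leading (batch) dimension,
# as long as the assign op is also told not to validate the shape.
assign_op = tf.assign(images, tf.zeros([7, 48, 48, 3]), validate_shape=False)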

Since tf.placeholder is deprecated you should avoid it, but if you still want to use tf.placeholder then you need to disable the TF 2.x behaviour:

import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()
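
For reference, when going the placeholder route it is also common to simply declare the batch dimension as None, so that each sess.run call can take however many proposals an image produced. Below is a rough, self-contained sketch; the single dense layer is just a stand-in for the question's classifier, and all names and shapes are illustrative:

import numpy as np
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

# Leave the batch dimension as None so each run can take any number of proposals.
image_op = tf.placeholder(tf.float32, shape=[None, 48, 48, 3], name='input_image')

# Stand-in model: flatten the crops and produce two-class probabilities.
flat = tf.reshape(image_op, [-1, 48 * 48 * 3])
logits = tf.layers.dense(flat, units=2)
cls_prob = tf.nn.softmax(logits)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for n_proposals in (3, 7, 1):  # a different batch size for each "image"
        batch = np.random.rand(n_proposals, 48, 48, 3).astype(np.float32)
        probs = sess.run(cls_prob, feed_dict={image_op: batch})
        print(probs.shape)  # (3, 2), then (7, 2), then (1, 2)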


Source: https://stackoverflow.com/questions/53422311/dynamic-batch-size-in-tensorflow
