I am trying to process a tensor of variable size, which in Python would look something like:

# X is of shape [m, n]
for x in X:
    process(x)
Most of TensorFlow's built-in functions can be applied elementwise, so often you can just pass the whole tensor into a function instead of looping over it:

outer_loop = inner_loop(x)
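For instance, squaring every element needs no loop at all; the op applies across the whole tensor (a minimal sketch, assuming TF 1.x and an arbitrary (None, 10) placeholder):

import tensorflow as tf
import numpy as np

x = tf.placeholder(tf.float32, shape=(None, 10))
y = tf.square(x)  # elementwise over every entry; no Python loop involved

with tf.Session() as sess:
    print(sess.run(y, feed_dict={x: 2 * np.ones((3, 10))}))  # all entries become 4.0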
However, if you have some function that cannot be applied this way (it would be interesting to see such a function), you could use map_fn.
Say your function simply adds 1 to every element of a tensor (or whatever):

import tensorflow as tf

inputs = tf.placeholder(tf.float32, shape=(None, 10))  # example shape; any rank works

def my_elementwise_func(x):
    return x + 1

def recursive_map(inputs):
    # check the static rank; tf.shape(inputs).ndims would not work here,
    # since tf.shape returns a rank-1 tensor and the recursion would never bottom out
    if inputs.shape.ndims > 0:
        return tf.map_fn(recursive_map, inputs)
    else:
        return my_elementwise_func(inputs)

result = recursive_map(inputs)
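To sanity-check it end to end (a minimal run, assuming the example placeholder above):

import numpy as np

with tf.Session() as sess:
    # every element of the zero input comes back as 1.0
    print(sess.run(result, feed_dict={inputs: np.zeros((3, 10))}))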
To loop over a tensor you could try tf.unstack. From the docs:
Unpacks the given dimension of a rank-R tensor into rank-(R-1) tensors.
So adding 1 to each tensor would look something like:
import tensorflow as tf
import numpy as np

# unstack needs a statically known size along the unstacked axis,
# so the first dimension cannot be None here
x = tf.placeholder(tf.float32, shape=(5, 10))
x_unpacked = tf.unstack(x)  # defaults to axis 0, returns a list of tensors

processed = []  # this will be the list of processed tensors
for t in x_unpacked:
    # do whatever
    result_tensor = t + 1
    processed.append(result_tensor)

# note: this concatenates the rank-1 rows into a single flat (50,) tensor;
# use tf.stack(processed) instead if you want the original (5, 10) shape back
output = tf.concat(processed, 0)

with tf.Session() as sess:
    print(sess.run([output], feed_dict={x: np.zeros((5, 10))}))
Obviously you can further unpack each tensor from the list to process it, down to single elements. To avoid lots of nested unpacking, though, you could maybe try flattening x with tf.reshape(x, [-1]) first, and then looping over it like:

flattened_unpacked = tf.unstack(tf.reshape(x, [-1]))
for elem in flattened_unpacked:
    process(elem)
In this case each elem is a scalar.
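Putting the pieces together, here is a minimal end-to-end sketch (assuming process is just the +1 from above, and a fixed (5, 10) shape so unstack can infer the number of elements):

import tensorflow as tf
import numpy as np

x = tf.placeholder(tf.float32, shape=(5, 10))

# flatten, unstack into 50 scalar tensors, process each, then restore the shape
flattened_unpacked = tf.unstack(tf.reshape(x, [-1]))
processed = [elem + 1 for elem in flattened_unpacked]
output = tf.reshape(tf.stack(processed), [5, 10])

with tf.Session() as sess:
    print(sess.run(output, feed_dict={x: np.zeros((5, 10))}))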