I have a function which produces feature and target tensors. E.g.
x, t = myfunc()  # x, t are tensors
How can I integrate this with TensorFlow's Dataset API for continuous training? Ideally I would like to use the dataset to handle things like batching and transformations.
Edit for clarification: the problem is that I don't want to just put x and t directly in my graph; I want to make a dataset from them, so that I can reuse the same dataset processing I have implemented for (normal) finite datasets, which I load into memory and feed into the same graph using an initializable iterator.
Assuming `x` and `t` are `tf.Tensor` objects and `my_func()` builds a TensorFlow graph, you may be able to use the following approach with `Dataset.map()`:
# Creates an infinite dataset with a dummy value. You can make this finite by
# specifying an explicit number of elements to `repeat()`.
dummy_dataset = tf.data.Dataset.from_tensors(0).repeat(None)
# Evaluates `my_func` once for each element in `dummy_dataset`.
dataset = dummy_dataset.map(lambda _: my_func())
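The same pattern also runs eagerly in TF 2.x, and the resulting dataset composes with the usual pipeline stages such as `batch()`, which addresses the original goal of reusing standard dataset processing. A minimal sketch, where `my_func` below is a stand-in for your real generator function:

```python
import tensorflow as tf

def my_func():
    # Stand-in for your real graph-building function: produces one
    # (feature, target) pair per call.
    x = tf.random.uniform([3])
    t = tf.reduce_sum(x)
    return x, t

# Infinite dummy dataset; `map` evaluates `my_func` once per element.
dummy_dataset = tf.data.Dataset.from_tensors(0).repeat(None)
dataset = dummy_dataset.map(lambda _: my_func())

# The usual transformations now apply, e.g. batching:
batched = dataset.batch(4)
for x_batch, t_batch in batched.take(2):
    print(x_batch.shape, t_batch.shape)  # (4, 3) (4,)
```

Because the dataset is infinite, remember to bound iteration yourself (e.g. with `take()` or a step count) during training.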
If `x` and `t` are tensors, you can create a dataset by calling `tf.data.Dataset.from_tensors` or `tf.data.Dataset.from_tensor_slices`. The difference between them is that `from_tensors` combines the input tensors into a single element of the dataset, whereas `from_tensor_slices` slices them along the first dimension, creating one element per slice.
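The difference is easiest to see by counting elements. A small sketch with hypothetical arrays of 4 examples:

```python
import numpy as np
import tensorflow as tf

# Hypothetical feature/target arrays with 4 examples each.
x = np.arange(8, dtype=np.float32).reshape(4, 2)
t = np.arange(4, dtype=np.float32)

# from_tensors: the whole (x, t) pair becomes ONE dataset element.
whole = tf.data.Dataset.from_tensors((x, t))
# from_tensor_slices: slices along the first dimension -> 4 elements.
sliced = tf.data.Dataset.from_tensor_slices((x, t))

print(whole.cardinality().numpy())   # 1
print(sliced.cardinality().numpy())  # 4
```

For training loops over individual examples you almost always want `from_tensor_slices`, since `batch()` and `shuffle()` operate on elements.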
Source: https://stackoverflow.com/questions/47318734/on-the-fly-generation-with-dataset-api-tensorflow