TensorFlow: store training data in GPU memory

小鲜肉 2020-12-13 21:35

I am pretty new to TensorFlow; I used to use Theano for deep learning development. One difference I have noticed between the two is where the input data can be stored.

I

2 Answers
  •  感动是毒
    2020-12-13 22:29

    If your data fits in GPU memory, you can load it into a constant on the GPU from, e.g., a NumPy array:

      import tensorflow as tf

      # numpy_dataset is an existing NumPy array holding the full dataset
      with tf.device('/gpu:0'):
        tensorflow_dataset = tf.constant(numpy_dataset)


    One way to extract minibatches would then be to slice that constant at each step, instead of feeding the data through feed_dict, using tf.slice:

      # index is the row offset of the current minibatch; -1 in the size keeps every column
      batch = tf.slice(tensorflow_dataset, [index, 0], [batch_size, -1])
    
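    One way to put these pieces together is a small graph-mode (TF 1.x-style) script in which the whole dataset lives on the GPU as a constant and only a scalar offset is fed from the host each step. This is a sketch based on the snippets above, not part of the original answer; numpy_dataset, batch_size and the dummy loss are made-up placeholders.

      import numpy as np
      import tensorflow as tf  # assumes the TF 1.x graph-mode API

      # Made-up toy dataset: 1000 rows of 20 float32 features.
      numpy_dataset = np.random.rand(1000, 20).astype(np.float32)
      batch_size = 32

      with tf.device('/gpu:0'):
          # The full dataset is stored once in GPU memory.
          tensorflow_dataset = tf.constant(numpy_dataset)

      # Only the scalar row offset crosses the host/GPU boundary each step.
      index = tf.placeholder(tf.int32, shape=[])
      batch = tf.slice(tensorflow_dataset, [index, 0], [batch_size, -1])

      # Stand-in for a real model/loss that consumes the minibatch.
      loss = tf.reduce_mean(batch)

      with tf.Session() as sess:
          for step in range(10):
              offset = (step * batch_size) % (numpy_dataset.shape[0] - batch_size)
              print(sess.run(loss, feed_dict={index: offset}))
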

    There are many possible variations around that theme, including using queues to prefetch the data to GPU dynamically.
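
    The queues this refers to are the older TF 1.x input pipeline; in current TensorFlow the same "prefetch the next batch to the GPU while the current one is being used" idea is usually written with tf.data. Below is a minimal sketch assuming TensorFlow 2.x and an available GPU; numpy_dataset and batch_size are again made-up placeholders.

      import numpy as np
      import tensorflow as tf  # assumes TensorFlow 2.x (eager execution, tf.data)

      numpy_dataset = np.random.rand(1000, 20).astype(np.float32)  # made-up data
      batch_size = 32

      dataset = (tf.data.Dataset.from_tensor_slices(numpy_dataset)
                 .shuffle(buffer_size=1000)
                 .batch(batch_size)
                 # prefetch_to_device should be the last transformation in the
                 # pipeline: it copies upcoming batches to GPU memory in advance.
                 .apply(tf.data.experimental.prefetch_to_device('/gpu:0')))

      for batch in dataset:
          pass  # each `batch` tensor is already resident on the GPU

    Compared with the tf.constant approach, this streams the data batch by batch, so it also works when the full dataset does not fit in GPU memory.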
