tensorflow store training data on GPU memory

小鲜肉 2020-12-13 21:35

I am pretty new to TensorFlow; I used to use Theano for deep learning development. I have noticed a difference between the two in where the input data can be stored.

In Theano the training data can be kept in GPU memory (for example in shared variables). Is there a way to store the training data in GPU memory in TensorFlow as well, rather than feeding it from host memory at every step?

2 Answers
  •  误落风尘
    2020-12-13 22:13

    It is possible, as has been indicated, but make sure that it is actually useful before devoting too much effort to it. At least at present, not every operation has GPU support, and the list of operations without such support includes some common batching and shuffling operations. There may be no advantage to putting your data on GPU if the first stage of processing is to move it to CPU.
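
    For context, "on-GPU storage" in TF 1.x usually means something like pinning a non-trainable variable that holds the whole dataset to the GPU and slicing mini-batches out of it on-device. A minimal sketch, assuming the dataset is small enough to fit in GPU memory (the array shape and variable names are purely illustrative):

    import numpy as np
    import tensorflow as tf

    train_x = np.random.rand(10000, 784).astype(np.float32)  # illustrative dataset

    with tf.device('/gpu:0'):
        # Keep the whole training set resident in GPU memory.
        data_gpu = tf.Variable(train_x, trainable=False)
        # Slice a mini-batch directly on the device.
        batch = data_gpu[:128]

    sess = tf.Session(config=tf.ConfigProto(log_device_placement=True))
    sess.run(tf.global_variables_initializer())
    first_batch = sess.run(batch)

    Whether this actually helps depends on what happens to the batch next, which is exactly what the checks below are meant to reveal.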

    Before trying to refactor code to use on-GPU storage, try at least one of the following:

    1) Start your session with device placement logging enabled, so that TensorFlow reports which device each op is executed on:

    import tensorflow as tf

    # Log which device (CPU or GPU) each op is assigned to when the graph runs.
    config = tf.ConfigProto(log_device_placement=True)
    sess = tf.Session(config=config)

    2) Try to place your graph on the GPU manually by putting its definition inside a with tf.device('/gpu:0'): block. This will raise an exception if any op in the block has no GPU implementation.
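
    For example, here is a minimal sketch assuming TF 1.x graph mode; tf.random_shuffle is used purely as an example of an op that, as far as I know, has no GPU kernel:

    import tensorflow as tf

    with tf.device('/gpu:0'):
        data = tf.random_uniform([1024, 32])
        # random_shuffle is one of the shuffling ops that (at least in TF 1.x)
        # only has a CPU kernel, so this placement cannot be satisfied.
        shuffled = tf.random_shuffle(data)

    # allow_soft_placement defaults to False, so the failed placement raises
    # InvalidArgumentError instead of silently falling back to the CPU.
    with tf.Session() as sess:
        try:
            sess.run(shuffled)
        except tf.errors.InvalidArgumentError as err:
            print("Could not place op on GPU:", err.message)

    If the run succeeds instead, every op in the block has a GPU kernel and keeping the data on the GPU is at least feasible.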
