Is there a way of determining how much GPU memory is in use by TensorFlow?

醉梦人生 2020-11-30 04:59

TensorFlow tends to preallocate the entire available memory on its GPUs. For debugging, is there a way of telling how much of that memory is actually in use?

4 Answers
  •  一个人的身影
    2020-11-30 05:20

    There's some code in tensorflow.contrib.memory_stats (TensorFlow 1.x) that will help with this:

    from tensorflow.contrib.memory_stats.python.ops.memory_stats_ops import BytesInUse
    with tf.device('/device:GPU:0'):  # Replace with device you are interested in
      bytes_in_use = BytesInUse()
    with tf.Session() as sess:
      print(sess.run(bytes_in_use))
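
    Note that tf.contrib was removed in TensorFlow 2.x. A rough TF2 equivalent, assuming TensorFlow >= 2.5 and at least one visible GPU, is tf.config.experimental.get_memory_info:

    ```python
    import tensorflow as tf

    # get_memory_info returns a dict with 'current' and 'peak' byte counts
    # for the given device (requires TF >= 2.5 and a visible GPU).
    if tf.config.list_physical_devices('GPU'):
        info = tf.config.experimental.get_memory_info('GPU:0')
        print(f"current: {info['current']} bytes, peak: {info['peak']} bytes")
    else:
        print("no GPU visible to TensorFlow")
    ```

    Unlike nvidia-smi, this reports only the memory TensorFlow's allocator has actually handed out, not the pool it has reserved from the driver.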
    
