Is there a way of determining how much GPU memory is in use by TensorFlow?

醉梦人生 2020-11-30 04:59

TensorFlow tends to preallocate all of the available memory on its GPUs. For debugging, is there a way of telling how much of that memory is actually in use?

4 Answers
  •  渐次进展
    2020-11-30 05:25

    Here's a practical solution that worked well for me:

    Disable GPU memory pre-allocation using TF session configuration:

    import tensorflow as tf

    # Allocate GPU memory on demand instead of grabbing it all up front
    config = tf.ConfigProto()
    config.gpu_options.allow_growth = True
    sess = tf.Session(config=config)
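    In TensorFlow 2.x, where sessions are gone, the same effect comes from enabling per-device memory growth. A minimal configuration sketch, assuming a TF 2.x install:

    ```python
    import tensorflow as tf

    # Enable on-demand allocation for every visible GPU; this must run
    # at program start, before any tensors are placed on the device.
    for gpu in tf.config.list_physical_devices("GPU"):
        tf.config.experimental.set_memory_growth(gpu, True)
    ```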
    

    Run `nvidia-smi -l` (or some other utility) to monitor GPU memory consumption.

    Step through your code with a debugger until you see the unexpected GPU memory consumption.
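    The `nvidia-smi` monitoring step can also be scripted from Python. A minimal sketch, assuming `nvidia-smi` is on the PATH; `gpu_memory_usage` and `parse_line` are hypothetical helper names, and the query flags used are part of nvidia-smi's documented `--query-gpu` interface:

    ```python
    import subprocess

    def parse_line(line):
        # A query line looks like "1234, 16384" (used MiB, total MiB)
        used, total = (int(v.strip()) for v in line.split(","))
        return used, total

    def gpu_memory_usage():
        """Return a list of (used_mib, total_mib) tuples, one per GPU."""
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        return [parse_line(line) for line in out.strip().splitlines()]
    ```

    Calling `gpu_memory_usage()` in a loop while your model runs gives a timeline of actual usage, independent of TensorFlow's allocator.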
