How do you free up GPU memory?

99封情书 · Submitted on 2019-12-07 16:21:17

Question


When running Theano, I get a "not enough memory" error (see below). What are some possible actions that can be taken to free up memory? I know I can close applications, etc., but I just want to see if anyone has other ideas. For example, is it possible to reserve memory?

THEANO_FLAGS=mode=FAST_RUN,device=gpu,floatX=float32 python conv_exp.py
Using gpu device 0: GeForce GT 650M
Trying to run under a GPU. If this is not desired, then modify network3.py to set the GPU flag to False.
Error allocating 156800000 bytes of device memory (out of memory). Driver report 64192512 bytes free and 1073414144 bytes total
Traceback (most recent call last):
  File "conv_exp.py", line 25, in <module>
    training_data, validation_data, test_data = network3.load_data_shared()
  File "/Users/xr/courses/deep_learning/con_nn/neural-networks-and-deep-learning/src/network3.py", line 78, in load_data_shared
    return [shared(training_data), shared(validation_data), shared(test_data)]
  File "/Users/xr/courses/deep_learning/con_nn/neural-networks-and-deep-learning/src/network3.py", line 74, in shared
    np.asarray(data[0], dtype=theano.config.floatX), borrow=True)
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/theano/compile/sharedvalue.py", line 208, in shared
    allow_downcast=allow_downcast, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/theano/sandbox/cuda/var.py", line 203, in float32_shared_constructor
    deviceval = type_support_filter(value, type.broadcastable, False, None)
MemoryError: ('Error allocating 156800000 bytes of device memory (out of memory).', "you might consider using 'theano.shared(..., borrow=True)'")


Answer 1:


If borrow is set to true, garbage collection is on (the default: config.allow_gc=True), and the video card is not currently being used as a display device (doubtful, since you're using a mobile GPU), then the only remaining options are to reduce the parameters of the network or the batch size of the model. The latter is especially effective if the model uses dropout or noise-based masks, since those masks scale with the number of examples in the batch × the number of parameters being dropped out or noised. A rough sketch of these knobs is below.
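Here is a minimal sketch of how those settings might look in a setup like the one in the traceback. The shared_float32 helper and the specific mini_batch_size value are assumptions for illustration, not code from the original question or from network3.py:

    import numpy as np
    import theano

    # Keep Theano's garbage collection on so intermediate device buffers are
    # freed between operations (this is already the default; shown only to be
    # explicit).
    theano.config.allow_gc = True

    # Hypothetical helper: create shared variables with borrow=True, as the
    # MemoryError message itself suggests, to avoid keeping an extra host-side
    # copy of the array.
    def shared_float32(data):
        return theano.shared(
            np.asarray(data, dtype=theano.config.floatX),
            borrow=True)

    # Reducing the mini-batch size shrinks dropout/noise masks
    # (batch size x number of parameters dropped out or noised), which
    # directly lowers the per-step GPU allocations.
    mini_batch_size = 5  # assumed value; smaller than whatever was used before

The garbage-collection setting can also be passed on the command line along with the flags already used in the question, e.g. THEANO_FLAGS=mode=FAST_RUN,device=gpu,floatX=float32,allow_gc=True python conv_exp.py.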

Otherwise, maybe you could boot to the command prompt to save a few MB? :/



Source: https://stackoverflow.com/questions/32936166/how-do-you-free-up-gpu-memory
