CUDA and PyTorch memory usage

Submitted by 。_饼干妹妹 on 2021-02-11 16:41:29

Question


I am using CUDA and PyTorch 1.4.0.

When I try to increase batch_size, I've got the following error:

CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 4.00 GiB total capacity; 2.74 GiB already allocated; 7.80 MiB free; 2.96 GiB reserved in total by PyTorch)

I haven't found anything explaining how PyTorch manages its memory usage.

Also, I don't understand why only 7.80 MiB are free.
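The numbers in the error message roughly account for this. PyTorch's caching allocator reserves a pool of device memory ("reserved in total by PyTorch"); the "free" figure is what is left on the whole GPU after that reservation and whatever the OS and other processes hold (on Windows 10, the display driver typically occupies a sizable chunk of a GTX 1050 Ti's 4 GiB). A rough accounting sketch, using only the figures from the message (the "other" term is inferred, not reported by PyTorch):

```python
# All figures in MiB, taken straight from the error message.
GIB = 1024  # MiB per GiB

total     = 4.00 * GIB   # GPU 0 total capacity
reserved  = 2.96 * GIB   # reserved in total by PyTorch (caching allocator)
allocated = 2.74 * GIB   # already allocated to live tensors
free      = 7.80         # reported free on the device

# Memory held outside PyTorch (Windows display driver, other processes):
other = total - reserved - free
print(f"used outside PyTorch: ~{other:.0f} MiB")

# Cached-but-unused memory still reusable inside PyTorch's own pool:
cached_slack = reserved - allocated
print(f"reusable inside PyTorch's cache: ~{cached_slack:.0f} MiB")
```

So roughly 1 GiB is consumed outside PyTorch, and only about 225 MiB of slack remains inside PyTorch's reserved pool, which is why a 20 MiB allocation can still fail once that slack is fragmented or exhausted.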

Should I just use a video card with better performance, or can I free some memory? FYI, I have a GTX 1050 Ti, Python 3.7, torch==1.4.0, and my OS is Windows 10.


Answer 1:


I had the same problem, the following worked for me:

import torch

torch.cuda.empty_cache()  # release cached, unused blocks back to the driver
# start training from here

If you still get the error after this, you should decrease the batch_size.
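A common way to combine both pieces of advice is to catch the OOM RuntimeError, clear the cache, and retry with a smaller batch. A minimal sketch of that pattern, where `train_one_epoch` is a hypothetical stand-in that merely simulates running out of memory above a certain batch size (your real training function would go there instead):

```python
def train_one_epoch(batch_size):
    """Hypothetical trainer: raises the same error PyTorch raises on OOM."""
    if batch_size > 16:  # pretend 16 is the largest batch that fits
        raise RuntimeError("CUDA out of memory.")
    return batch_size    # the batch size that succeeded

def fit_with_backoff(batch_size):
    """Halve the batch size until training no longer runs out of memory."""
    while batch_size >= 1:
        try:
            return train_one_epoch(batch_size)
        except RuntimeError as e:
            if "out of memory" not in str(e):
                raise  # some other error; don't swallow it
            # torch.cuda.empty_cache()  # with PyTorch, clear the cache here
            batch_size //= 2
    raise RuntimeError("even batch_size=1 does not fit")

print(fit_with_backoff(64))  # 64 -> 32 -> settles at 16
```

The halving is a simple heuristic; in practice you might also enable gradient accumulation to keep the effective batch size while lowering the per-step memory footprint.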



Source: https://stackoverflow.com/questions/60276672/cuda-and-pytorch-memory-usage
