R memory management / cannot allocate vector of size n Mb

攒了一身酷 2020-11-21 07:18

I am running into issues trying to use large objects in R. For example:

> memory.limit(4000)
> a = matrix(NA, 1500000, 60)
> a = matrix(NA, 2500000, 60)
Error: cannot allocate vector of size 572.2 Mb

8 Answers
  •  没有蜡笔的小新
    2020-11-21 08:01

    For Windows users, the following helped me a lot to understand some memory limitations:

    • before opening R, open the Windows Resource Monitor (Ctrl-Alt-Delete / Start Task Manager / Performance tab / click the 'Resource Monitor' button at the bottom / Memory tab)
    • you will see how much RAM is already used before you open R, and by which applications. In my case, 1.6 GB of the total 4 GB is used, so I can only get about 2.4 GB for R. But now comes the worst part...
    • open R and create a data set of 1.5 GB, then reduce its size to 0.5 GB; the Resource Monitor shows my RAM is still used at nearly 95%.
    • use gc() to do garbage collection => it works: I can see the memory use go down to 2 GB (a minimal sketch follows this list)
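
    A minimal sketch of that gc() step, assuming an illustrative object (the name and size here are not from the original post):

        x <- matrix(NA, 1500000, 60)   # a large logical matrix, roughly 343 MB
        rm(x)                          # drop the only reference to it
        gc()                           # reclaim the memory and print usage statistics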


    Additional advice that works on my machine:

    • prepare the features, save them as an RData file, close R, re-open R, and load the train features. The Resource Monitor typically shows a lower memory usage afterwards, which means that even gc() does not recover all possible memory; closing and re-opening R is the best way to start with the maximum memory available (a sketch of this workflow follows this list).
    • the other trick is to load only the train set for training (do not load the test set, which can typically be half the size of the train set). The training phase can use memory to the maximum (100%), so anything available helps. Take all this with a grain of salt, as I am still experimenting with R's memory limits.
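
    A sketch of that save/reload workflow, assuming hypothetical object and file names:

        save(train_features, file = "train_features.RData")  # persist the prepared features
        q()                            # quit R so the OS reclaims all of its memory
        ## ...then, in a fresh R session:
        load("train_features.RData")   # restores train_features under its original name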
