Increasing memory limit in R for Mac

Submitted by 随声附和 on 2020-02-02 12:25:14

Question


I have been looking at solutions posted online for how to increase the memory limit for R, but those solutions seem to work only on Windows or Linux systems.

I am using macOS Mojave 10.14.5, with 8 GB of memory and a 2.3 GHz Intel Core i5. My RStudio is 64-bit, version 1.1.453.

Here's the report from the gc function:

> gc()
           used  (Mb) gc trigger   (Mb) limit (Mb)  max used   (Mb)
Ncells  6453699 344.7   11897884  635.5         NA  11897884  635.5
Vcells 44221701 337.4  179064532 1366.2       7168 219267441 1672.9

I am wondering why the limits for the Ncells and Vcells are so low -- 635.5 Mb and 1672.9 Mb. Does this mean R is currently only using that amount of memory? That is my suspicion, so I want to increase the limit.
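My understanding (please correct me if this is wrong) is that the 635.5 and 1672.9 figures are the "max used" high-water marks rather than hard caps, and that the only hard cap shown is the 7168 Mb Vcells limit. A quick sanity check of the numbers above, assuming 8 bytes per Vcell and 56 bytes per Ncell on a 64-bit build:

    44221701 * 8  / 1024^2   # ~337.4 Mb, matches the Vcells "used (Mb)" column
    6453699  * 56 / 1024^2   # ~344.7 Mb, matches the Ncells "used (Mb)" column
    7168 / 1024              # the Vcells limit works out to 7 GiB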

What I am trying to do is merge a data frame with 227,795 rows with another data frame that has the same number of rows but different columns. This gives me an error:

Error: vector memory exhausted (limit reached?) 

This error is also occurring when I try to build a large matrix of distances between 227,796 sets of coordinates.
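Some back-of-envelope arithmetic (assuming a dense matrix of 8-byte doubles) suggests that particular object could never fit in 8 GB of RAM regardless of any limit:

    n <- 227796
    n^2 * 8 / 1024^3              # ~386.6 GiB for a full n x n distance matrix
    n * (n - 1) / 2 * 8 / 1024^3  # ~193.3 GiB even storing only the lower triangle, as dist() does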

Does anyone have a solution for increasing R's memory limit on a Mac? It would be great if there were a memory.limit() equivalent for Mac.
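For what it's worth, the closest thing I have found suggested for macOS (untested by me, so treat it as a sketch) is raising the R_MAX_VSIZE environment variable, for example in ~/.Renviron, and then restarting R:

    # In ~/.Renviron (read by R at startup) -- the 16Gb value is just an example:
    # R_MAX_VSIZE=16Gb

    # Or set it in the shell before launching R:
    # export R_MAX_VSIZE=16Gb

    # After restarting, confirm the value from inside R:
    Sys.getenv("R_MAX_VSIZE")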

Source: https://stackoverflow.com/questions/56737430/increasing-memory-limit-in-r-for-mac
