Error: vector memory exhausted (limit reached?) R 3.5.0 macOS

爱一瞬间的悲伤 2020-12-01 03:57

I've been working for a while with a number of large files containing gene expression data, and I've recently run into an issue with loading that data into R after upgrading to R 3.5.0.

3 Answers
  •  温柔的废话
    2020-12-01 04:17

    R 3.5 has a new system limit on memory allocation. From the release notes:

    The environment variable R_MAX_VSIZE can now be used to specify the maximal vector heap size. On macOS, unless specified by this environment variable, the maximal vector heap size is set to the maximum of 16GB and the available physical memory. This is to avoid having the R process killed when macOS over-commits memory.

    You can override this. You risk over-allocating and having the process killed, but that is probably what was already happening if you hit a hard wall with R 3.4.4 or whatever version you were using before.

    Execute the following in Terminal to create a temporary environment variable R_MAX_VSIZE with a value of 32 GB (adjust to suit): export R_MAX_VSIZE=32000000000

    Or, if you don't want to open Terminal and run that every time you start an R session, append the same line to your bash profile: open Terminal, run open .bash_profile, and add the line above in the text editor that opens.

    You will still have to open Terminal and start R from there. You can run R in the terminal by executing R, or you can open the GUI with open -n /Applications/R.app.

    To make this change within an R session, use Sys.setenv('R_MAX_VSIZE' = 32000000000); to check the value, use Sys.getenv('R_MAX_VSIZE').
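    The bash-profile steps above can be sketched as a short shell session. The ~/.bash_profile path and the 32 GB figure come from the answer; adjust both to your own machine:

    ```shell
    # Path to the profile file read by login shells on macOS (from the answer above).
    # Point PROFILE at a scratch file first if you want to try this safely.
    PROFILE="${PROFILE:-$HOME/.bash_profile}"

    # Append a persistent R_MAX_VSIZE setting
    # (32000000000 bytes is roughly 32 GB; change to suit your RAM).
    echo 'export R_MAX_VSIZE=32000000000' >> "$PROFILE"

    # Load it into the current shell without reopening Terminal.
    source "$PROFILE"

    # Confirm the variable is set before launching R from this shell.
    echo "$R_MAX_VSIZE"
    ```

    Any R process started from this shell (plain R or open -n /Applications/R.app) inherits the variable; an R.app launched from Finder will not.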
