Extend memory size limit in R

有刺的猬 2020-12-18 13:55

I have an R program that combines 10 files, each 296 MB in size, and I have increased the memory limit to 8 GB (the size of my RAM):

--max-mem-size=8192M
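
A minimal sketch of how that limit is applied, assuming R on Windows (where the --max-mem-size flag and memory.limit() are available); the 8192 value simply mirrors the question:

## Start R with the higher limit from the Windows command line:
##   Rgui.exe --max-mem-size=8192M

## Or inspect/raise it from inside a running session:
memory.limit()             # current limit in MB
memory.limit(size = 8192)  # raise to 8 GB; the limit can only be increased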
         


        
4 answers
  •  感动是毒
    2020-12-18 14:25

    I suggest following the advice in the "Memory usage" section of ?read.csv2:

    Memory usage:

     These functions can use a surprising amount of memory when reading
     large files.  There is extensive discussion in the ‘R Data
     Import/Export’ manual, supplementing the notes here.
    
     Less memory will be used if ‘colClasses’ is specified as one of
     the six atomic vector classes.  This can be particularly so when
     reading a column that takes many distinct numeric values, as
     storing each distinct value as a character string can take up to
     14 times as much memory as storing it as an integer.
    
     Using ‘nrows’, even as a mild over-estimate, will help memory
     usage.
    
     Using ‘comment.char = ""’ will be appreciably faster than the
     ‘read.table’ default.
    
     ‘read.table’ is not the right tool for reading large matrices,
     especially those with many columns: it is designed to read _data
     frames_ which may have columns of very different classes.  Use
     ‘scan’ instead for matrices.
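
    A minimal sketch of how those hints might be applied to the ten files in the question; the file names and the three-column colClasses layout are assumptions, not taken from the question:

        files <- sprintf("part%02d.csv", 1:10)   # hypothetical file names

        read_one <- function(f) {
          # colClasses avoids storing numeric values as character strings;
          # nrows as a mild over-estimate helps memory usage;
          # comment.char = "" is already the read.csv2 default, but is
          # worth setting explicitly if read.table is used instead.
          read.csv2(f,
                    colClasses = c("integer", "numeric", "character"),
                    nrows = 2e6,
                    comment.char = "")
        }

        combined <- do.call(rbind, lapply(files, read_one))

    For data that is really a matrix rather than a data frame, scan() (as the help text notes) avoids the per-column machinery of read.table entirely.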
    
