I have an R program that combines 10 files, each 296 MB in size, and I have increased the memory limit to 8 GB (the size of my RAM):
--max-mem-size=8192M
Memory allocation in R needs contiguous blocks, and the size a file takes on disk may not be a good indicator of how large the object is once loaded into R. Can you look at one of these saved files with the function:
?object.size
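A minimal sketch of that check, assuming one of the saved files is named "data1.RData" (the file name is hypothetical):

```r
# Load one saved file into a fresh environment so it doesn't
# clobber anything in the global environment, then measure each
# object it contains with object.size().
e <- new.env()
load("data1.RData", envir = e)
sapply(ls(envir = e), function(x) object.size(get(x, envir = e)))
```

Comparing these in-memory sizes against the 296 MB on-disk size will show how much the objects expand when loaded.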
Here is a function I use to see what is taking up the most space in R:
getsizes <- function() {
  # Measure every object in the global environment and
  # return the ten largest as a one-column matrix.
  z <- sapply(ls(envir = globalenv()),
              function(x) object.size(get(x, envir = globalenv())))
  as.matrix(rev(sort(z))[1:10])
}
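For example, with getsizes() defined as above and a couple of throwaway objects in the workspace (the object names here are illustrative):

```r
big_vec <- rnorm(1e6)            # numeric vector, roughly 8 MB
big_mat <- matrix(0, 1000, 1000) # numeric matrix, roughly 8 MB
getsizes()  # lists the largest objects in the global environment, biggest first
```

Running this after each file is combined makes it easy to spot which intermediate objects are eating the 8 GB; rm() the ones you no longer need and call gc() to return the memory.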