doing PCA on very large data set in R
Question (migrated from Cross Validated because it can be answered on Stack Overflow, 7 years ago). I have a very large training set (~2 GB) in a CSV file. The file is too large to read directly into memory (`read.csv()` brings the computer to a halt), and I would like to reduce the size of the data using PCA. The problem is that, as far as I can tell, I need to read the file into memory in order to run a PCA algorithm (e.g., `princomp()`). I have tried the bigmemory package.
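One way around loading the whole file is to note that PCA only needs the column means and the covariance matrix, both of which can be accumulated while streaming the CSV in chunks. Below is a minimal sketch of that idea in base R; the chunk size and the assumption of an all-numeric CSV with a header row are illustrative choices, not requirements of any particular package.

```r
# Sketch: PCA on a CSV too large for memory, by accumulating the
# cross-product matrix X'X and the column sums chunk by chunk,
# then eigendecomposing the resulting covariance matrix.
# Assumes: all-numeric columns, one header row.
chunked_pca <- function(file, chunk_size = 10000) {
  con <- file(file, open = "r")
  on.exit(close(con))
  # Consume the header line; only the column count is needed here.
  hdr <- strsplit(readLines(con, n = 1), ",")[[1]]
  p <- length(hdr)

  xtx    <- matrix(0, p, p)  # running X'X
  colsum <- numeric(p)       # running column sums
  n      <- 0                # running row count

  repeat {
    # read.csv() errors when the connection is exhausted; treat that as EOF.
    chunk <- tryCatch(
      read.csv(con, header = FALSE, nrows = chunk_size),
      error = function(e) NULL
    )
    if (is.null(chunk) || nrow(chunk) == 0) break
    x <- as.matrix(chunk)
    xtx    <- xtx + crossprod(x)   # t(x) %*% x, accumulated
    colsum <- colsum + colSums(x)
    n      <- n + nrow(x)
  }

  mu <- colsum / n
  # Sample covariance from the accumulated sufficient statistics:
  # cov = (X'X - n * mu mu') / (n - 1)
  covmat <- (xtx - n * tcrossprod(mu)) / (n - 1)
  eig <- eigen(covmat, symmetric = TRUE)
  list(center = mu, rotation = eig$vectors, sdev = sqrt(pmax(eig$values, 0)))
}
```

To actually shrink the data set, you would stream the file a second time and project each chunk onto the first few columns of `rotation` (after subtracting `center`), writing the scores back out. Note that forming the covariance this way can be less numerically stable than `prcomp()`'s SVD for badly scaled data.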