R reading a huge csv

慢半拍i 2020-12-23 23:12

I have a huge CSV file. Its size is around 9 GB, and I have 16 GB of RAM. I followed the advice from the page and implemented it below.

If you get the error

5 Answers
  •  旧时难觅i
    2020-12-23 23:22

    You could try splitting your processing of the table into chunks. Instead of operating on the whole thing at once, put the operation inside a for loop and run it over 16, 32, 64, or however many chunks you need. Any values you need for later computation can be saved as you go (a sketch of this follows the loop below). This isn't as fast as the approaches in the other answers, but it will definitely finish.

    con <- file("huge_file.csv", open = "r")               # placeholder filename; open once so each read resumes where the last stopped
    cols <- strsplit(readLines(con, n = 1), ",")[[1]]      # column names from the header row
    x <- ceiling(number_of_rows_in_file / CHUNK_SIZE)      # number of chunks to read
    for (i in seq_len(x)) {
        chunk <- read.csv(con, nrows = CHUNK_SIZE, header = FALSE, col.names = cols)
        # ... process `chunk` here, keeping only the values you need later ...
    }
    close(con)
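
    As a concrete sketch of saving values for later computation (assuming, hypothetically, a numeric column named `value`, the placeholder filename above, and the same `CHUNK_SIZE` / `number_of_rows_in_file` placeholders), you can carry a running aggregate across chunks instead of keeping the chunks themselves:

    running_total <- 0                                     # value carried across chunks
    con <- file("huge_file.csv", open = "r")               # placeholder filename, as above
    cols <- strsplit(readLines(con, n = 1), ",")[[1]]      # column names from the header row
    for (i in seq_len(ceiling(number_of_rows_in_file / CHUNK_SIZE))) {
        chunk <- read.csv(con, nrows = CHUNK_SIZE, header = FALSE, col.names = cols)
        running_total <- running_total + sum(chunk$value)  # keep only the aggregate
        rm(chunk)                                          # drop the chunk before reading the next one
    }
    close(con)

    Each chunk is discarded once its contribution is saved, so peak memory stays at roughly one chunk rather than the whole 9 GB table.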
    

    Hope that helps.
