Practical limits of R data frame

Backend · Unresolved · 5 answers · 1040 views
梦如初夏
梦如初夏 2020-11-30 20:21

I have been reading about how read.table is not efficient for large data files, and also how R is not suited for large data sets. So I was wondering where I can find out what the practical limits are.

5 Answers
  •  北荒
    北荒 (OP)
    2020-11-30 20:52

    R is suited for large data sets, but you may have to change your way of working somewhat from what the introductory textbooks teach you. I did a post on Big Data for R which crunches a 30 GB data set and which you may find useful for inspiration.

    The usual sources for information to get started are High-Performance Computing Task View and the R-SIG HPC mailing list at R-SIG HPC.
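    A large part of read.table's reputation for slowness comes from its defaults: it guesses column types and scans for quotes and comments on every field. A minimal sketch of the standard speed-ups (the sample file here is created only for illustration; on a real file you would supply your own path, column classes, and an approximate row count):

    ```r
    # Create a small sample CSV just so the example is self-contained.
    tmp <- tempfile(fileext = ".csv")
    write.csv(data.frame(id = 1:5, x = runif(5)), tmp, row.names = FALSE)

    df <- read.table(tmp, header = TRUE, sep = ",",
                     colClasses = c("integer", "numeric"), # skip type guessing
                     nrows = 5,          # a rough row count lets R pre-allocate
                     comment.char = "") # disable comment scanning
    unlink(tmp)
    ```

    Supplying colClasses and nrows avoids the re-reading and re-allocation that make naive read.table calls slow on multi-gigabyte files.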

    The main limit you have to work around is a historic cap of 2^31 - 1 elements on the length of a vector, which would not be so bad if R did not also store matrices as single vectors. (The limit exists for compatibility with some BLAS libraries.)
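    You can check the limit directly from R; the numbers below follow from 2^31 - 1 being the largest 32-bit signed integer:

    ```r
    # The historic vector length cap is the largest 32-bit signed integer.
    .Machine$integer.max               # 2147483647
    2^31 - 1 == .Machine$integer.max   # TRUE

    # Because a matrix is stored as one vector, a 50,000 x 50,000 matrix
    # (2.5e9 cells) exceeds this cap, even though each dimension is small:
    5e4 * 5e4 > .Machine$integer.max   # TRUE
    ```

    Note that since R 3.0.0, "long vectors" of up to 2^52 elements are supported on 64-bit builds, though some functions and packages (notably those calling into 32-bit-indexed BLAS routines) still assume the older limit.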

    We regularly analyse telco call data records and marketing databases with multi-million customers using R, so would be happy to talk more if you are interested.
