Efficient alternatives to merge for larger data.frames in R

花落未央 2020-12-02 11:27

I am looking for a method to merge two larger data frames (> 1 million rows / 300 KB RData file) that is efficient both in computing resources and in learning/implementation effort.

3 Answers
  •  感动是毒
    2020-12-02 11:57

    Here are some timings comparing the data.table and data.frame merge methods.
    Using data.table is much faster. Regarding memory, I can informally report that the two methods are very similar (within 20%) in RAM use.

    library(data.table)
    
    set.seed(1234)
    n = 1e6
    
    # Two data frames sharing an "id" column, with ids in shuffled order
    data_frame_1 = data.frame(id=paste("id_", 1:n, sep=""),
                              factor1=sample(c("A", "B", "C"), n, replace=TRUE))
    data_frame_2 = data.frame(id=sample(data_frame_1$id),
                              value1=rnorm(n))
    
    # data.table copies keyed (sorted and indexed) on "id"
    data_table_1 = data.table(data_frame_1, key="id")
    data_table_2 = data.table(data_frame_2, key="id")
    
    # merge() on plain data frames
    system.time(df.merged <- merge(data_frame_1, data_frame_2))
    #   user  system elapsed 
    # 17.983   0.189  18.063 
    
    # merge() dispatches to the keyed data.table method
    system.time(dt.merged <- merge(data_table_1, data_table_2))
    #   user  system elapsed 
    #  0.729   0.099   0.821 
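    Besides merge(), data.table also supports the bracket join syntax X[Y], which looks up the rows of Y in X using the key via binary search. A minimal sketch (my own example, with a smaller n than the benchmark above so it runs quickly; when every id appears exactly once in both tables, the result matches merge() up to column order):

    ```r
    library(data.table)

    set.seed(1234)
    n <- 1e5

    # Both tables keyed on "id" at construction time
    dt1 <- data.table(id = paste0("id_", 1:n),
                      factor1 = sample(c("A", "B", "C"), n, replace = TRUE),
                      key = "id")
    dt2 <- data.table(id = sample(dt1$id),
                      value1 = rnorm(n),
                      key = "id")

    # Keyed join: for each row of dt2, find the matching row of dt1 by "id"
    joined <- dt1[dt2]

    # Same rows as merge() here, since the ids match one-to-one
    merged <- merge(dt1, dt2)
    dim(joined)
    dim(merged)
    ```

    The join happens directly on the sorted key, which is where the speedup in the timings above comes from; merge() on keyed data.tables uses the same mechanism internally.
    
    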
    
