Python: Fast and efficient way of writing large text file

心在旅途  2020-12-11 06:33

I have a speed/efficiency-related question about Python:

I need to write a large number of very large R-dataframe-ish files, about 0.5-2 GB each. This is basically

3 answers
  •  臣服心动
    2020-12-11 07:19

    Seems like pandas might be a good tool for this problem. It's pretty easy to get started with pandas, and it deals well with most ways you might need to get data into Python. Pandas handles mixed data (floats, ints, strings) well, and can usually detect the column types on its own.

    Once you have an (R-like) data frame in pandas, it's pretty straightforward to output the frame to csv.

    DataFrame.to_csv(path_or_buf, sep='\t')
    

    There's a bunch of other configuration things you can do to make your tab separated file just right.

    http://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.to_csv.html
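    A minimal sketch of the approach described above, assuming a small stand-in DataFrame (the column names here are hypothetical). `sep='\t'` produces the tab-separated output, `index=False` drops the row index, and `chunksize` makes `to_csv` write the frame in batches, which helps keep memory use bounded for multi-GB frames.

    ```python
    import io

    import pandas as pd

    # Small stand-in for a large R-style data frame with mixed
    # column types (ints, floats, strings).
    df = pd.DataFrame({
        "id": [1, 2, 3],
        "value": [0.5, 1.25, 2.0],
        "label": ["a", "b", "c"],
    })

    # In practice path_or_buf would be a file path; an in-memory
    # buffer is used here so the example is self-contained.
    buf = io.StringIO()
    df.to_csv(buf, sep="\t", index=False, chunksize=100_000)

    print(buf.getvalue().splitlines()[0])  # → id	value	label
    ```

    For real 0.5-2 GB files you would pass a path instead of a buffer, and pandas streams the rows to disk chunk by chunk.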
