Error tokenizing data. C error: out of memory pandas python, large file csv


I have a large CSV file of 3.5 GB and I want to read it using pandas.

This is my code:

import pandas as pd
tp = pd.read_csv('train_2011_2012_2013.csv',
                 chunksize=20000000)  # chunksize value referenced in the answer below; the remaining arguments were cut off in the post
4 Answers
  •  Happy的楠姐
    2021-02-02 01:42

    This error can also be caused by the chunksize=20000000 itself. Decreasing that value fixed the issue in my case. In ℕʘʘḆḽḘ's solution the chunksize is also decreased, which might have been what did the trick.
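
    A minimal sketch of the chunked approach, reusing the file name from the question; the chunksize of 200000 rows and the per-chunk processing are illustrative, not from the original posts:

        import pandas as pd

        # A much smaller chunksize (200000 rows here) keeps each parse
        # step within memory; read_csv with chunksize returns an
        # iterator that yields one DataFrame per chunk.
        reader = pd.read_csv('train_2011_2012_2013.csv', chunksize=200000)

        total_rows = 0
        for chunk in reader:
            # Process each chunk on its own (aggregate, filter, write out)
            # instead of holding the whole 3.5 GB file in memory at once.
            total_rows += len(chunk)

        print(total_rows)

    If you really need a single DataFrame at the end, pd.concat(reader, ignore_index=True) also works with a smaller chunksize, but it still has to hold the full result in memory.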
