I have a 7GB CSV file which I'd like to split into smaller chunks, so it is readable and faster for analysis in Python in a notebook. I would like to grab a small sample of it to work with.
I had to do a similar task and used the pandas package:
import pandas as pd

# Read the large CSV in chunks of 500,000 rows and write each chunk to its own file
for i, chunk in enumerate(pd.read_csv('bigfile.csv', chunksize=500000)):
    chunk.to_csv('chunk{}.csv'.format(i), index=False)
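If you only want a small sample rather than the whole file split up, a minimal sketch (assuming the same 'bigfile.csv' name) is to read just the first N rows with pandas' nrows parameter:

import pandas as pd

# Read only the first 100,000 rows, so the full 7GB is never loaded into memory
sample = pd.read_csv('bigfile.csv', nrows=100000)
sample.to_csv('sample.csv', index=False)

Either way, you can then open a single chunk file (or the sample) in the notebook with a normal pd.read_csv call.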