Python CSV to SQLite

Asked by 误落风尘 on 2020-11-28 05:22

I am "converting" a large (~1.6 GB) CSV file and inserting specific fields of the CSV into a SQLite database. Essentially my code looks like:

import csv, sqlite3
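
(The rest of the snippet is cut off above; roughly, the pattern is a plain csv.reader loop doing row-by-row INSERTs. A minimal sketch of that pattern, with hypothetical file, table, and column names:)

import csv, sqlite3

conn = sqlite3.connect('output.db')          # hypothetical database file
cur = conn.cursor()
cur.execute('CREATE TABLE IF NOT EXISTS orders (field2 TEXT, field4 TEXT)')

with open('path/to/your/file.csv', newline='') as f:
    reader = csv.reader(f)
    next(reader)                             # skip the header row
    for row in reader:
        # insert only the specific fields of interest
        cur.execute('INSERT INTO orders (field2, field4) VALUES (?, ?)',
                    (row[1], row[3]))

conn.commit()
conn.close()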


        
5 Answers
  •  孤城傲影
    2020-11-28 06:02

    Pandas makes it easy to load big files into databases. Read the CSV file into a Pandas DataFrame and then use the Pandas SQL writer, so Pandas does all the hard work. The snippet below writes the data to SQLite in 100,000-row chunks.

    import pandas as pd
    import sqlite3

    conn = sqlite3.connect('orders.db')  # hypothetical database file

    # read the CSV, then write it to SQLite in 100,000-row batches
    orders = pd.read_csv('path/to/your/file.csv')
    orders.to_sql('orders', conn, if_exists='append', index=False, chunksize=100000)
    

    Modern Pandas versions are very performant. Don't reinvent the wheel. See here for more info.
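
    If the whole ~1.6 GB file is too large to hold in one DataFrame, read_csv also accepts a chunksize argument and yields the data in pieces. A rough sketch along the same lines (database path and table name are placeholders):

    import pandas as pd
    import sqlite3

    conn = sqlite3.connect('orders.db')  # hypothetical database file

    # read_csv(chunksize=...) returns an iterator of DataFrames,
    # so only one 100,000-row chunk is held in memory at a time
    for chunk in pd.read_csv('path/to/your/file.csv', chunksize=100000):
        chunk.to_sql('orders', conn, if_exists='append', index=False)

    conn.close()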
