I read this: Importing a CSV file into a sqlite3 database table using Python
and it seems that everyone suggests using line-by-line reading instead of using SQLite's bulk .import.
Divide your data into chunks on the fly using generator expressions, and make the inserts inside a transaction. Here's a quote from the SQLite optimization FAQ:
> Unless already in a transaction, each SQL statement has a new transaction started for it. This is very expensive, since it requires reopening, writing to, and closing the journal file for each statement. This can be avoided by wrapping sequences of SQL statements with BEGIN TRANSACTION; and END TRANSACTION; statements. This speedup is also obtained for statements which don't alter the database.
Here's what your code might look like.
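A minimal sketch, assuming a table `people(name, age)` and a CSV file `data.csv` with a header row (all names are placeholders); it uses a generator function built on `itertools.islice` to produce the chunks, standing in for the generator expressions mentioned above:

```python
import csv
import sqlite3
from itertools import islice

def chunks(rows, size=1000):
    """Yield lists of up to `size` rows from an iterator of CSV rows."""
    while True:
        chunk = list(islice(rows, size))
        if not chunk:
            return
        yield chunk

conn = sqlite3.connect("data.db")  # placeholder database file
conn.execute("CREATE TABLE IF NOT EXISTS people (name TEXT, age INTEGER)")

with open("data.csv", newline="") as f:  # placeholder CSV file
    reader = csv.reader(f)
    next(reader)  # skip the header row (assumed present)
    for chunk in chunks(reader):
        # Each `with conn:` block is one transaction: executemany inserts
        # the whole chunk, and the context manager commits on success
        # (or rolls back on error), so the journal is written once per
        # chunk rather than once per row.
        with conn:
            conn.executemany("INSERT INTO people (name, age) VALUES (?, ?)", chunk)

conn.close()
```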
Also, SQLite itself can import CSV files via the .import command in its command-line shell.
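For reference, a rough sketch of that in the sqlite3 shell, using the same placeholder file and table names as above (exact dot-command behavior can vary by version):

```
sqlite> .mode csv
sqlite> .import data.csv people
```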