> My company gets a set of CSV files full of bank account info each month that I need to import into a database. Some of these files can be pretty big. For example, one is abo
I don't like some of the other answers :)
I used to do this at a job.
You write a program that creates one big SQL script full of INSERT statements, one per line. Then you run the script. You can save the script for future reference (a cheap log). Gzip it and it will shrink by about 90%.
You don't need any fancy tools and it really doesn't matter what database you are using.
You can do a few hundred INSERTs per transaction or all of them in one transaction; it's up to you.
Python is a good language for this, but I'm sure PHP is fine too.
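Here's a minimal sketch of that generator in Python. The file names (accounts.csv, accounts.sql.gz), the accounts table, and its columns are placeholders; it streams one CSV row at a time and wraps every few hundred INSERTs in a transaction:

```python
import csv
import gzip

BATCH_SIZE = 500  # INSERTs per transaction; tune to taste

def sql_quote(value):
    # Minimal escaping for SQL string literals; assumes plain text fields.
    # Everything is emitted as a quoted literal; most databases will
    # coerce '123' to a number where the column type requires it.
    return "'" + value.replace("'", "''") + "'"

with open("accounts.csv", newline="") as src, \
        gzip.open("accounts.sql.gz", "wt") as out:
    reader = csv.reader(src)
    columns = next(reader)  # assume the first row names the columns
    col_list = ", ".join(columns)
    rows = 0
    for row in reader:  # csv.reader streams one row at a time
        if rows % BATCH_SIZE == 0:
            if rows:
                out.write("COMMIT;\n")
            out.write("BEGIN;\n")
        values = ", ".join(sql_quote(field) for field in row)
        out.write(f"INSERT INTO accounts ({col_list}) VALUES ({values});\n")
        rows += 1
    if rows:
        out.write("COMMIT;\n")
```

You can replay it later with something like `gunzip -c accounts.sql.gz | psql yourdb` (or your database's command-line client), and keep the .gz file around as your cheap log.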
If you hit performance problems, some databases ship a dedicated bulk loader that is faster than plain INSERT statements (Oracle's SQL*Loader, PostgreSQL's COPY, MySQL's LOAD DATA INFILE).
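As a rough sketch of the Oracle route, SQL*Loader reads a small control file (all table, column, and file names here are made up):

```
-- accounts.ctl (hypothetical names)
LOAD DATA
INFILE 'accounts.csv'
APPEND INTO TABLE accounts
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(account_id, owner_name, balance)
```

and you run it with something like `sqlldr userid=user/password control=accounts.ctl`.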
You shouldn't run out of memory, because you only need to parse one line at a time. There is no need to hold the whole file in memory, so don't do that!