My company gets a set of CSV files full of bank account info each month that I need to import into a database. Some of these files can be pretty big; for example, one is about 33MB.
First: 33MB is not big. MySQL can easily handle data of this size.
As you noticed, row-by-row insertion is slow. Using an ORM on top of that is even slower: there's overhead for building objects, serialization, and so on. Using an ORM to do this across 35 tables is even slower. Don't do this.
You can indeed use LOAD DATA INFILE; just write a script that transforms your data into the desired format, separating it into per-table files in the process. You can then LOAD each file into the proper table. This script can be written in any language.
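As a rough sketch of what that looks like in Python (the table names, column mappings, and input file name here are hypothetical; adapt them to your actual schema and export format):

```python
# Split one raw CSV export into per-table CSVs, then load each with LOAD DATA INFILE.
# The "accounts"/"transactions" tables and their columns are made-up examples.
import csv

TABLE_COLUMNS = {
    "accounts": ["account_number", "owner_name"],
    "transactions": ["account_number", "date", "amount"],
}

with open("bank_export.csv", newline="") as src:
    reader = csv.DictReader(src)  # assumes the export has a header row with these column names
    outputs = {}
    for table, cols in TABLE_COLUMNS.items():
        f = open(f"{table}.csv", "w", newline="")
        outputs[table] = (f, csv.writer(f), cols)
    for row in reader:
        for table, (f, writer, cols) in outputs.items():
            writer.writerow([row[c] for c in cols])
    for f, writer, cols in outputs.values():
        f.close()

# Then load each per-table file, e.g. via the mysql client
# (LOCAL requires local_infile to be enabled on client and server):
#
#   LOAD DATA LOCAL INFILE 'accounts.csv'
#   INTO TABLE accounts
#   FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
#   LINES TERMINATED BY '\n'
#   (account_number, owner_name);
```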
Aside from that, bulk INSERT, i.e. INSERT INTO t (column, ...) VALUES (...), (...), ... with many rows per statement, also works. Don't guess what your row batch size should be; time it empirically, as the optimal batch size will depend on your particular database setup (server configuration, column types, indices, etc.).
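A minimal way to measure that, assuming a hypothetical transactions table and the mysql-connector-python driver (any driver with executemany will do):

```python
# Time batched multi-row INSERTs for a given batch size.
# Connection parameters and the "transactions" table/columns are placeholders.
import time
import mysql.connector  # pip install mysql-connector-python

def insert_in_batches(rows, batch_size):
    conn = mysql.connector.connect(user="importer", password="secret", database="bank")
    cur = conn.cursor()
    sql = ("INSERT INTO transactions (account_number, date, amount) "
           "VALUES (%s, %s, %s)")
    start = time.perf_counter()
    for i in range(0, len(rows), batch_size):
        # executemany sends each batch as one multi-row INSERT
        cur.executemany(sql, rows[i:i + batch_size])
    conn.commit()
    elapsed = time.perf_counter() - start
    cur.close()
    conn.close()
    return elapsed

# Try a few candidate sizes against your real data and server:
# for size in (100, 500, 1000, 5000):
#     print(size, insert_in_batches(rows, size))
```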
Bulk INSERT is not going to be as fast as LOAD DATA INFILE, and you'll still have to write a script to transform raw data into usable INSERT queries. For this reason, I'd probably do LOAD DATA INFILE if at all possible.