I'm currently working on creating an environment to test the performance of an app; I'm testing with MySQL and InnoDB to find out which can serve us best. Within this environm
Did you try the Bulk Data Loading Tips from the InnoDB Performance Tuning Tips (especially the first one):
When importing data into InnoDB, make sure that MySQL does not have autocommit mode enabled, because that requires a log flush to disk for every insert. To disable autocommit during your import operation, surround it with `SET autocommit` and `COMMIT` statements:

```sql
SET autocommit=0;
... SQL import statements ...
COMMIT;
```

If you use the mysqldump option `--opt`, you get dump files that are fast to import into an InnoDB table, even without wrapping them with the `SET autocommit` and `COMMIT` statements.

If you have `UNIQUE` constraints on secondary keys, you can speed up table imports by temporarily turning off the uniqueness checks during the import session:

```sql
SET unique_checks=0;
... SQL import statements ...
SET unique_checks=1;
```

For big tables, this saves a lot of disk I/O because InnoDB can use its insert buffer to write secondary index records in a batch. Be certain that the data contains no duplicate keys.

If you have `FOREIGN KEY` constraints in your tables, you can speed up table imports by turning the foreign key checks off for the duration of the import session:

```sql
SET foreign_key_checks=0;
... SQL import statements ...
SET foreign_key_checks=1;
```

For big tables, this can save a lot of disk I/O.
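Putting the three tips together, a bulk import session would look roughly like the sketch below. The `orders` table and its `INSERT` rows are made up for illustration; they stand in for whatever statements your actual dump contains.

```sql
-- Sketch of a bulk-import session combining the tips quoted above.
-- The `orders` table and its rows are hypothetical placeholders.

SET autocommit=0;          -- one log flush for the whole batch instead of one per insert
SET unique_checks=0;       -- only safe if the data contains no duplicate keys
SET foreign_key_checks=0;  -- only safe if the data already satisfies its foreign keys

-- ... your SQL import statements go here, e.g.:
INSERT INTO orders (id, customer_id, total) VALUES (1, 42, 19.99);
INSERT INTO orders (id, customer_id, total) VALUES (2, 43,  5.00);

COMMIT;                    -- commit the whole batch at once

SET unique_checks=1;       -- restore the session settings afterwards
SET foreign_key_checks=1;
SET autocommit=1;
```

The trade-off is deliberate: you give up per-row constraint checking and per-statement durability during the load in exchange for much less disk I/O, so only do this with data you trust and re-enable the checks as soon as the import finishes.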
IMO, the whole chapter is worth the read.