How to improve the speed of inserting CSV data into a database in PHP?

没有蜡笔的小新 · 2020-12-07 05:49

I wrote code that reads a folder of CSV files; depending on each file's name, its data is inserted into a different table. Once a file is processed, it is moved.

1 Answer
  • 2020-12-07 06:35

    Instead of inserting into the database one row at a time, try inserting in batches.

    You can do a bulk insert that takes n entries (e.g. 1000) and inserts them into the table in a single statement.

    https://www.mysqltutorial.org/mysql-insert-multiple-rows/

    This reduces the number of DB calls, and thereby the overall time.
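    A bulk insert can be sketched with PDO as follows. This is a minimal sketch, assuming a hypothetical table and column names; adjust them to your schema. The batch size of 1000 matches the suggestion above.

    ```php
    <?php
    // Build a multi-row INSERT statement with ? placeholders for $rowCount rows,
    // e.g. "INSERT INTO t (a, b) VALUES (?, ?), (?, ?)" for two rows.
    function buildBatchInsertSql(string $table, array $columns, int $rowCount): string
    {
        $placeholderRow = '(' . implode(', ', array_fill(0, count($columns), '?')) . ')';
        return sprintf(
            'INSERT INTO %s (%s) VALUES %s',
            $table,
            implode(', ', $columns),
            implode(', ', array_fill(0, $rowCount, $placeholderRow))
        );
    }

    // Insert $rows in chunks of $batchSize instead of one query per row.
    function insertInBatches(PDO $pdo, string $table, array $columns, array $rows, int $batchSize = 1000): void
    {
        foreach (array_chunk($rows, $batchSize) as $chunk) {
            $stmt = $pdo->prepare(buildBatchInsertSql($table, $columns, count($chunk)));
            // Flatten the chunk into one flat list of bound values.
            $stmt->execute(array_merge(...array_map('array_values', $chunk)));
        }
    }
    ```

    Wrapping each batch (or the whole import) in a transaction with `$pdo->beginTransaction()` / `$pdo->commit()` usually speeds things up further, since MySQL then flushes once per commit instead of once per statement.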

    And with 80k entries there is a possibility that you exceed PHP's memory limit too.

    You can overcome that by using generators in PHP. https://medium.com/@aashish.gaba097/database-seeding-with-large-files-in-laravel-be5b2aceaa0b

    Although that article uses Laravel, the CSV-reading code (the part that uses a generator) is framework-independent, and the same logic can be used here.
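    The generator approach can be sketched like this: rows are yielded one at a time with `fgetcsv`, so memory use stays constant regardless of file size.

    ```php
    <?php
    // Yield CSV rows one at a time instead of loading the whole file
    // into an array, keeping memory usage flat for large files.
    function csvRows(string $path): Generator
    {
        $handle = fopen($path, 'r');
        if ($handle === false) {
            throw new RuntimeException("Cannot open $path");
        }
        try {
            while (($row = fgetcsv($handle)) !== false) {
                yield $row;
            }
        } finally {
            fclose($handle);
        }
    }
    ```

    To combine both ideas, consume the generator while accumulating rows into a buffer, and flush the buffer with one bulk INSERT whenever it reaches 1000 rows (plus a final flush for the remainder).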
