I wrote code that reads a folder of CSV files; depending on the file name, the data from each CSV file is inserted into a different table. Once a file is processed, it is moved.
Instead of hitting the database with a separate INSERT for every row, try inserting in batches.
You can do a bulk insert that takes n entries (say, 1000) and inserts them into the table in a single statement.
https://www.mysqltutorial.org/mysql-insert-multiple-rows/
This reduces the number of DB calls and therefore the overall time. A rough sketch is shown below.
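Here is a minimal sketch of batched multi-row inserts using PDO. The `$pdo` connection, `$filePath`, and the table/column names (`my_table`, `col_a`, `col_b`) are placeholders; adapt them to your own schema.

```php
<?php
// Builds one INSERT ... VALUES (...),(...),... statement per batch.
function insertBatch(PDO $pdo, string $table, array $columns, array $rows): void
{
    if (empty($rows)) {
        return;
    }

    // One placeholder group "(?, ?, ...)" per row.
    $placeholders = '(' . rtrim(str_repeat('?,', count($columns)), ',') . ')';
    $sql = sprintf(
        'INSERT INTO %s (%s) VALUES %s',
        $table,
        implode(',', $columns),
        implode(',', array_fill(0, count($rows), $placeholders))
    );

    // Flatten all rows into a single list of bound values.
    $values = [];
    foreach ($rows as $row) {
        foreach ($row as $value) {
            $values[] = $value;
        }
    }

    $pdo->prepare($sql)->execute($values);
}

// Collect rows from the CSV and flush every 1000 rows.
$batch  = [];
$handle = fopen($filePath, 'r');
while (($row = fgetcsv($handle)) !== false) {
    $batch[] = $row;
    if (count($batch) === 1000) {
        insertBatch($pdo, 'my_table', ['col_a', 'col_b'], $batch);
        $batch = [];
    }
}
if (!empty($batch)) {
    insertBatch($pdo, 'my_table', ['col_a', 'col_b'], $batch); // flush the remainder
}
fclose($handle);
```

With this approach an 80k-row file results in roughly 80 INSERT statements instead of 80,000.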
Also, for 80k entries there is a chance that you will exceed PHP's memory limit if you load everything at once.
You can overcome that by using generators in PHP. https://medium.com/@aashish.gaba097/database-seeding-with-large-files-in-laravel-be5b2aceaa0b
That example is written for Laravel, but the part that reads the CSV with a generator is framework-independent, and the same logic applies here.
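A minimal sketch of the generator idea, assuming the same `insertBatch()` helper and placeholder names as above: the generator yields one CSV row at a time, so the whole file is never held in memory.

```php
<?php
// Streams CSV rows lazily instead of loading the whole file.
function readCsvRows(string $path): Generator
{
    $handle = fopen($path, 'r');
    if ($handle === false) {
        throw new RuntimeException("Unable to open {$path}");
    }

    try {
        while (($row = fgetcsv($handle)) !== false) {
            yield $row; // one row at a time, on demand
        }
    } finally {
        fclose($handle);
    }
}

// Usage: combine the generator with the batched insert above.
$batch = [];
foreach (readCsvRows('/path/to/file.csv') as $row) {
    $batch[] = $row;
    if (count($batch) === 1000) {
        insertBatch($pdo, 'my_table', ['col_a', 'col_b'], $batch);
        $batch = [];
    }
}
if (!empty($batch)) {
    insertBatch($pdo, 'my_table', ['col_a', 'col_b'], $batch);
}
```

This keeps memory usage bounded by the batch size rather than the file size, so it scales to files much larger than 80k rows.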