Symfony : Doctrine data fixture : how to handle large csv file?

刺人心 · 2021-02-20 12:35

I am trying to insert data from a "large" CSV file (3 MB / 37,000 lines / 7 columns) into a MySQL database using Doctrine data fixtures.

The process is very slow and at …

3 Answers
  •  野趣味 (OP)
     2021-02-20 13:01

    For fixtures that need lots of memory but don't depend on each other, I get around this problem by using the `--append` flag to insert one entity (or a smaller group of entities) at a time:

    bin/console doctrine:fixtures:load --fixtures="memory_hungry_fixture.file" --append
    

    Then I write a Bash script which runs that command as many times as I need.

    In your case, you could extend the Fixtures command and add a flag that loads batches of entities: the first 1,000 rows, then the second 1,000, and so on.
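    The Bash loop mentioned above might look like this (a minimal sketch; the fixture file names `fixture_batch_*.file` and the batch count of 5 are assumptions, and the commands are collected and printed rather than executed so the loop can be inspected without a Symfony install):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Build one doctrine:fixtures:load invocation per batch file.
# File names and the batch count (5) are placeholders for illustration.
cmds=()
for i in $(seq 1 5); do
    cmds+=("bin/console doctrine:fixtures:load --fixtures=\"fixture_batch_${i}.file\" --append")
done

# Print the commands; in a real script you would run each one instead,
# e.g. by invoking bin/console directly inside the loop.
printf '%s\n' "${cmds[@]}"
```

    Because each invocation is a separate PHP process, the memory used by one batch is released before the next batch starts, which is why `--append` plus a loop keeps peak memory low.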
