Handling large data in PHP

Submitted by 烂漫一生 on 2019-12-12 03:39:09

Question


I am trying to insert a large amount of data into the database after an MLS property search (PHRETS). The result object has around 4500 to 5000 records, each with 450-475 keys. While inserting the data, the script fails with "HTTP Error 500 Internal Server Error" after some time (generally 6-7 minutes), which I assume is caused by the server's execution time limit. I asked the server admins to increase the time limit, but it still fails.

Here is my process:

1) I run an MLS search for properties.

2) I try to insert all records at once, using implode() to build a single query and save execution time:

$qry = mysqli_query($link, "INSERT INTO `rets_property_res` VALUES " . implode(',', $sql));

- I have also tried using prepared statements.

Can we store this data somewhere and process it all at once later, or can we speed up the process so everything finishes within the given time frame?
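For example, something like this rough sketch is what I have in mind for the "store it and process later" option ($results stands for the PHRETS result set and the file path is only an example):

<?php
// Rough sketch: dump the MLS search results to a flat file as soon as the
// search finishes, so the heavy database work can be done later by a
// separate CLI script or cron job.
// $results is assumed to be iterable, with each record an associative array.
$fh = fopen('/tmp/rets_property_res.csv', 'w');   // example path only
foreach ($results as $record) {
    fputcsv($fh, array_values($record));          // one CSV line per record
}
fclose($fh);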


Answer 1:


The fastest way to accomplish this is to grab a large chunk (many thousands of rows), write it out to a CSV, and perform a LOAD DATA INFILE. Some environments cannot use the LOCAL part of LOAD DATA LOCAL INFILE due to server settings. The MySQL documentation states:

When loading a table from a text file, use LOAD DATA INFILE. This is usually 20 times faster than using INSERT statements. See Section 14.2.6, “LOAD DATA INFILE Syntax”.

I have found that to be easily true.
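A rough PHP sketch of that approach (not taken from the original code): it assumes $rows is an array of row arrays, $link is an open mysqli connection, and LOCAL INFILE is permitted on both the client and the server.

<?php
// Write the rows to a temporary CSV, then load the whole file in one statement.
// The column order in the CSV must match the table definition.
// LOCAL INFILE may need to be enabled (e.g. MYSQLI_OPT_LOCAL_INFILE on the
// client and local_infile=1 on the server).
$csv = tempnam(sys_get_temp_dir(), 'rets_');
$fh  = fopen($csv, 'w');
foreach ($rows as $row) {
    fputcsv($fh, array_values($row));
}
fclose($fh);

$path = mysqli_real_escape_string($link, $csv);
mysqli_query($link,
    "LOAD DATA LOCAL INFILE '$path'
     INTO TABLE `rets_property_res`
     FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
     LINES TERMINATED BY '\\n'");
unlink($csv);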

Certainly slower than LOAD DATA INFILE, but still much faster than individual inserts, is combining multiple rows into a single INSERT statement:

INSERT INTO myTable (col1, col2) VALUES ('a1','b1'), ('a2','b2'), ('a3','b3');

In the example above, three rows are inserted with one statement. For speed, it is typically best to experiment with 500 to 1000 rows per statement (not 3); the right number depends on the length of the resulting INSERT string, which in turn depends on your schema.
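A rough sketch of that batching in PHP might look like this ($records, the chunk size of 500, and the error handling are placeholders to illustrate the idea, not part of the original question):

<?php
// Build one multi-row INSERT per chunk of 500 records instead of one per row.
// Every value is escaped, which also addresses the injection concern below.
foreach (array_chunk($records, 500) as $chunk) {
    $values = [];
    foreach ($chunk as $row) {
        $escaped = array_map(function ($v) use ($link) {
            return "'" . mysqli_real_escape_string($link, (string) $v) . "'";
        }, array_values($row));
        $values[] = '(' . implode(',', $escaped) . ')';
    }
    $sql = 'INSERT INTO `rets_property_res` VALUES ' . implode(',', $values);
    if (!mysqli_query($link, $sql)) {
        error_log(mysqli_error($link));   // log and decide whether to retry or abort
    }
}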

Security concerns: be aware of the possibility of second-order SQL injection attacks, as far-fetched as that may seem. It is possible, so escape or parameterize every value you concatenate into the statement.

All of the above may seem trivial, but as an anecdotal example (and a comment on the UX pain) I offer the following. I have a C# application that grabs questions off of Stack Overflow and stores their metrics: not the body or the answers, but the title and various counts and datetimes. I insert 1000 questions at a time into my db (or do an INSERT IGNORE or an INSERT ... ON DUPLICATE KEY UPDATE). Before converting that to LOAD DATA INFILE, it took about 50 seconds per 1000 rows using C#/MySQL bindings with a re-used prepared statement. After converting it to LOAD DATA INFILE (including truncating the work table, writing the CSV, and running the load statement), it takes about 1.5 seconds per 1000 rows.




Answer 2:


Generally you won't be able to insert 5000 rows at once because of MySQL's max_allowed_packet limit (how many rows fit depends on how much data there is in each insert).

Try inserting them in smaller groups, e.g. 100 rows at a time or fewer.
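One rough way to pick the group size is to read the server's max_allowed_packet and estimate the row size; in this sketch the ~40 bytes per field is only a guess, not a measured value:

<?php
// Read max_allowed_packet and derive a batch size that stays well under it.
$res = mysqli_query($link, "SHOW VARIABLES LIKE 'max_allowed_packet'");
$row = mysqli_fetch_assoc($res);          // ['Variable_name' => ..., 'Value' => bytes]
$maxPacket = (int) $row['Value'];

$estimatedRowBytes = 475 * 40;            // ~475 fields at ~40 bytes each (estimate)
$batchSize = max(1, (int) floor(($maxPacket * 0.8) / $estimatedRowBytes));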



Source: https://stackoverflow.com/questions/37897221/handling-large-data-in-php
