Question
My use case:
I have multiple scripts inserting into a table on the order of several inserts per second. I am seeing performance degradation, so I think there would be a benefit in "batching" the queries and inserting several hundred rows every minute or so.
Question:
How would I go about doing this using mysqli? My current code uses a wrapper (pastebin), and looks like:
$array = array(); // BIG ARRAY OF VALUES (more than 100k rows' worth)

foreach ($array as $key => $value) {
    $db->q('INSERT INTO `player_items_attributes` (`column1`, `column2`, `column3`) VALUES (?, ?, ?)', 'iii', $value['test1'], $value['test2'], $value['test3']);
}
Notes:
I looked at using transactions, but it sounds like each statement would still hit the server individually rather than being queued up. I would prefer to use a wrapper (feel free to suggest one with functionality similar to what my current one offers), but if that is not possible I will try to build your suggestions into the wrapper I use.
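For reference, here is roughly what the transaction version would look like (a minimal sketch only, using plain mysqli instead of my wrapper and assuming a connected $mysqli object); each execute() still makes a round trip to the server:

<?php
// Sketch only: plain mysqli rather than the wrapper, assuming $mysqli is connected.
$stmt = $mysqli->prepare(
    'INSERT INTO `player_items_attributes` (`column1`, `column2`, `column3`) VALUES (?, ?, ?)'
);

$mysqli->autocommit(false);           // start a transaction
foreach ($array as $value) {
    $stmt->bind_param('iii', $value['test1'], $value['test2'], $value['test3']);
    $stmt->execute();                 // still one round trip per row
}
$mysqli->commit();                    // but only one commit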
Sources:
Wrapper came from here
Edit: I am trying to optimize table speed rather than script speed. The table has more than 35 million rows and a few indexes.
Answer 1:
The MySQL INSERT syntax allows for one INSERT query to insert multiple rows, like this:
INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
where each set of parenthesised values represents another row in your table. So, by working down the array you could create multiple rows in one query.
There is one major limitation: the total size of the query must not exceed the server's configured limit (max_allowed_packet in MySQL). For 100k rows you'd probably have to break this down into blocks of, say, 250 rows, reducing your 100k queries to 400. You might be able to go further.
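You can check the current limit on your server with:

SHOW VARIABLES LIKE 'max_allowed_packet';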
I'm not going to attempt to code this - you'd have to code something and try it in your environment.
Here's a pseudo-code version:
escape entire array                        // array_walk(), real_escape_string()
block_size = 250                           // number of rows to insert per query
current_block = 0
rows_array = []
while (next_element <= number of rows) {
    create parenthesised value set and push to rows_array    // implode()
    current_block++
    if (current_block == block_size) {
        implode rows_array and append to query
        execute query
        current_block = 0
        reset rows_array
        reset query
    }
    next_element++
}
if (rows_array is not empty) {             // any records left over
    implode rows_array and append to query
    execute the query for the last block
}
I can already think of a potentially faster implementation with array_map() - try it.
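As a sketch of that idea (array_chunk() is my addition here; same assumptions as above):

<?php
// Sketch only: array_map() builds the value sets, array_chunk() replaces the counter.
$sets = array_map(function ($value) {
    return '(' . (int)$value['test1'] . ',' . (int)$value['test2'] . ',' . (int)$value['test3'] . ')';
}, $array);

foreach (array_chunk($sets, 250) as $chunk) {
    $mysqli->query('INSERT INTO `player_items_attributes` '
        . '(`column1`, `column2`, `column3`) VALUES ' . implode(',', $chunk));
}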
Source: https://stackoverflow.com/questions/17539742/recommendation-for-mysqli-batch-queries