Out of memory error when reading a large file

Submitted by 不羁的心 on 2020-01-21 06:57:24

Question


I have a large CSV file that I want to parse and insert into my database. I have this PHP:

$target = '../uploads/'.$f;
$handle = fopen($target, "r");
$data = fgetcsv($handle, 0, ",");

$rows = array();

while ($data !== FALSE) {
    $rows[] =  $data;
}

fclose($handle);

if (count($rows)) {
    foreach ($rows as $key => $value) {
        echo $value;
    }
}

Every time I try to run my script I get this error:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 35 bytes)

Any ideas how to do this?


Answer 1:


I think this part is wrong:

$data = fgetcsv($handle, 0, ",");
$rows = array();
while ($data !== FALSE) {
    $rows[] =  $data;
}

One call to fgetcsv reads only one line from $handle. Since it is called once, outside the loop, $data never changes, so the loop appends the same row to $rows forever until memory runs out. Put the fgetcsv call in the loop condition instead:

$handle = fopen($target, "r");
while (($row = fgetcsv($handle, 0, ",")) !== FALSE) {
    // Example insert - obviously use prepared statements/escaping/another DAL
    $db->query("INSERT INTO tbl (columns..) VALUES ({$row[0]}, {$row[1]} ... )");
}
fclose($handle);
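As the comment in the loop says, string-interpolating CSV fields into SQL invites injection and syntax errors. A minimal sketch of the same streaming loop with a PDO prepared statement instead; the table `tbl` and columns `col_a`/`col_b` are placeholders for your schema:

```php
<?php
// Stream the CSV one line at a time and insert each row via a prepared
// statement, so memory use stays constant regardless of file size.
function import_rows(PDO $db, string $target): int {
    $handle = fopen($target, "r");
    $stmt = $db->prepare("INSERT INTO tbl (col_a, col_b) VALUES (?, ?)");
    $count = 0;
    while (($row = fgetcsv($handle, 0, ",")) !== FALSE) {
        $stmt->execute(array($row[0], $row[1]));
        $count++;
    }
    fclose($handle);
    return $count;
}
```

The statement is prepared once and executed per row, which is both safer and faster than building a fresh query string each iteration.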



Answer 2:


Use mysqlimport instead

While you can certainly parse and build queries with PHP, you'll get much better performance by letting MySQL handle it directly. Your database will thank you.

<?php
exec("mysqlimport [options] db_name textfile1");
?>
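When the database name or file path comes from user input (as with an upload), the shell command should be escaped. A sketch of building the command safely; the `--local` flag and the path are illustrative, not from the original answer:

```php
<?php
// Build a mysqlimport command with shell-escaped arguments. mysqlimport
// derives the target table from the file's basename (tbl.csv -> table `tbl`),
// so the CSV must be named after the table. Pass the result to exec().
function build_mysqlimport_cmd(string $db_name, string $file): string {
    // --local reads the file from the client host rather than the server.
    return 'mysqlimport --local --fields-terminated-by=, '
         . escapeshellarg($db_name) . ' ' . escapeshellarg($file);
}
```

Credentials (`--user`, `--password`) would be added the same way in a real setup.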

Sources:

  • http://www.php.net/manual/en/function.exec.php
  • http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html



Answer 3:


Use the standard MySQL LOAD DATA INFILE statement to skip reading/parsing/inserting data via PHP:

function import_csv($table, $afields, $filename, $delim = ',', $enclosed = '"',
                    $escaped = '\\', $lineend = '\\r\\n', $hasheader = FALSE) {
    $ignore = $hasheader ? "IGNORE 1 LINES " : "";
    $q_import =
        "LOAD DATA INFILE '" . $_SERVER['DOCUMENT_ROOT'] . $filename . "' INTO TABLE " . $table . " " .
        "FIELDS TERMINATED BY '" . $delim . "' ENCLOSED BY '" . $enclosed . "' " .
        "ESCAPED BY '" . $escaped . "' " .
        "LINES TERMINATED BY '" . $lineend . "' " . $ignore . "(" . implode(',', $afields) . ")";
    // Note: mysql_query() was removed in PHP 7; use mysqli or PDO in modern code.
    return mysql_query($q_import);
}

In this case you don't need to open or read the CSV file in PHP at all; MySQL handles the import by itself.
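To see exactly what statement the function above assembles, here is a side-effect-free variant (a hypothetical helper, not part of the original answer) that returns the SQL instead of executing it, with the same defaults:

```php
<?php
// Build the LOAD DATA INFILE statement as a string, mirroring import_csv()
// above but without touching the database. Useful for logging or testing.
function build_load_data_sql(string $table, array $afields, string $path,
                             bool $hasheader = FALSE): string {
    $ignore = $hasheader ? "IGNORE 1 LINES " : "";
    return "LOAD DATA INFILE '" . $path . "' INTO TABLE " . $table . " "
         . "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' ESCAPED BY '\\\\' "
         . "LINES TERMINATED BY '\\r\\n' " . $ignore
         . "(" . implode(',', $afields) . ")";
}
```

Note that LOAD DATA INFILE requires the FILE privilege and a path readable by the MySQL server process (or LOAD DATA LOCAL INFILE to read from the client side).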




Answer 4:


You don't need to read all of the CSV data from the file into memory before processing.

Instead, create a while loop that reads from your file one line at a time. Each time you read from the file, you should insert a row into your database.

Or read several lines and insert several rows at one time.

Example:

$handle = fopen($target, "r");
$rows = array();
$i = 0;
while (($data = fgetcsv($handle, 0, ",")) !== FALSE) {
    $rows[] = $data;
    $i++;
    // insert 100 rows at one time.
    if ($i % 100 === 0) {
        // insert these rows to db
        insert_to_db($rows);
        // then reset $rows
        $rows = array();
    }
}
// insert any remaining rows
insert_to_db($rows);
fclose($handle);
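The answer leaves insert_to_db undefined. One way it might look, sketched here with PDO and a multi-row prepared INSERT (the PDO handle argument, table `tbl`, and columns `col_a`/`col_b` are assumptions, not from the original):

```php
<?php
// Insert a batch of CSV rows in a single multi-row INSERT statement.
// One round-trip per batch is much cheaper than one INSERT per row.
function insert_to_db(PDO $db, array $rows): void {
    if (count($rows) === 0) {
        return; // the final batch may be empty
    }
    // Build "(?, ?), (?, ?), ..." - one placeholder pair per row.
    $placeholders = implode(', ', array_fill(0, count($rows), '(?, ?)'));
    $stmt = $db->prepare("INSERT INTO tbl (col_a, col_b) VALUES " . $placeholders);
    $params = array();
    foreach ($rows as $row) {
        $params[] = $row[0];
        $params[] = $row[1];
    }
    $stmt->execute($params);
}
```

The batch size of 100 in the loop above keeps the statement's parameter count well under MySQL's limits while still amortizing the per-query overhead.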


Source: https://stackoverflow.com/questions/8441469/out-of-memory-error-when-reading-a-large-file
