Read through huge text files and store each line in database

Backend · Unresolved · 2 answers · 1445 views
Asked by 深忆病人 on 2020-12-06 15:39

I wrote a PHP script which runs through a text file (actually it's a 'list' file from IMDb) and stores each line in my local MySQL database.

public static functi         
2 Answers
  • 2020-12-06 16:29

    You may either use set_time_limit($seconds), or (better) run your script from the command line through a cron entry. That way you will also avoid many other non-PHP-related timeout issues.
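
    The two options above can be sketched as follows. This is a minimal illustration, not from the answer itself; the script path in the cron line is hypothetical.

    ```php
    <?php
    // Option 1: lift the execution-time limit for this long-running import.
    // (When PHP runs from the CLI, max_execution_time already defaults to 0.)
    set_time_limit(0);

    // Option 2 (preferred): run the importer from the command line via cron.
    // Example crontab entry (hypothetical path), running nightly at 02:00:
    //   0 2 * * * /usr/bin/php /path/to/import_list.php
    ```
    
    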

  • 2020-12-06 16:29

    I don't think it's a good idea to load such a large file directly into your database, especially when it takes so long to complete.

    My Advice

    Split the file into smaller chunks locally, then upload them to your database on the remote server.

    Example (documentation: http://en.wikipedia.org/wiki/Split_%28Unix%29):

     exec('split -d -b 2048m ' . $list . ' chunks');
    

    For a pure PHP implementation see

    http://www.php.happycodings.com/File_Manipulation/code50.html
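
    In case that link goes stale, a pure-PHP equivalent of `split -b` can be sketched roughly like this (the function name and chunk-naming scheme are illustrative, not from the linked page):

    ```php
    <?php
    // Sketch: copy $src into numbered chunk files of at most $chunkBytes each,
    // named $prefix00, $prefix01, ... Returns the number of chunks written.
    function split_file(string $src, string $prefix, int $chunkBytes): int {
        $in = fopen($src, 'rb');
        if ($in === false) {
            return 0;
        }
        $n = 0;
        while (!feof($in)) {
            $data = fread($in, $chunkBytes);
            if ($data === false || $data === '') {
                break; // nothing left to copy
            }
            file_put_contents(sprintf('%s%02d', $prefix, $n), $data);
            $n++;
        }
        fclose($in);
        return $n;
    }
    ```
    
    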

    Or

    define('CHUNK_SIZE', 1024 * 1024);
    function readfile_chunked($filename, $retbytes = TRUE) {
        $buffer = '';
        $cnt = 0;
        $handle = fopen($filename, 'rb');
        if ($handle === false) {
            return false;
        }
        while (!feof($handle)) {
            $buffer = fread($handle, CHUNK_SIZE);
            echo $buffer;
            ob_flush();
            flush();
            if ($retbytes) {
                $cnt += strlen($buffer);
            }
        }
        $status = fclose($handle);
        if ($retbytes && $status) {
            return $cnt; // return num. bytes delivered, like readfile() does
        }
        return $status;
    }
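
    The function above streams a file to output; for the original goal of storing each line in MySQL, a similar chunked idea applies: read line by line with fgets() to keep memory flat, and batch rows into multi-row INSERTs to cut round trips. A minimal sketch, assuming a PDO connection and a hypothetical table imdb_lines with a single line column:

    ```php
    <?php
    // Sketch (not from the answer): import each line of $filename into MySQL,
    // flushing every $batch rows as one multi-row INSERT. Table and column
    // names (imdb_lines, line) are illustrative assumptions.
    function import_lines(PDO $pdo, string $filename, int $batch = 1000): int {
        $handle = fopen($filename, 'rb');
        if ($handle === false) {
            return 0;
        }
        $total = 0;
        $rows = [];
        $flush = function (array $rows) use ($pdo) {
            $placeholders = implode(',', array_fill(0, count($rows), '(?)'));
            $stmt = $pdo->prepare("INSERT INTO imdb_lines (line) VALUES $placeholders");
            $stmt->execute($rows);
        };
        while (($line = fgets($handle)) !== false) {
            $rows[] = rtrim($line, "\r\n");
            if (count($rows) >= $batch) {
                $flush($rows);
                $total += count($rows);
                $rows = [];
            }
        }
        if ($rows) { // flush the final partial batch
            $flush($rows);
            $total += count($rows);
        }
        fclose($handle);
        return $total;
    }
    ```

    Wrapping each flush in a transaction would speed this up further; the batching alone already avoids one round trip per line.
    
    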
    