How to read a big file in PHP without hitting the memory limit

青春惊慌失措 2021-01-14 05:04

I'm trying to read a file line by line. The problem is that the file is too big (over 500,000 lines) and I keep hitting the memory limit. How can I read the file without running out of memory?
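
For context, the question doesn't show the failing code, but a common cause of this error is loading the whole file at once, for example with file(), which reads every line into an array in memory before any processing starts. A hypothetical reproduction (not from the question):

    // file() slurps the entire file into an array up front,
    // so memory usage grows with the file size.
    $lines = file("myfile.txt");
    foreach ($lines as $line) {
        // process $line
    }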

4 Answers
  •  难免孤独
    2021-01-14 05:38

    In my experience, PHP cleans up memory best when a scope is cleared. A loop doesn't create its own scope, but a function does.
    So hand your file pointer to a function, do your database work inside that function, and return to the loop, where you can call gc_collect_cycles(). This keeps memory under control and forces PHP to clean up after itself.

    I also recommend not echoing output, but logging to a file instead. You can then follow the log with tail -f filename (on Linux, in the Windows Subsystem for Linux, or in Git Bash on Windows).
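
    A minimal sketch of that logging idea (the log path /tmp/import.log is a placeholder, not from the original answer):

    // Message type 3 makes error_log() append the message to the given file
    // instead of sending it to the system logger.
    error_log($result . PHP_EOL, 3, "/tmp/import.log");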

    I use a method similar to the one below to handle large files with millions of entries, and it helps me stay under the memory limit.

    function dostuff($fh)
    {
        $result = fgets($fh);
        if ($result === false) {
            return; // nothing left to read (EOF or a read error)
        }
        // store in database, do transforms, whatever
        echo $result;
    }

    $fh = fopen("myfile.txt", "r");

    while (!feof($fh)) {
        dostuff($fh);        // per-line work happens in its own function scope
        flush();             // only needed if you do the echo thing
        gc_collect_cycles(); // reclaim memory freed when the function returned
    }

    fclose($fh);
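
    As a side note, the same line-by-line streaming can also be written as a generator, which likewise holds only the current line in memory. A minimal sketch (myfile.txt is a placeholder):

    function lines($path)
    {
        $fh = fopen($path, "r");
        if ($fh === false) {
            return; // could not open the file; the generator yields nothing
        }
        try {
            // fgets() returns false at EOF, which ends the loop cleanly
            while (($line = fgets($fh)) !== false) {
                yield $line;
            }
        } finally {
            fclose($fh); // runs when the generator finishes or is destroyed
        }
    }

    foreach (lines("myfile.txt") as $line) {
        // process one line at a time
    }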
    
