How to download a big file using PHP (low memory usage)

梦如初夏 2020-12-02 19:34

I have to download a big file (1xx MB) using PHP.

How can I download it without wasting memory (RAM) on a temporary file?

When I use

$someth         


        
2 Answers
  • 2020-12-02 20:18

    You can shell out to wget using exec(); this results in the lowest memory usage.

    <?php
     // Note: wget's -O (capital) sets the output file; lowercase -o writes a log file.
     exec("wget -O outputfilename.tar.gz http://pathtofile/file.tar.gz");
    ?>
    

    You can also try using fopen(), fread(), and fwrite(). That way you only download x bytes into memory at a time.
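    A minimal sketch of that approach (the function name and chunk size are my own; fopen() accepts http:// URLs when allow_url_fopen is enabled, and local paths work the same way):

    ```php
    <?php
    // Stream a file from $url to $dest, holding at most $chunk bytes
    // in memory at a time. Returns total bytes written, or false.
    function stream_download($url, $dest, $chunk = 8192) {
        $in = fopen($url, 'rb');
        if ($in === false) {
            return false;
        }
        $out = fopen($dest, 'wb');
        if ($out === false) {
            fclose($in);
            return false;
        }
        $total = 0;
        while (!feof($in)) {
            $buf = fread($in, $chunk);   // read at most $chunk bytes
            if ($buf === false) {
                break;
            }
            $total += fwrite($out, $buf); // write them straight back out
        }
        fclose($in);
        fclose($out);
        return $total;
    }
    ```

    Because the buffer is reused on every iteration, peak memory stays near the chunk size regardless of how large the file is.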

  • 2020-12-02 20:22

    Copy the file one small chunk at a time:

    /**
     * Copy a remote file over HTTP one small chunk at a time.
     *
     * @param string $infile  The full URL to the remote file
     * @param string $outfile The path where to save the file
     * @return int|false Number of bytes copied, or false on failure
     */
    function copyfile_chunked($infile, $outfile) {
        $chunksize = 10 * (1024 * 1024); // 10 MB

        /**
         * parse_url() breaks a URL into its parts, i.e. host, path,
         * query string, etc.
         */
        $parts = parse_url($infile);
        $i_handle = fsockopen($parts['host'], 80, $errno, $errstr, 5);
        $o_handle = fopen($outfile, 'wb');

        if ($i_handle === false || $o_handle === false) {
            return false;
        }

        if (empty($parts['path'])) {
            $parts['path'] = '/';
        }
        if (!empty($parts['query'])) {
            $parts['path'] .= '?' . $parts['query'];
        }

        /**
         * Send the request to the server for the file. HTTP/1.0 is
         * used so the server closes the connection when it is done
         * and won't use chunked transfer encoding, which this code
         * does not decode.
         */
        $request = "GET {$parts['path']} HTTP/1.0\r\n";
        $request .= "Host: {$parts['host']}\r\n";
        $request .= "User-Agent: Mozilla/5.0\r\n";
        $request .= "Connection: close\r\n\r\n";
        fwrite($i_handle, $request);

        /**
         * Now read the headers from the remote server. We'll need
         * to get the content length.
         */
        $headers = array();
        while (!feof($i_handle)) {
            $line = fgets($i_handle);
            if ($line == "\r\n") break;
            $headers[] = $line;
        }

        /**
         * Look for the Content-Length header, and get the size
         * of the remote file.
         */
        $length = 0;
        foreach ($headers as $header) {
            if (stripos($header, 'Content-Length:') === 0) {
                $length = (int) trim(substr($header, 15));
                break;
            }
        }

        /**
         * Start reading in the remote file, and writing it to the
         * local file one chunk at a time.
         */
        $cnt = 0;
        while (!feof($i_handle)) {
            $buf = fread($i_handle, $chunksize);
            $bytes = fwrite($o_handle, $buf);
            if ($bytes === false) {
                fclose($i_handle);
                fclose($o_handle);
                return false;
            }
            $cnt += $bytes;

            /**
             * We're done reading when we've reached the content length.
             */
            if ($length > 0 && $cnt >= $length) break;
        }

        fclose($i_handle);
        fclose($o_handle);
        return $cnt;
    }
    

    Adjust the $chunksize variable to your needs. This has only been mildly tested. It could easily break for a number of reasons.

    Usage:

    copyfile_chunked('http://somesite.com/somefile.jpg', '/local/path/somefile.jpg');
    