Why don't large files download easily in Laravel?

Phill Sparks

This happens because Response::download() loads the file into memory before serving it to the user. Admittedly this is a flaw in the framework, but most people do not try to serve large files through the framework.

Solution 1 - Put the files you want to download in the public folder, on a static domain, or on a CDN, and bypass Laravel completely.
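For example, if the file lives under public/downloads, you can link straight to it and the web server streams it without PHP or Laravel ever loading it into memory. A minimal sketch (the downloads folder and file name are hypothetical):

// In a view: link directly to a file under public/.
// The web server serves it, so Laravel never loads the file into memory.
echo '<a href="'.URL::to('downloads/big-archive.zip').'">Download</a>';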

Understandably, you might be trying to restrict access to your downloads by login, in which case you'll need to craft your own download method. Something like this should work:

function sendFile($path, $name = null, array $headers = array())
{
    if (is_null($name)) $name = basename($path);

    // Prepare the headers
    $headers = array_merge(array(
        'Content-Description'       => 'File Transfer',
        'Content-Type'              => File::mime(File::extension($path)),
        'Content-Transfer-Encoding' => 'binary',
        'Expires'                   => 0,
        'Cache-Control'             => 'must-revalidate, post-check=0, pre-check=0',
        'Pragma'                    => 'public',
        'Content-Length'            => File::size($path),
    ), $headers);

    $response = new Response('', 200, $headers);
    $response->header('Content-Disposition', $response->disposition($name));

    // If there's a session we should save it now
    if (Config::get('session.driver') !== '')
    {
        Session::save();
    }

    // Send the headers and the file
    ob_end_clean();
    $response->send_headers();

    if ($fp = fopen($path, 'rb')) {
        while ( ! feof($fp) and (connection_status() == 0)) {
            print(fread($fp, 8192));
            flush();
        }
        fclose($fp);
    }

    // Finish off, like Laravel would
    Event::fire('laravel.done', array($response));
    $response->foundation->finish();

    exit;
}

This function is a combination of Response::download() and Laravel's shutdown process. I've not had a chance to test it myself, as I don't have Laravel 3 installed at work. Please let me know if it does the job for you.

PS: The only thing this script does not take care of is cookies. Unfortunately, the Response::cookies() method is protected. If this becomes a problem, you can lift the code from that method into your sendFile() function.
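A minimal sketch of that, assuming Laravel 3 queues its cookies in Cookie::$jar as arrays with name/value/expiration/path/domain/secure keys (check Response::cookies() in your install to confirm the exact structure); it would go just before the send_headers() call:

// Hedged sketch, mirroring what the protected Response::cookies() does.
// The structure of Cookie::$jar is an assumption -- verify it against your
// Laravel 3 source before relying on this.
foreach (Cookie::$jar as $cookie) {
    setcookie(
        $cookie['name'],
        $cookie['value'],
        $cookie['expiration'],
        $cookie['path'],
        $cookie['domain'],
        $cookie['secure']
    );
}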

PPS: There might be an issue with output buffering; if it is a problem, have a look at the readfile() examples in the PHP manual; there's a method there that should work.

PPPS: Since you're working with binary files, you might want to consider replacing readfile() with fpassthru().

EDIT: Disregard the PPS and PPPS; I've updated the code to use fread() + print() instead, as this seems more stable.

You can use the Symfony\Component\HttpFoundation\StreamedResponse like this:

$response = new StreamedResponse(
    function() use ($filePath, $fileName) {
        // Open the source file and stream it out in 8 KB chunks
        if ($file = fopen($filePath, 'rb')) {
            while(!feof($file) and (connection_status()==0)) {
                print(fread($file, 1024*8));
                flush();
            }
            fclose($file);
        }
    },
    200,
    [
        'Content-Type' => 'application/octet-stream',
        'Content-Disposition' => 'attachment; filename="' . $fileName . '"',
    ]);

return $response;

For more information, see the Symfony documentation on StreamedResponse.

I'm using the custom readfile_chunked() method described in the readfile() comments on php.net (the link is in the code below). For Laravel 3, I've extended the Response class like this:

Add this file as application/libraries/response.php:

<?php
class Response extends Laravel\Response {

    // http://www.php.net/manual/en/function.readfile.php#54295
    public static function readfile_chunked($filename, $retbytes = true)
    {
        $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
        $cnt = 0;

        $handle = fopen($filename, 'rb');
        if ($handle === false) {
            return false;
        }

        while ( ! feof($handle)) {
            $buffer = fread($handle, $chunksize);
            echo $buffer;
            ob_flush();
            flush();

            if ($retbytes) {
                $cnt += strlen($buffer);
            }
        }

        $status = fclose($handle);

        if ($retbytes && $status) {
            return $cnt; // return the number of bytes delivered, like readfile() does
        }

        return $status;
    }
}

Then comment out this line in application/config/application.php:

'Response'      => 'Laravel\\Response',
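With that alias commented out, Laravel autoloads your extended class from application/libraries/response.php instead. The edited entry looks like this:

// application/config/application.php, in the 'aliases' array:
// 'Response'      => 'Laravel\\Response',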

Example code:

//return Response::download(Config::get('myconfig.files_folder').$file->upload, $file->title);

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.$file->title.'"');
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . File::size(Config::get('myconfig.files_folder').$file->upload));
ob_clean();
flush();
Response::readfile_chunked(Config::get('myconfig.files_folder').$file->upload);
exit;

Works great so far.

Kees Sonnema

You need to increase your memory_limit, for example with ini_set('memory_limit', '128M'). I hope this works for you!
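A minimal sketch (the placement and the 128M value are just an example, and this only helps while the file still fits inside the new limit):

// Raise the limit before calling Response::download(), which reads the
// whole file into memory. $path and $name are placeholders here.
ini_set('memory_limit', '128M');

return Response::download($path, $name);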

I found the answer here: https://stackoverflow.com/a/12443644/1379394
