Question
I'm trying to serve large zip files to users. When there are 2 concurrent connections, the server runs out of memory (RAM). I increased the amount of memory from 300MB to 4GB (Dreamhost VPS) and then it worked fine.
I need to allow many more than 2 concurrent connections. The current 4GB would only allow something like 20 concurrent connections, which is still too few.
The code I'm using needs double the memory of the actual file size, which is too much. I want something like "streaming" the file to the user, so that I never allocate more than the chunk currently being served.
The following code is the one I'm using in CodeIgniter (PHP framework):
ini_set('memory_limit', '300M'); // the maximum amount of memory my server allows
set_time_limit(0); // avoid the server terminating the connection on slow downloads
force_download("download.zip", file_get_contents("../downloads/big_file_80M.zip"));
exit;
The force_download function (CodeIgniter's default helper) is as follows:
function force_download($filename = '', $data = '')
{
    if ($filename == '' OR $data == '')
    {
        return FALSE;
    }

    // Try to determine if the filename includes a file extension.
    // We need it in order to set the MIME type
    if (FALSE === strpos($filename, '.'))
    {
        return FALSE;
    }

    // Grab the file extension
    $x = explode('.', $filename);
    $extension = end($x);

    // Load the mime types
    @include(APPPATH.'config/mimes'.EXT);

    // Set a default mime if we can't find it
    if ( ! isset($mimes[$extension]))
    {
        $mime = 'application/octet-stream';
    }
    else
    {
        $mime = (is_array($mimes[$extension])) ? $mimes[$extension][0] : $mimes[$extension];
    }

    // Generate the server headers
    if (strpos($_SERVER['HTTP_USER_AGENT'], "MSIE") !== FALSE)
    {
        header('Content-Type: "'.$mime.'"');
        header('Content-Disposition: attachment; filename="'.$filename.'"');
        header('Expires: 0');
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header("Content-Transfer-Encoding: binary");
        header('Pragma: public');
        header("Content-Length: ".strlen($data));
    }
    else
    {
        header('Content-Type: "'.$mime.'"');
        header('Content-Disposition: attachment; filename="'.$filename.'"');
        header("Content-Transfer-Encoding: binary");
        header('Expires: 0');
        header('Pragma: no-cache');
        header("Content-Length: ".strlen($data));
    }

    exit($data);
}
I tried some chunk-based code that I found on Google, but the file was always delivered corrupted, probably because the code was buggy.
Could anyone help me?
Answer 1:
There are some ideas over in this thread. I don't know if the readfile() method will save memory, but it sounds promising.
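A minimal sketch of the readfile() approach, reusing the file path from the question (the header set here is illustrative, not the full set from force_download):

if (ob_get_level()) ob_end_clean(); // if output buffering is on, the file would pile up in memory anyway
$path = '../downloads/big_file_80M.zip';
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="download.zip"');
header('Content-Length: '.filesize($path));
readfile($path); // writes the file straight to output without building a PHP string of the whole thing
exit;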
Answer 2:
You're sending the contents ($data) of this file via PHP?
If so, each Apache process handling one of these downloads will end up growing to the size of the file, since the whole thing is held in memory as $data.
Your only real solution is to not send the file contents via PHP at all, and instead redirect the user to a download URL that the web server serves directly from the filesystem.
Use a generated and unique symlink, or a hidden location.
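One rough way to do that; the tmp_links directory and the token scheme are my own assumptions, not part of the answer:

$source = realpath('../downloads/big_file_80M.zip');
$token  = md5(uniqid(mt_rand(), TRUE)); // unique, hard-to-guess link name
$link   = $_SERVER['DOCUMENT_ROOT'].'/tmp_links/'.$token.'.zip'; // hypothetical web-accessible dir
symlink($source, $link);
// Apache streams the file itself, so PHP holds no file data in memory;
// clean up expired symlinks later (e.g. with a cron job)
header('Location: /tmp_links/'.$token.'.zip');
exit;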
Answer 3:
You can't use $data with the whole file's contents inside it. Try passing the function the file's path instead of its contents. Then send all the headers once, and after that read a piece of the file with fread(), echo that chunk, call flush(), and repeat. If any other header gets sent in the meantime, the transfer will end up corrupted.
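A sketch of that loop, assuming the same file path as the question (the 8 KB chunk size is an arbitrary choice):

$path = '../downloads/big_file_80M.zip';
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="download.zip"');
header('Content-Length: '.filesize($path));
$handle = fopen($path, 'rb');
while ( ! feof($handle))
{
    echo fread($handle, 8192); // only this chunk is ever held in memory
    flush();                   // push it out to the client before reading the next one
}
fclose($handle);
exit;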
Answer 4:
Symlink the big file into your document root (assuming it's not a restricted file), then let Apache handle it. That way you get byte-range support for free as well.
Answer 5:
Add your ini_set() call before session_start();
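A two-line illustration of the ordering this answer describes:

ini_set('memory_limit', '300M'); // must come first
session_start();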
Source: https://stackoverflow.com/questions/6195550/how-to-force-download-of-big-files-without-using-too-much-memory