How to force download of big files without using too much memory?

自古美人都是妖i submitted on 2019-11-30 12:59:57
King Skippus

There are some ideas over in this thread. I don't know if the readfile() method will save memory, but it sounds promising.
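For reference, a forced-download handler built around readfile() might look something like this; the path and filename below are placeholders, not from the question. readfile() streams the file to the output buffer without ever holding the whole thing in a PHP variable.

    <?php
    $path = '/path/to/bigfile.bin'; // hypothetical location of the big file

    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="bigfile.bin"');
    header('Content-Length: ' . filesize($path));

    // Drop any output buffers so readfile() streams straight to the
    // client instead of accumulating the whole file in memory first.
    while (ob_get_level() > 0) {
        ob_end_clean();
    }

    readfile($path);
    exit;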

You're sending the contents ($data) of this file via PHP?

If so, each Apache process handling this will end up growing to the size of this file, as that data will be cached.

Your ONLY solution is to not send file contents/data via PHP, and instead redirect the user to a URL that the web server can serve directly from the filesystem.

Use a generated and unique symlink, or a hidden location.
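A rough sketch of the unique-symlink idea, assuming a directory under the document root that Apache serves directly; all paths and names below are made up:

    <?php
    $source  = '/data/private/bigfile.bin';  // real file, outside the web root
    $linkDir = '/var/www/html/downloads';    // hypothetical web-served directory
    $token   = bin2hex(random_bytes(16));    // unique, hard-to-guess link name

    symlink($source, $linkDir . '/' . $token . '.bin');

    // Redirect instead of echoing data: Apache serves the file itself
    // (with byte-range support), so PHP never holds the contents in memory.
    header('Location: /downloads/' . $token . '.bin');
    exit;

You would also want a cron job or similar to prune old symlinks, so a generated link doesn't stay valid forever.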

You can't use $data with the whole file's contents inside it. Pass the function the file's path rather than its contents. Send all the headers once; after that, read a piece of the file with fread(), echo that chunk, call flush(), and repeat until the end of the file. If any other header is sent in the meantime, the transfer will end up corrupted.
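In code, that chunked approach could look something like this; the path and the 8 KB chunk size are arbitrary examples:

    <?php
    $path = '/path/to/bigfile.bin'; // pass the path in, not the contents

    // Send all headers once, before any output.
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="bigfile.bin"');
    header('Content-Length: ' . filesize($path));

    $handle = fopen($path, 'rb');
    while (!feof($handle)) {
        echo fread($handle, 8192); // read and emit one chunk at a time
        flush();                   // push the chunk to the client
        // (if output buffering is enabled, ob_flush() may be needed too)
    }
    fclose($handle);
    exit;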

Symlink the big file into your document root (assuming it's not restricted to authorized users only), then let Apache serve it. (That way you get byte-range support as well.)

Emile Swarts

Add your ini_set() calls before session_start();
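The answer doesn't say which setting is meant; assuming it's something like memory_limit, the point is simply the ordering:

    <?php
    ini_set('memory_limit', '512M'); // hypothetical setting; must run first
    session_start();                 // note: lowercase session_start()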
