Many users of my site have reported problems downloading a large file (80 MB). I am forcing the download with PHP headers. I can provide additional PHP settings if necessary.
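For illustration, a header-forced download of this kind typically looks like the following sketch (the actual script isn't shown, so the path and filename are placeholders):

    // Hypothetical reconstruction of the kind of script described;
    // the path and filename below are placeholders, not the real code.
    $path = '/data/files/largefile.zip';

    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="largefile.zip"');
    header('Content-Length: ' . filesize($path));

    // Stream in chunks so the whole 80 MB is never held in memory.
    $fp = fopen($path, 'rb');
    while (!feof($fp)) {
        echo fread($fp, 8192);
        flush();
    }
    fclose($fp);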
The problem with PHP-initiated HTTP transfers is that they seldom support partial requests:
    GET /yourfile HTTP/1.1
    Range: bytes=31489531-79837582
Whenever a browser encounters a transmission problem, it will try to resume the download. Your PHP script does not accommodate that (handling Range requests correctly is not trivial, so hardly anybody does it).
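To give an idea of what "not trivial" means, here is a minimal sketch of single-range support only; it deliberately omits suffix ranges (bytes=-500), multiple ranges, and If-Range/ETag validation, and the path is a placeholder:

    $path  = '/data/files/largefile.zip';  // placeholder path
    $size  = filesize($path);
    $start = 0;
    $end   = $size - 1;

    // Handle only the simple "bytes=start-end" / "bytes=start-" forms.
    if (isset($_SERVER['HTTP_RANGE']) &&
        preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
        $start = (int)$m[1];
        if ($m[2] !== '') {
            $end = min((int)$m[2], $size - 1);
        }
        header('HTTP/1.1 206 Partial Content');
        header("Content-Range: bytes $start-$end/$size");
    }

    header('Accept-Ranges: bytes');
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . ($end - $start + 1));

    // Seek to the requested offset and send exactly the requested bytes.
    $fp = fopen($path, 'rb');
    fseek($fp, $start);
    $left = $end - $start + 1;
    while ($left > 0 && !feof($fp)) {
        $chunk = fread($fp, min(8192, $left));
        echo $chunk;
        $left -= strlen($chunk);
        flush();
    }
    fclose($fp);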
So really avoid that approach. Redirect users to a static file and let your web server handle the transfer. If you need to handle authorization, use tricks like symlinks or rewrite rules that check for session cookies, or even a static per-client permission file (e.g. ./allowed/178.224.2.55-file-1). Any required extra HTTP headers can be injected likewise, or with a .meta file.
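A sketch of the permission-file idea (the paths, session key, and naming scheme are made up for illustration): the PHP script verifies the user, drops a marker file keyed to the client's IP, then redirects to the static URL and lets the web server do the heavy lifting.

    // Hypothetical authorization gate; names and paths are illustrative.
    session_start();
    if (empty($_SESSION['may_download'])) {
        http_response_code(403);
        exit('Not authorized');
    }

    // Record that this client IP may fetch the file, then hand off to
    // the web server, which handles Range requests, resumption, and
    // sendfile-style transfers far better than PHP can.
    touch('./allowed/' . $_SERVER['REMOTE_ADDR'] . '-file-1');

    header('Location: /downloads/largefile.zip');
    exit;

On the server side, for example, an Apache RewriteCond testing with -f whether ./allowed/%{REMOTE_ADDR}-file-1 exists can then gate access to the static URL.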
I don't see any trouble, but for S&G's try placing the set_time_limit() call inside the while loop. This ensures the script never hits a hard execution limit: as long as the client keeps taking data, the time limit keeps getting extended.
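Something along these lines (the loop body is assumed, since the original script isn't shown):

    $fp = fopen($path, 'rb');  // $path: placeholder for the file served
    while (!feof($fp)) {
        // Reset the execution-time budget on every chunk, so a slow but
        // still-connected client never trips max_execution_time mid-download.
        set_time_limit(30);
        echo fread($fp, 8192);
        flush();
    }
    fclose($fp);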