Question
I need your help. I want to create an upload script with HTML, jQuery and PHP. Is it possible to write a script that can upload very large files (> 5 GB)?
I've tried it with FileReader, FormData and Blobs, but even with these I can't upload large files (my browser crashes after selecting a large file).
PS: I want to write it myself, so please don't post any finished scripts.
Regards
Answer 1:
Yes. More than a year ago I wrote a PHP script that uploads files of exactly 5 GB.
FileReader, FormData and Blobs will fail here because they all pre-process and convert the file in JavaScript before the upload starts, which pulls the whole file into memory.
You can, however, easily upload a large file with a plain XMLHttpRequest:
var xhr = new XMLHttpRequest();
xhr.open("POST", "/upload");
xhr.send(document.forms[0]['fileinput'].files[0]);
Sending a File object directly is supported by current Chrome and Firefox (XMLHttpRequest Level 2). Note, however, that it sends the raw file content as-is, not as multipart/form-data, so the usual form fields are absent. You will need to set your own HTTP headers to provide additional information:
var xhr = new XMLHttpRequest(),
    file = document.forms[0]['fileinput'].files[0];
xhr.open("POST", "/upload");
xhr.setRequestHeader("X-File-Name", encodeURIComponent(file.name));
xhr.setRequestHeader("X-File-Size", file.size);
xhr.send(file);
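Putting the pieces above together, a small helper might look like the sketch below. The "/upload" endpoint and the buildFileHeaders helper are illustrative assumptions; the header names match the snippet above.

```javascript
// Build the custom metadata headers described above from a file's
// name and size. Kept as a pure function so it is easy to test.
function buildFileHeaders(name, size) {
  return {
    "X-File-Name": encodeURIComponent(name),
    "X-File-Size": String(size)
  };
}

// Browser-only part: send the raw file body with the custom headers.
// "/upload" is a placeholder endpoint, not a real API.
function uploadFile(file, url) {
  var xhr = new XMLHttpRequest();
  xhr.open("POST", url);
  var headers = buildFileHeaders(file.name, file.size);
  for (var h in headers) {
    xhr.setRequestHeader(h, headers[h]);
  }
  xhr.upload.onprogress = function (e) {
    if (e.lengthComputable) {
      console.log(Math.round(100 * e.loaded / e.total) + "%");
    }
  };
  xhr.send(file); // raw bytes, not multipart/form-data
  return xhr;
}
```

Because the file object is handed to send() untouched, the browser streams it from disk instead of reading it into memory first, which is what makes multi-gigabyte uploads feasible.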
PS: Well, actually it was not pure PHP; it was a mix of PHP and a Java Servlet.
Answer 2:
The problem is that it's not really practical. First, you have to restart the upload whenever the browser runs into a problem, and with a big file that is likely to happen at some point.
Here is another solution with Ajax:
php uploading large files
AX-JQuery Uploader
Answer 3:
Look into "chunking", possibly with a plugin like AX Ajax multi uploader, which should help with both client- and server-side file size limits.
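A minimal sketch of the chunking idea, independent of any plugin: compute the byte ranges up front, then cut each range out of the File with Blob.slice() and send it as its own request. The chunk size, the "/upload-chunk" endpoint and the X-Chunk-* header names are illustrative assumptions.

```javascript
// Compute [start, end) byte ranges covering a file of the given size.
function chunkRanges(fileSize, chunkSize) {
  var ranges = [];
  for (var start = 0; start < fileSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, fileSize)]);
  }
  return ranges;
}

// Browser-only part: one request per chunk, so a failed chunk can be
// retried without restarting the whole multi-gigabyte transfer.
function uploadInChunks(file, url, chunkSize) {
  chunkRanges(file.size, chunkSize).forEach(function (range, i) {
    var blob = file.slice(range[0], range[1]); // lazy view, no full read
    var xhr = new XMLHttpRequest();
    xhr.open("POST", url);
    xhr.setRequestHeader("X-Chunk-Index", String(i));
    xhr.setRequestHeader("X-Chunk-Start", String(range[0]));
    xhr.send(blob);
  });
}
```

The server then reassembles the chunks by offset. Chunking also sidesteps per-request size limits, since each request only carries one chunk.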
Answer 4:
Keep in mind that it is important to adjust the php.ini setting that controls script timing, max_execution_time (the maximum execution time of each script, in seconds), in order to prevent your script from timing out, because uploading large files is, as you know, time-consuming. Also check max_input_time, the maximum amount of time each script may spend parsing request data. It is a good idea to limit this time on production servers to cut off unexpectedly long-running scripts, but in your case you may need to increase it.
Consider also raising memory_limit, upload_max_filesize and post_max_size.
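The settings above all live in php.ini. A hedged example for uploads of around 5 GB might look like this; the exact values are assumptions and depend on your server and bandwidth:

```ini
; php.ini - illustrative values for very large uploads
max_execution_time = 7200     ; seconds before the script is killed
max_input_time = 7200         ; seconds allowed for parsing request data
memory_limit = 256M           ; script memory; uploads are buffered to disk
upload_max_filesize = 6G      ; must exceed the largest expected file
post_max_size = 6G            ; must be at least upload_max_filesize
```

Note that post_max_size must be greater than or equal to upload_max_filesize, otherwise the upload is rejected before your script ever runs.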
Answer 5:
I don't think web uploads were designed for files of 5 GB and up, or that the browser will transfer that much data happily. File system limitations can also be an issue. You should rethink the upload depending on the usage scenario. Is the web the only option? FTP, streaming or remote dumping are probably better solutions that will not tie up your web server/page during the transfer. HTTP is not the best protocol for this.
Remember that the browser, PHP and Apache all have limited memory. My antivirus warns me when Chrome uses more than 250 MB per page (which is not considered normal). PHP has a default memory limit of 128 MB, and imagine 100 simultaneous Apache users each uploading 5 GB files. That is why FTP was invented.
Why do you think those limits exist in PHP, Apache and so on? Because unbounded uploads are an attack vector, a security issue and an easy way of blocking the server that can be exploited by... everybody.
Source: https://stackoverflow.com/questions/13122218/upload-very-large-files5gb