I am trying to upload files through my web app using the following code.
The FormData API encodes data in base64, which adds 33% extra overhead. Instead of sending FormData, send the file directly:
app.service('fileUpload', function ($http) {
    this.uploadFileToUrl = function (url, file) {
        // Removed: var fd = new FormData();
        // Removed: fd.append('file', file);
        // Removed: return $http.post(url, fd, {
        return $http.post(url, file, {
            transformRequest: angular.identity,
            headers: { 'Content-Type': undefined }
        });
    };
});
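For completeness, here is a minimal sketch of calling the service from a controller. The controller name and the '/api/upload' endpoint are placeholders, not from the original question:

app.controller('UploadController', function ($scope, fileUpload) {
    $scope.upload = function (file) {
        // '/api/upload' is a placeholder endpoint
        fileUpload.uploadFileToUrl('/api/upload', file)
            .then(function (response) {
                console.log('Upload succeeded:', response.status);
            })
            .catch(function (error) {
                console.log('Upload failed:', error.status);
            });
    };
});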
When the browser sends FormData, it uses Content-Type: multipart/form-data and encodes each part using base64. When the browser sends a file (or blob) directly, it sets the Content-Type header to the MIME type of the file (or blob) and puts the raw binary data in the body of the request.
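On the server, that means reading the raw request body instead of parsing multipart form data. As a sketch, assuming an Express backend with body-parser (the route and output file name are placeholders):

var express = require('express');
var bodyParser = require('body-parser');
var fs = require('fs');

var app = express();

// Accept any content type as a raw Buffer; the 10mb cap is arbitrary
app.post('/api/upload', bodyParser.raw({ type: '*/*', limit: '10mb' }), function (req, res) {
    // req.body is a Buffer holding the raw file bytes
    fs.writeFile('upload.bin', req.body, function (err) {
        if (err) return res.sendStatus(500);
        res.sendStatus(200);
    });
});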
<input type="file">
to work with ng-model
2Out of the box, the ng-model
directive does not work with input type="file"
. It needs a directive:
app.directive("selectNgFile", function() {
return {
require: "ngModel",
link: function postLink(scope,elem,attrs,ngModel) {
elem.on("change", function(e) {
var files = elem[0].files[0];
ngModel.$setViewValue(files);
})
}
}
});
Usage:
<input type="file" select-ng-file ng-model="rsdCtrl.viewData.file" name="file"/>
I use this workaround...
HTML:
<input type="file" style="display:none" value="" id="uploadNewAttachment"/>
JavaScript:
In JavaScript, you can read the selected file with one of three FileReader methods; the third, readAsArrayBuffer, is used here:
var binBlob = []; // If you use AngularJS, better to keep this out of the DOM
var fi = document.getElementById('uploadNewAttachment');
fi.onchange = function (e) {
    var r = new FileReader();
    r.onloadend = function (ev) {
        binBlob[binBlob.length] = ev.target.result; // one ArrayBuffer per file read
    };
    //r.readAsDataURL(e.target.files[0]);      // Very slow due to Base64 encoding
    //r.readAsBinaryString(e.target.files[0]); // Slow, and may yield chars incompatible with AJAX and the PHP side
    r.readAsArrayBuffer(e.target.files[0]);    // Fast and Furious!
};
$(fi).trigger('click'); // open the file picker for the hidden input
What we have on the JavaScript side is a Uint8Array of bytes with values from 0 to 255 (or an Int8Array with values from -128 to 127). When this array is serialized and sent via AJAX, it is inflated by minus signs and commas, which increases the total number of bytes sent. For example:
[123, 38, 98, 240, 136, ...] or worse: [-123, 38, -81, 127, -127, ...]
As you can see, the textual representation transmits far more characters than there are bytes.
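To make the difference concrete, here is a quick comparison using the bytes from the example above (this snippet is illustrative, not part of the original workaround):

var bytes = new Uint8Array([123, 38, 98, 240, 136]);
var asDecimal = Array.from(bytes).join(',');   // "123,38,98,240,136" -> 17 chars for 5 bytes
var asHex = Array.from(bytes)
    .map(function (b) { return b.toString(16).padStart(2, '0'); })
    .join('');                                 // "7b2662f088" -> 10 chars for 5 bytes
console.log(asDecimal.length, asHex.length);   // 17 10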
We can instead convert each buffer to a string of hexadecimal digits before sending it over AJAX:
var hexBlob = [];
for (var idx = 0; idx < binBlob.length; idx++) {
    // Wrap each ArrayBuffer in a Uint8Array to read its bytes
    var ex = Array.from(new Uint8Array(binBlob[idx]));
    for (var i = 0; i < ex.length; i++) {
        // Two lowercase hex digits per byte, zero-padded
        ex[i] = ex[i].toString(16).padStart(2, '0');
    }
    hexBlob[idx] = ex.join('');
}
What you have now is a string of hex digits, exactly two characters per byte. For example:
3a05f4c9...
This uses fewer characters than the comma-separated decimal form of a signed or unsigned JavaScript array.
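The original answer does not show the AJAX call itself; since it already uses jQuery for the click trigger, a sketch with $.ajax could look like this ('upload.php' and the parameter name are assumptions):

$.ajax({
    url: 'upload.php',
    method: 'POST',
    // jQuery serializes the array as hexBlob[0], hexBlob[1], ...
    // so PHP receives it as $_POST['hexBlob']
    data: { hexBlob: hexBlob },
    success: function (response) {
        console.log('Upload complete:', response);
    }
});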
PHP: On the PHP side, you can decode this array directly to binary data using pack():

for ($idx = 0; $idx < count($hexBlob); $idx++) {
    // ...
    // 'H*' converts a string of hex digits back to raw bytes
    $binData = pack('H*', $hexBlob[$idx]);
    $bytesWritten = file_put_contents($path.'/'.$fileName[$idx], $binData);
    //...
}
This solution worked very well for me.
I'll take a chance and assume you are using bodyParser as middleware. bodyParser has a default limit of 100kb. Look at node_modules/body-parser/lib/types/urlencoded.js:
var limit = typeof options.limit !== 'number'
? bytes(options.limit || '100kb')
: options.limit
You can change the limit in your app.js:

var bodyParser = require('body-parser');
...
app.use(bodyParser.urlencoded({ limit: 1048576 })); // 1mb
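Note that urlencoded only covers form-encoded bodies. If you post the payload as JSON instead (for example, the hex strings from the answer above), raise the JSON parser's limit as well; a minimal sketch:

app.use(bodyParser.json({ limit: 1048576 })); // 1mb, applies to 'Content-Type: application/json' bodies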