How do I convert a stream into a buffer in Node.js? Here is my code to parse a file in a POST request in Express:
app.post('/upload', express.multipart({
    defer: true
}), function (req, res) {
    // ...
});
Instead of piping, you can attach data and end event handlers to the part stream to read it:
var fs = require('fs');

var buffers = [];
part.on('data', function (buffer) {
    buffers.push(buffer);
});
part.on('end', function () {
    var buffer = Buffer.concat(buffers);
    // ... do your stuff ...
    // write to file:
    fs.writeFile('image/' + part.filename, buffer, function (err) {
        // handle error, return response, etc...
    });
});
However, this will read the entire upload into memory. If that's an issue, you might want to consider creating a custom transform stream to transform the incoming data, but that might not be trivial.
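As a minimal sketch of that idea (assuming the same part stream as above; the per-chunk processing is a placeholder you would replace with your own logic):

var stream = require('stream');
var fs = require('fs');

var transformer = new stream.Transform();
transformer._transform = function (chunk, encoding, callback) {
    // Process each chunk here instead of buffering the whole upload,
    // e.g. hash it, resize it, re-encode it...
    this.push(chunk); // pass the (possibly modified) chunk downstream
    callback();
};

part.pipe(transformer).pipe(fs.createWriteStream('image/' + part.filename));

This keeps memory usage bounded, since each chunk is handled and released as it arrives rather than accumulated.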
You can use the stream-to module, which can convert a readable stream's data into an array or a buffer:
var streamTo = require('stream-to');

req.form.on('part', function (part) {
    streamTo.buffer(part, function (err, buffer) {
        // Insert your business logic here
    });
});
If you want a better understanding of what's happening behind the scenes, you can implement the logic yourself using a Writable stream. As a writable stream implementor, you only have to define one function: the _write method, which is called with (chunk, encoding, callback) every time some data is written to the stream; you must invoke the callback once you've handled the chunk, so the stream knows it can accept more data. When the input stream is finished emitting data, the writable's finish event will be emitted: we'll then create a single buffer using the Buffer.concat method.
var stream = require('stream');

var converter = new stream.Writable();

converter.data = []; // We'll store all the received chunks in this array

converter._write = function (chunk, encoding, callback) {
    this.data.push(chunk);
    callback(); // Signal that we're ready for the next chunk
};

converter.on('finish', function () { // Emitted once the input stream has ended, i.e. no more data will be provided
    var b = Buffer.concat(this.data); // Create a single buffer from all the received chunks
    // Insert your business logic here
});
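To wire the converter up, pipe the part stream into it; the finish handler above then runs once the upload has been fully received (a sketch, assuming the same req.form setup as in the previous example):

req.form.on('part', function (part) {
    part.pipe(converter);
});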