In Node.js I'm using the fs.createWriteStream method to append data to a local file. In the Node documentation they mention the drain event when u
Imagine you're connecting two streams with very different bandwidths, say, uploading a local file to a slow server. The (fast) file stream will emit data faster than the (slow) socket stream can consume it.
In this situation, Node.js will keep the data in memory until the slow stream gets a chance to process it. This can become problematic if the file is very large.
To avoid this, stream.write returns false when the underlying system buffer is full. If you stop writing, the stream will later emit a drain event to indicate that the system buffer has emptied and it is appropriate to write again.
You can use pause/resume on the readable stream to control its bandwidth.
Better: you can use readable.pipe(writable), which will do this for you.
EDIT: There's a bug in your code: regardless of what write returns, your data has been written. You don't need to retry it. In your case, you're writing the data twice.
Something like this would work:
var packets = […],
    current = -1;

function niceWrite() {
  current += 1;

  // all packets sent: close the stream
  if (current === packets.length)
    return stream.end();

  var nextPacket = packets[current],
      canContinue = stream.write(nextPacket);

  // wait until the stream drains to continue
  if (!canContinue)
    stream.once('drain', niceWrite);
  else
    niceWrite();
}