Question
I'm writing a Node.js application whose stdout is piped to a file. I write everything with console.log. After a while the application reaches the 1 GB memory limit and stops. Interestingly, if I use console.error instead of console.log, memory usage stays low and the program runs fine. So it looks like Node.js can't flush the stdout stream fast enough, and everything is kept in memory. I want to keep stderr free for errors.
My question is:
Is there a way to write to stdout in a blocking fashion? Or at least, can I write to stdout with a callback, so I can make sure I'm not writing too much?
Thanks!
Answer 1:
If you really, really want synchronous writes to stdout, you can do:
var fs = require('fs');
fs.writeSync(1, "Foo\n"); // fd 1 is stdout; blocks until the data is handed to the OS
fs.fsyncSync(1);          // caution: may throw EINVAL when stdout is a pipe, since pipes can't be fsync'd
Answer 2:
Write using process.stdout.write; its return value tells you whether the data fit in the stream's internal buffer. If it returns false, the stream is backed up: stop writing and continue when process.stdout emits the 'drain' event.
If you want your code to look synchronous, use streamlinejs as described here: Node.js stdout flush
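A sketch of that write/'drain' pattern (the writeAll helper and its name are my own, written against any writable stream rather than stdout specifically):

```javascript
// Write an array of chunks to a writable stream, respecting
// backpressure: when write() returns false, stop and wait for
// the 'drain' event before writing the next chunk.
function writeAll(writable, chunks, callback) {
  var i = 0;
  function writeNext() {
    while (i < chunks.length) {
      var ok = writable.write(chunks[i++]);
      if (!ok) {
        // buffer is full; resume once it has drained
        writable.once('drain', writeNext);
        return;
      }
    }
    callback();
  }
  writeNext();
}

// e.g. writeAll(process.stdout, lines, function () { /* done */ });
```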
Answer 3:
Don't.
What you want to do is pause() your input when the output's buffer is full, the way the pump() method does, then resume() it once you've got space to write again. Otherwise your process balloons to a gargantuan size.
For that, you probably want to work with the output stream directly via write(), rather than console.log().
Answer 4:
A synchronous-looking print function that also works with pipes (a.k.a. FIFOs), using async/await. Make sure you always call print as "await print":
let printResolver;
process.stdout.on('drain', function () {
    // stdout's buffer has emptied; let the pending print() continue
    if (printResolver) {
        printResolver();
        printResolver = null; // clear it so a stale resolver isn't reused
    }
});
async function print(str) {
    // write() returns false when the data had to be buffered
    var done = process.stdout.write(str);
    if (!done) {
        // wait for 'drain' so awaiting callers never outrun the stream
        await new Promise(function (resolve) {
            printResolver = resolve;
        });
    }
}
Source: https://stackoverflow.com/questions/6471004/how-can-i-write-blocking-in-stdout-with-node-js