Question
I have a large JavaScript object that I want to convert to JSON and write to a file. I thought I could do this using streams, like so:
var fs = require('fs');
var JSONStream = require('JSONStream');
var st = JSONStream.stringifyObject()
    .pipe(fs.createWriteStream('./output_file.js'));
st.write(large_object);
When I try this I get an error:
stream.js:94
throw er; // Unhandled stream error in pipe.
^
TypeError: Invalid non-string/buffer chunk
at validChunk (_stream_writable.js:153:14)
at WriteStream.Writable.write (_stream_writable.js:182:12)
So apparently I can't just write an object to stringifyObject. I'm not sure what the next step is. Do I need to convert the object to a buffer, or run it through some conversion stream and pipe that into stringifyObject?
Answer 1:
JSONStream doesn't work that way, but since your large object is already loaded into memory, streaming it provides no benefit anyway.
var fs = require('fs-extra');
var file = '/tmp/this/path/does/not/exist/file.txt';

// outputJson stringifies the object and creates any missing parent directories.
fs.outputJson(file, {name: 'JP'}, function (err) {
  console.log(err); // => null
});
That will write the JSON.
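If you'd rather avoid the fs-extra dependency, a minimal sketch using only the built-in fs module works too (this assumes the target directory already exists; output.json is just an example path):

var fs = require('fs');

var obj = { name: 'JP' };

// JSON.stringify builds the whole string in memory, which is fine here
// because the object is already fully loaded in memory anyway.
fs.writeFile('./output.json', JSON.stringify(obj), function (err) {
  if (err) throw err;
  console.log('done');
});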
If you want to use JSONStream you could do something like this:
var fs = require('fs');
var jsonStream = require('JSONStream');

var fl = fs.createWriteStream('dat.json');
var out = jsonStream.stringifyObject();
out.pipe(fl);

var obj = { test: 10, ok: true };

// stringifyObject expects [key, value] pairs written to it.
for (var key in obj) out.write([key, obj[key]]);
out.end();
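If you need to know when the data has actually been flushed to disk, one option (a sketch added here, not part of the original answer) is to listen for the write stream's 'finish' event, which fires after out.end() propagates through the pipe:

// Continuing the example above: fl is the file write stream.
fl.on('finish', function () {
  // dat.json has been fully written at this point.
  console.log('dat.json written');
});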
Source: https://stackoverflow.com/questions/32427890/how-to-use-streams-to-json-stringify-large-nested-objects-in-node-js