Parsing large XML (500M) with node.js

Submitted by 妖精的绣舞 on 2019-12-20 11:35:16

Question


I am using isaacs' SAX parser to parse a huge XML file, as also recommended by La Gentz.

The process uses about 650M of memory. How can I reduce this, or allow Node to use even more?

FATAL ERROR: CALL_AND_RETRY_0 Allocation failed - process out of memory

My XML file is larger than 300M, and it could grow to 1GB.


Answer 1:


You should stream the file into the parser, that's the whole point of a streaming parser after all.

var fs = require('fs');
var parser = require('sax').createStream(true, {}); // strict mode, default options
fs.createReadStream(file).pipe(parser); // 'file' is the path to your XML


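For the other half of the question (letting Node use more memory), V8's heap limit can be raised with the `--max-old-space-size` flag, in megabytes. The value and the entry-point name below are illustrative assumptions:

```shell
# Raise V8's old-space heap limit to 4 GB (value is in megabytes).
# app.js is a hypothetical entry point for the parser script.
node --max-old-space-size=4096 app.js
```

Streaming should still be the first fix; raising the heap only postpones the out-of-memory error if the whole document is being buffered.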
Source: https://stackoverflow.com/questions/8707255/parsing-large-xml-500m-with-node-js
