Node.js: Copying a file over a stream is very slow

Backend · Unresolved
小鲜肉 · 2020-12-28 21:54

I am copying a file with Node.js on an SSD under VMware, but the performance is very low. The benchmark I ran to measure the actual speed is as follows:

$ hdparm         


        
1 Answer
  • 2020-12-28 22:21

    I don't know the answer to your question, but perhaps this helps in your investigation of the problem.

    In the Node.js documentation about stream buffering, it says:

    Both Writable and Readable streams will store data in an internal buffer that can be retrieved using writable.writableBuffer or readable.readableBuffer, respectively.

    The amount of data potentially buffered depends on the highWaterMark option passed into the stream's constructor. For normal streams, the highWaterMark option specifies a total number of bytes. For streams operating in object mode, the highWaterMark specifies a total number of objects....

    A key goal of the stream API, particularly the stream.pipe() method, is to limit the buffering of data to acceptable levels such that sources and destinations of differing speeds will not overwhelm the available memory.

    Source: http://www.nodejs.org/api/stream.html#stream_buffering
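
    To make the backpressure mechanism behind stream.pipe() concrete, here is a rough manual sketch (my own illustration, not from the question; the file names are hypothetical). write() returns false once the writable's internal buffer reaches its highWaterMark, and the 'drain' event signals when it is safe to resume reading:

    var fs = require('fs');

    var src = fs.createReadStream('source.bin');   // hypothetical input file
    var dst = fs.createWriteStream('copy.bin');    // hypothetical output file

    src.on('data', function (chunk) {
      // write() returns false once the writable buffer hits highWaterMark
      if (!dst.write(chunk)) {
        src.pause();                     // stop reading until the buffer drains
        dst.once('drain', function () {
          src.resume();
        });
      }
    });

    src.on('end', function () {
      dst.end();                         // flush and close the destination
    });

    This is essentially what pipe() does for you automatically.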

    So, you can play with the buffer sizes to improve speed:

    var fs = require('fs');
    var path = require('path');

    // Source and destination paths are taken from the command line
    var from = path.normalize(process.argv[2]);
    var to = path.normalize(process.argv[3]);

    // 64 KiB buffers; experiment with other powers of two
    var readOpts = {highWaterMark: Math.pow(2, 16)};  // 65536
    var writeOpts = {highWaterMark: Math.pow(2, 16)}; // 65536

    var source = fs.createReadStream(from, readOpts);
    var destiny = fs.createWriteStream(to, writeOpts);

    // pipe() handles backpressure between the two streams
    source.pipe(destiny);
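
    On newer Node.js versions (10 and later), stream.pipeline() can replace the bare pipe() call; it forwards errors from either stream and cleans both up when the copy ends, and the highWaterMark tuning is identical. A minimal sketch, assuming the same command-line arguments as above:

    var fs = require('fs');
    var stream = require('stream');

    var source = fs.createReadStream(process.argv[2], {highWaterMark: Math.pow(2, 16)});
    var destiny = fs.createWriteStream(process.argv[3], {highWaterMark: Math.pow(2, 16)});

    // pipeline() propagates errors and destroys both streams on failure
    stream.pipeline(source, destiny, function (err) {
      if (err) {
        console.error('Copy failed:', err);
      } else {
        console.log('Copy finished');
      }
    });

    If streaming is not actually required, fs.copyFile() hands the whole copy to a single call, which sidesteps the stream buffering question entirely.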
    

    https://nodejs.org/api/stream.html#stream_writable_writablehighwatermark

    https://nodejs.org/api/stream.html#stream_readable_readablehighwatermark

    https://nodejs.org/api/fs.html#fs_fs_createreadstream_path_options
