node.js-stream

How to consume MTOM SOAP web service in node.js?

最后都变了 · submitted on 2020-07-15 08:27:06
Question: I need to download or process a file from a SOAP-based web service in Node.js. Can someone suggest how to handle this in Node.js? I tried the 'node-soap' ('soap') npm module. It worked for a normal SOAP web service, but not for a binary-stream or MTOM-based SOAP web service.

Answer 1: I want to try to answer this... It's quite interesting that, 2 years and 2 months later, I cannot figure out how to easily solve the same problem. I'm trying to get the attachment from a response like: ...

Multiple Consumption of single stream

杀马特。学长 韩版系。学妹 · submitted on 2020-02-26 12:08:26
Question: I want to know whether multiple functions can consume a single stream in Node.js. If yes, how can this be done? Is it possible to pipe to multiple destinations? I want to use the stream in two different functions that run in parallel (I am handling the parallel flow with the async module). So would it be possible to issue the pipe() statement inside each of these functions? Thanks in advance.

Answer 1: Yes, it's possible, easy and common. The following is a piped data stream from a single

What is the difference between local and global module in Node.js? When to use local and global module?

﹥>﹥吖頭↗ · submitted on 2020-02-18 07:21:10
Question: We can access a local module using the require function but cannot access a global module through it. I read somewhere that to use a global module we need to install it locally and then import it through require. So if we cannot access a global module directly, what is the point of having one?

Answer 1: You should: install a module locally if you're going to require() it; install a module globally if you're going to run it on the command line.

Answer 2: In my opinion, the modules which you are going to
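A rough illustration of the answer's rule of thumb (the package names lodash and nodemon are just examples):

```shell
# Local install: code you require() — lands in ./node_modules
npm install lodash

# Global install: tools you run from the shell — lands on your PATH
npm install -g nodemon

# You can see where each kind resolves from:
npm root      # local node_modules directory
npm root -g   # global install directory
```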


Check if a stream is process.stdout

China☆狼群 · submitted on 2020-01-25 22:07:31
Question: Is there an elegant way to determine whether a stream is process.stdout? I am working with streams and would like to end them, but I found that if the stream is process.stdout an error is thrown, because process.stdout is a special stream that cannot be closed. So I want to end all streams except process.stdout. I tried a try/catch, but the process.stdout error ends the Node process, ignoring the try/catch.

Answer 1: Perhaps this is naive of me, but I'd think you can just check

Node fs.readstream() outputs <Buffer 3c 3f 78 6d 6c …> instead of readable data [duplicate]

主宰稳场 · submitted on 2020-01-13 10:34:08
Question: This question already has an answer here: Why does console.log(buffer) give me a hexadecimal list? (1 answer). Closed 6 years ago. I'm reading a large XML file (~1.5 GB) in Node.js. I'm trying to stream it and do something with chunks of data, but I'm finding it difficult to understand the documentation. My current simple code is:

var fs = require('fs');
var stream = fs.createReadStream('xml/bigxmlfile.xml');
stream.on('data', function(chunk){ console.log(chunk) });

The console gives a bunch

How to ensure asynchronous code is executed after a stream is finished processing?

。_饼干妹妹 · submitted on 2020-01-13 10:16:29
Question: I have a stream that I process by listening for the data, error, and end events, and I call a function to process each data event. Naturally, the function processing the data calls other callbacks, making it asynchronous. So how do I start executing more code once the data in the stream has been processed? Listening for the end event does NOT mean the asynchronous data-processing functions have finished. How can I ensure that the stream data-processing functions

NodeJS parseStream, defining a start and end point for a chunk

∥☆過路亽.° · submitted on 2020-01-13 04:53:08
Question: I'm confused by Node's filesystem parsing. Here's my code:

var fs = require('fs'),
    xml2js = require('xml2js');

var parser = new xml2js.Parser();
var stream = fs.createReadStream('xml/bigXML.xml');
stream.setEncoding('utf8');

stream.on('data', function(chunk){
    parser.parseString(chunk, function (err, result) {
        console.dir(result);
        console.log('Done');
    });
});
stream.on('end', function(chunk){
    // the file has been read; do something...
    console.log("IT'S OVER")
});

This causes... nothing to happen.

untarring files to S3 fails, not sure why

a 夏天 · submitted on 2020-01-11 04:03:09
Question: (new information below) I am trying to set up a Lambda function that reacts to uploaded tgz files by uncompressing them and writing the results back to S3. The unzip and untar work fine, but uploading to S3 fails:

/Users/russell/lambda/gzip/node_modules/aws-sdk/lib/s3/managed_upload.js:350
    var buf = self.body.read(self.partSize - self.partBuffer.length) ||
              ^
TypeError: undefined is not a function
    at ManagedUpload.fillStream (/Users/russell/lambda/gzip/node_modules/aws-sdk/lib/s3/managed

node.js http.IncomingMessage does not fire 'close' event

我的梦境 · submitted on 2020-01-01 16:52:50
Question: When does http.IncomingMessage fire its 'close' event? According to the documentation it should occur when the underlying connection is closed. However, it is never fired for the following example code (I made sure it is not caused by keep-alive):

var http = require('http'),
    fs = require('fs');

var server = http.createServer(function(req, res) {
    res.shouldKeepAlive = false;
    req.on("end", function() {
        console.log("request end");
    });
    req.on("close", function() {
        console.log("request