node.js-stream

How to wrap a buffer as a stream2 Readable stream?

Submitted by 给你一囗甜甜゛ on 2019-12-27 17:12:01

Question: How can I transform a Node.js buffer into a Readable stream using the stream2 interface? I already found this answer and the stream-buffers module, but that module is based on the stream1 interface.

Answer 1: With streamifier you can convert strings and buffers to readable streams with the new stream API.

Answer 2: The easiest way is probably to create a new PassThrough stream instance and simply push your data into it. When you pipe it to other streams, the data will be pulled out of the …
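For reference, a minimal sketch of the PassThrough approach from Answer 2, assuming the data already lives in a Buffer named buf (the variable name is illustrative, not from the question):

    var stream = require('stream');

    // Wrap an existing Buffer as a readable stream.
    var bufferStream = new stream.PassThrough();
    bufferStream.end(buf);   // write the whole buffer and signal end-of-stream

    // bufferStream now behaves like any other Readable and can be piped onward.
    bufferStream.pipe(process.stdout);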

What are the purposes of vinyl-buffer and gulp-streamify in gulp?

Submitted by 你。 on 2019-12-18 14:12:30

Question: As the documentation says, they both deal with transforming non-stream plugins into streams. What I'm trying to understand is: if I can call .pipe() on something, doesn't that mean it is already a stream? If so, what gets converted into what here? vinyl-source-stream example (from https://www.npmjs.com/package/vinyl-buffer):

    var browserify = require('browserify')
    var source = require('vinyl-source-stream')
    var buffer = require('vinyl-buffer')
    var uglify = require('gulp-uglify')
    var size = require('gulp…
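To illustrate the distinction, here is a hedged sketch of the typical browserify-plus-gulp pipeline (the task name and file paths are placeholders, not taken from the question): browserify emits a plain text stream, vinyl-source-stream wraps it as a vinyl file whose contents are still streaming, and vinyl-buffer converts those contents to a buffer so buffer-only plugins such as gulp-uglify can operate on them.

    var gulp = require('gulp');
    var browserify = require('browserify');
    var source = require('vinyl-source-stream');
    var buffer = require('vinyl-buffer');
    var uglify = require('gulp-uglify');

    gulp.task('bundle', function () {
      return browserify('./src/index.js')
        .bundle()                    // plain stream of bundled JavaScript text
        .pipe(source('bundle.js'))   // wrap as a streaming vinyl file object
        .pipe(buffer())              // convert streaming contents to a Buffer
        .pipe(uglify())              // gulp-uglify only accepts buffered contents
        .pipe(gulp.dest('./dist'));
    });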

Node.js Piping the same readable stream into multiple (writable) targets

Submitted by 与世无争的帅哥 on 2019-12-17 02:43:11

Question: I need to run two commands in series that both need to read data from the same stream. After piping a stream into another, its buffer is drained, so I can't read data from that stream again, and this doesn't work:

    var spawn = require('child_process').spawn;
    var fs = require('fs');
    var request = require('request');

    var inputStream = request('http://placehold.it/640x360');

    var identify = spawn('identify', ['-']);
    inputStream.pipe(identify.stdin);

    var chunks = [];
    identify.stdout.on('data', function (chunk) …
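A commonly suggested pattern (a sketch, not the accepted answer verbatim) is to pipe the source into one PassThrough stream per consumer, so each command receives its own copy of the data; the second command here (convert) is only a placeholder:

    var stream = require('stream');
    var spawn = require('child_process').spawn;
    var request = require('request');

    var inputStream = request('http://placehold.it/640x360');

    // One PassThrough per consumer; a Readable may be piped to several targets.
    var forIdentify = new stream.PassThrough();
    var forConvert = new stream.PassThrough();
    inputStream.pipe(forIdentify);
    inputStream.pipe(forConvert);

    var identify = spawn('identify', ['-']);
    var convert = spawn('convert', ['-', 'png:-']);
    forIdentify.pipe(identify.stdin);
    forConvert.pipe(convert.stdin);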

Using nodejs server with request package and function pipe()?

Submitted by 匆匆过客 on 2019-12-12 20:40:11

Question: I'm using a Node.js server to mock up a backend at the moment. The server is a web server that returns JSON objects for different requests, and it works flawlessly. Now I have to get the JSON objects from another domain, so I have to proxy the server. I found a package called request on npm. I can get the simple example to work, but I have to forward the whole web page. My code for the proxy looks like this:

    var $express = require('express'),
        $http = require('http'),
        $request = require('request'), …
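For context, a minimal proxy sketch built on express and request; the upstream base URL and port below are placeholders, not values from the question:

    var express = require('express');
    var request = require('request');

    var app = express();

    // Forward every incoming request to the other domain and stream the
    // upstream response straight back to the client.
    app.use(function (req, res) {
      req.pipe(request('http://other-domain.example.com' + req.url)).pipe(res);
    });

    app.listen(3000);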

node.js createWriteStream doesn't create new file on Heroku

Submitted by 僤鯓⒐⒋嵵緔 on 2019-12-11 03:53:17

Question: I have the following code, which works fine on my localhost running Node.js 0.12.0. The code creates a new file and copies data into it from a readable stream, but on Heroku the new file is never created.

    var output = fs.createWriteStream('public/images/test/testfile.png');
    readable.pipe(output);

I thought it had something to do with permissions, but whenever I change the permissions on the folder using heroku run bash and then chmod -R 777 images/, Heroku resets them back to the original permissions, which are drwx------ …
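Two hedged points worth checking (a sketch, not a confirmed fix for this particular question): fs.createWriteStream does not create missing parent directories, so it helps to create the directory first and to attach an 'error' handler so an ENOENT or EACCES failure is visible; and Heroku's dyno filesystem is ephemeral, so files written at runtime do not persist across restarts or between dynos.

    var fs = require('fs');
    var path = require('path');

    var dir = path.join(__dirname, 'public', 'images', 'test');
    fs.mkdirSync(dir, { recursive: true });   // the recursive option needs Node >= 10.12

    var output = fs.createWriteStream(path.join(dir, 'testfile.png'));
    output.on('error', function (err) {
      console.error('write failed:', err);    // surfaces ENOENT/EACCES instead of a silent miss
    });
    readable.pipe(output);                    // readable is the stream from the question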

Must I repeatedly call readable.read() within a readable event handler?

Submitted by 邮差的信 on 2019-12-10 17:24:16

Question: Suppose I have created a transform stream called Parser which can be written to like a normal stream but is read from as an object stream. I am using the readable event in the code that consumes this transform stream:

    var parser = new Parser();
    parser.on('readable', function () {
      var data = parser.read();
      console.log(data);
    });

In this event handler, must I repeatedly call parser.read()? Or will readable fire on its own for every single object being pushed from my transform stream?

Answer 1: …
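The conventional pattern (a sketch of the usual convention, not the quoted answer) is to drain the stream inside each 'readable' event by calling read() in a loop until it returns null, since one 'readable' event may cover several buffered objects:

    var parser = new Parser();   // the Parser transform stream from the question

    parser.on('readable', function () {
      var data;
      // Keep reading until the internal buffer is empty for this event.
      while ((data = parser.read()) !== null) {
        console.log(data);
      }
    });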

Node.js net library: getting complete data from 'data' event

Submitted by 不问归期 on 2019-12-07 01:25:09

Question: I've searched around and either can't find the exact question I'm trying to answer, or I need someone to explain it to me like I'm five. Basically, I have a Node.js script using the net library. I'm connecting to multiple hosts, sending commands, and listening for the returned data.

    var net = require('net');
    var nodes = [
      'HOST1,192.168.179.8',
      'HOST2,192.168.179.9',
      'HOST3,192.168.179.10',
      'HOST4,192.168.179.11'
    ];

    function connectToServer(tid, ip) {
      var conn = net.createConnection(23, ip);
      conn…
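Because TCP is a byte stream, a 'data' event carries an arbitrary slice of the response rather than a complete message. A hedged sketch of the usual fix is to accumulate chunks per connection and split on whatever terminator the device's protocol uses; a newline is assumed here purely as a placeholder:

    var net = require('net');

    function connectToServer(tid, ip) {
      var conn = net.createConnection(23, ip);
      var buffered = '';

      conn.on('data', function (chunk) {
        buffered += chunk.toString();
        var lines = buffered.split('\n');
        buffered = lines.pop();             // keep any trailing partial line
        lines.forEach(function (line) {
          console.log(tid + ': ' + line);   // each complete line from the host
        });
      });
    }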

How to ensure asynchronous code is executed after a stream is finished processing?

Submitted by ぐ巨炮叔叔 on 2019-12-05 09:59:55

I have a stream that I process by listening for the data, error, and end events, and I call a function to process each data event in the first stream. Naturally, the function processing the data calls other callbacks, which makes it asynchronous. So how do I start executing more code once the data in the stream has been processed? Listening for the end event on the stream does NOT mean the asynchronous data-processing functions have finished. How can I ensure that the stream's data-processing functions are finished before I execute my next statement? Here is an example:

    function updateAccountStream …
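One common approach (a sketch with illustrative names, not the original updateAccountStream code) is to count the in-flight asynchronous operations and run the continuation only after the stream has ended and that counter has dropped back to zero:

    function processStream(stream, processItem, callback) {
      var pending = 0;
      var ended = false;
      var called = false;

      function finish(err) {
        if (called) return;
        called = true;
        callback(err);
      }

      stream.on('data', function (item) {
        pending++;
        processItem(item, function (err) {   // processItem is the async worker per data event
          if (err) return finish(err);
          pending--;
          if (ended && pending === 0) finish();
        });
      });

      stream.on('error', finish);
      stream.on('end', function () {
        ended = true;
        if (pending === 0) finish();         // all callbacks may have returned already
      });
    }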

NodeJS parseStream, defining a start and end point for a chunk

Submitted by ε祈祈猫儿з on 2019-12-04 14:26:57

I'm confused by Node's filesystem parsing. Here's my code:

    var fs = require('fs'),
        xml2js = require('xml2js');

    var parser = new xml2js.Parser();
    var stream = fs.createReadStream('xml/bigXML.xml');
    stream.setEncoding('utf8');

    stream.on('data', function (chunk) {
      parser.parseString(chunk, function (err, result) {
        console.dir(result);
        console.log('Done');
      });
    });

    stream.on('end', function (chunk) {
      // The file has been read completely; do something...
      console.log("IT'S OVER");
    });

This causes... nothing to happen. There is no output from xml2js/the parser at all. When I try to console.log(chunk), it seems that the chunks aren…
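Each 'data' chunk is an arbitrary slice of the file, so on its own it is not a well-formed XML document and xml2js cannot parse it. A hedged sketch of the simplest workaround is to accumulate the whole file and parse once on 'end' (a dedicated streaming XML parser would be the alternative for files too large to buffer):

    var fs = require('fs');
    var xml2js = require('xml2js');

    var parser = new xml2js.Parser();
    var stream = fs.createReadStream('xml/bigXML.xml');
    stream.setEncoding('utf8');

    var xml = '';
    stream.on('data', function (chunk) {
      xml += chunk;                          // a single chunk is not valid XML by itself
    });
    stream.on('end', function () {
      parser.parseString(xml, function (err, result) {
        if (err) return console.error(err);
        console.dir(result);
        console.log('Done');
      });
    });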