Node.js - http.request() problems with connection pooling


You have to consume the response.

Remember, in v0.10, we landed streams2. That means that data events don't happen until you start looking for them. So, you can do stuff like this:

var http = require('http');

http.createServer(function(req, res) {
  // this does some I/O, async
  // in 0.8, you'd lose data chunks, or even the 'end' event!
  lookUpSessionInDb(req, function(er, session) {
    if (er) {
      res.statusCode = 500;
      res.end("oopsie");
    } else {
      // no data lost
      req.on('data', handleUpload);
      // end event didn't fire while we were looking it up
      req.on('end', function() {
        res.end('ok, got your stuff');
      });
    }
  });
});

However, the flip side of streams that don't lose data when you're not reading them is that they actually don't lose data if you're not reading them! That is, they start out paused, and you have to read them to get anything out.
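For instance, here's a minimal sketch of that paused-by-default behavior (the URL and the one-second delay are just illustrative): even though nobody is listening when the response first arrives, no chunks are lost; they sit in the stream's buffer until you start reading.

var http = require('http');

http.get('http://www.google.com/', function (res) {
  // At this point the response is paused: no 'data' events fire yet,
  // and incoming bytes just accumulate in the stream's internal buffer.
  setTimeout(function () {
    // Attaching a 'data' listener (or calling res.resume()) starts the
    // flow, and the buffered chunks are delivered from the beginning.
    res.on('data', function (chunk) {
      console.log('got %d bytes', chunk.length);
    });
    res.on('end', function () {
      console.log('done');
    });
  }, 1000);
});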

So, what's happening in your test is that you're making a bunch of requests and never consuming the responses; eventually the socket gets killed by Google because nothing is happening on it, and it assumes you've died.

There are some cases where it's impossible to consume the incoming message: that is, if you don't add a response event handler on a request, or where you completely write and finish the response message on a server without ever reading the request. In those cases, we just dump the data in the garbage for you.

However, if you are listening to the 'response' event, it's your responsibility to handle the object. Add a response.resume() in your first example, and you'll see it processes on through at a reasonable pace.
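For example, here's a sketch of a pooled-client loop along those lines (the host, request count, and makeRequest helper are illustrative, not taken from your code): calling res.resume() drains each response so the pooled socket is freed for the next request.

var http = require('http');

var agent = new http.Agent();
agent.maxSockets = 1; // one pooled socket, reused serially

function makeRequest(remaining) {
  if (remaining === 0) return;
  http.get({ host: 'www.google.com', agent: agent }, function (res) {
    console.log('status:', res.statusCode);
    // Consume the body so the socket is released back to the pool.
    // Without this, the response stays paused, the socket stays tied up,
    // and the remaining requests stall until the server gives up on us.
    res.resume();
    res.on('end', function () {
      makeRequest(remaining - 1);
    });
  });
}

makeRequest(10);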
