Why does my node.js app get “hung up” when I send more than 50000 queries simultaneously?


Question


Moving to another question

This is the final question on this topic. I removed Express from the equation and generally cleaned up the question.


Before you say something like "fool, do them in series", I want to mention that this code will never be used in production. I googled for a few days, ran tests, and so on, and now I need some help.

If you wish to see a higher-level question - here it is.

Test rig:

  • Windows 10
  • Mongo 3.2.11
  • Node 6.11.3
  • Express 4.15.4

Problem:

I have 100000 elements which I want to check against some collection with find/findOne/whatever.

While I am doing this:

Mongo is fine - checked with logs, mongostat, and by simply connecting to it and querying things.

Node is partially fine - a delayed console.log works fine, but if I try to open my app in a browser (that is why I included the express framework in the tags section), it keeps loading until I run into the server timeout.

This is the simplest possible handler for the page, without any connection to the database or anything like it:

router.get('/',function(req, res) {
    res.send({result:"OK"});
});

And it hangs. I even catch the server timeout manually to check this.

The question is: why? Everything is asynchronous, and all the "resources" are fine and working. What is the problem?

P.S. mongostat shows interesting things - first it shows 2000-3000 queries being added per line, then 0, 0, 0 ... then a few thousand again, and then 0 again and again. Here is a paste


Here is my code, if you want to test it exactly like I do:

    // `async` is the caolan/async library and `db` wraps the shared
    // MongoClient instance; both are set up elsewhere in the app.
    var t = setTimeout(function () {
        console.log("Timeout fired in 30 seconds");
    }, 30 * 1000);

    // Build 100000 random hex strings to use as test lookups.
    var testArtists = [];
    for (var i = 0; i < 100000; i++) {
        testArtists[i] = Math.floor(Math.random() * 16777215).toString(16);
    }

    // Fire one findOne per element - all 100000 start at once.
    async.map(testArtists, function (artist, callback) {
        db.get().collection('someCollection').findOne(
            { "unimportant": artist },
            function (err, discogsArtist) {
                if (err) return callback(err);
                return callback(null, "OK");
            });
    }, function (err, results) {
        if (err) return console.log(err);
        return console.log("somehow finished");
    });
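A common way to avoid flooding the driver like this is to cap how many queries are in flight at once (the async library's mapLimit does exactly that). As an illustration, here is a self-contained sketch of such a limiter in plain Node, with setImmediate standing in for the findOne call; the mapLimit name, the fake task, and the limit values are only for the example:

```javascript
// Minimal sketch of bounded concurrency: run at most `limit` tasks
// at a time instead of starting all of them at once.
function mapLimit(items, limit, task, done) {
    var results = new Array(items.length);
    var next = 0, active = 0, finished = false;

    if (items.length === 0) return done(null, results);

    function launch() {
        // Start tasks until the concurrency cap or the input is exhausted.
        while (active < limit && next < items.length) {
            (function (i) {
                active++;
                next++;
                task(items[i], function (err, result) {
                    active--;
                    if (finished) return;
                    if (err) { finished = true; return done(err); }
                    results[i] = result;
                    if (next >= items.length && active === 0) {
                        finished = true;
                        return done(null, results);
                    }
                    launch(); // a slot freed up - start the next task
                });
            })(next);
        }
    }
    launch();
}

// Usage: 100 fake "queries", at most 5 in flight at any moment.
var ids = [];
for (var i = 0; i < 100; i++) ids.push(i);

mapLimit(ids, 5, function (id, callback) {
    setImmediate(function () { callback(null, "OK " + id); });
}, function (err, results) {
    console.log(err || results.length); // prints 100
});
```

In the test rig above, swapping async.map for async.mapLimit with a sensible limit would have the same effect without hand-rolling anything.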

And here are some errors I got once; maybe they can help somehow:

{ MongoError: connection 4 to localhost:27017 timed out
    at Function.MongoError.create (E:\blablabla\node_modules\mongodb-core\lib\error.js:29:11)
    at Socket.<anonymous> (E:\blablabla\node_modules\mongodb-core\lib\connection\connection.js:198:20)
    at Socket.g (events.js:292:16)
    at emitNone (events.js:86:13)
    at Socket.emit (events.js:185:7)
    at Socket._onTimeout (net.js:338:8)
    at ontimeout (timers.js:386:11)
    at tryOnTimeout (timers.js:250:5)
    at Timer.listOnTimeout (timers.js:214:5)
  name: 'MongoError',
  message: 'connection 4 to localhost:27017 timed out' }
Something went wrong when retrieving an access token read ECONNRESET
{ Error: read ECONNRESET
    at exports._errnoException (util.js:1020:11)
    at TLSWrap.onread (net.js:568:26) code: 'ECONNRESET', errno: 'ECONNRESET', syscall: 'read' }
{ Error: read ECONNRESET
    at exports._errnoException (util.js:1020:11)
    at TLSWrap.onread (net.js:568:26) code: 'ECONNRESET', errno: 'ECONNRESET', syscall: 'read' }

Thanks in advance!


UPDATE 1

Updated my code to insert objects - now we can see that some random number of objects (60000-80000) is inserted and then everything goes silent.


UPDATE 2

Looked at the Express logs in real time. There is no incoming call at all, so Express seems to be out of the question. For now.

UPDATE 3

After raising the pool to 1000 connections, everything now works super quickly and finishes in time. But this is just a scaling solution, like "buy more RAM". I want to know why asynchronous use of the database prevents serving pages to the browser. There is a distinct connection between the MongoClient pool size and Node serving pages: when the mongo driver uses up its entire allocated pool, Node stops serving pages. Why? And how do I fight it?
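For reference, the pool-size experiment from this update is set at connection time. This is a configuration fragment that needs a running mongod; the flat `poolSize` option spelling is an assumption based on the 2.x driver that was current for Node 6 (the default pool size was 5 at the time):

```javascript
// Sketch: enlarging the driver's connection pool (mongodb driver 2.x).
// `poolSize: 1000` mirrors the experiment described in UPDATE 3;
// the option name is an assumption for that driver version.
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017/test', { poolSize: 1000 },
    function (err, db) {
        if (err) throw err;
        // share this `db` instance across the app instead of reconnecting
    });
```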

来源:https://stackoverflow.com/questions/46490800/why-does-my-node-js-app-get-hung-up-when-i-send-more-then-50000-queries-simult
