node-postgres with massive amount of queries

Backend · Open · 2 answers · 1663 views
粉色の甜心 2020-12-30 10:25

I just started playing around with node.js with postgres, using node-postgres. One of the things I tried to do is to write a short js to populate my database, using a file w

2 Answers
  •  青春惊慌失措
    2020-12-30 11:14

    I'm guessing that you are reaching the max pool size. Since client.query is asynchronous, probably all the available connections are used before they are returned.

    The default pool size is 10; see: https://github.com/brianc/node-postgres/blob/master/lib/defaults.js#L27

    You can increase default pool size by setting pg.defaults.poolSize:

    pg.defaults.poolSize = 20;
    

    Update: execute another query only after a previous one has finished and freed its slot.

    var pg = require('pg');
    var connectionString = "postgres://xxxx:xxxx@localhost/xxxx";
    var MAX_POOL_SIZE = 25;
    var TOTAL_ROWS = 1000000;
    var i = 0;        // next row to insert
    var pending = 0;  // inserts currently in flight

    pg.defaults.poolSize = MAX_POOL_SIZE;

    pg.connect(connectionString, function(err, client, done) {
      if (err) {
        return console.error('could not connect to postgres', err);
      }

      var insertQ = function() {
        var n = i++;
        pending++;
        // parameterized query instead of string concatenation
        client.query('INSERT INTO testDB VALUES ($1, $2, $3)',
                     [n, TOTAL_ROWS - n, -n],
                     function(err) {
          pending--;
          if (err) {
            return console.error('Error inserting query', err);
          }
          if (i < TOTAL_ROWS) {
            insertQ();          // start the next insert only after one finishes
          } else if (pending === 0) {
            done();             // release the client once every insert is done
          }
        });
      };

      client.query('DROP TABLE IF EXISTS testDB');
      client.query('CREATE TABLE testDB (id int, first int, second int)',
                   function(err) {
        if (err) {
          return console.error('Error creating table', err);
        }
        // keep at most MAX_POOL_SIZE inserts in flight at once
        for (var k = 0; k < MAX_POOL_SIZE; k++) {
          insertQ();
        }
      });
    });
    
    

    The basic idea: because you were enqueuing a very large number of queries against a relatively small connection pool, you hit the pool's maximum size. Here a new query is issued only after an existing one completes, so at most MAX_POOL_SIZE inserts are in flight at any time.
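    The same bounded-concurrency pattern can be sketched without a database. This is a generic sketch, not part of node-postgres: `runLimited` and its parameters are illustrative names. Each "worker" starts the next task only after its current one settles, mirroring the hand-off in the answer's code:

    ```javascript
    // Run async task functions with at most `limit` in flight at once.
    // Returns a promise for the array of results, in task order.
    function runLimited(taskFns, limit) {
      let next = 0;
      const results = new Array(taskFns.length);

      function worker() {
        if (next >= taskFns.length) return Promise.resolve();
        const idx = next++;
        return Promise.resolve(taskFns[idx]()).then(function (value) {
          results[idx] = value;
          return worker(); // pick up the next task only after this one finishes
        });
      }

      const workers = [];
      for (let i = 0; i < Math.min(limit, taskFns.length); i++) {
        workers.push(worker());
      }
      return Promise.all(workers).then(function () { return results; });
    }
    ```

    With `limit` set to the pool size, this never asks for more connections than the pool can hand out, which is exactly why the original code stopped stalling.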
