I am writing a node.js app on Heroku and using the pg module. I can't figure out the "right" way to get a client object for each request in which I need to query the database.
I'm the author of node-postgres. First, I apologize that the documentation has failed to make the right option clear: that's my fault. I'll try to improve it. I wrote a Gist just now to explain this because the conversation grew too long for Twitter.
Using pg.connect is the way to go in a web environment.

A PostgreSQL server can only handle one query at a time per connection. That means if you have a single global new pg.Client() connected to your backend, your entire app is bottlenecked by how fast Postgres can respond to queries. It literally lines everything up, queuing each query. Yeah, it's async and so that's alright... but wouldn't you rather multiply your throughput by 10x? Use pg.connect and set pg.defaults.poolSize to something sane (we do 25-100, not sure of the right number yet).
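For instance, here is a minimal sketch of pool configuration and a pooled query (the DATABASE_URL environment variable is just the usual Heroku convention, and the pool size is an arbitrary pick):

var pg = require('pg');

// Raise the pool size before the first pg.connect call picks it up.
pg.defaults.poolSize = 25;

// On Heroku the connection string is normally exposed as DATABASE_URL.
pg.connect(process.env.DATABASE_URL, function(err, client, done) {
  if (err) return console.error('error fetching client from pool', err);
  client.query('SELECT NOW()', function(err, result) {
    done(); // release the client back to the pool
    if (err) return console.error('error running query', err);
    console.log(result.rows[0]);
  });
});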
new pg.Client is for when you know what you're doing: when you need a single, long-lived client for some reason, or need to very carefully control the life-cycle. A good example of this is when using LISTEN/NOTIFY. The listening client needs to be around, connected, and not shared so it can properly handle NOTIFY messages. Another example would be opening up a one-off client to kill some hung queries, or in command-line scripts.
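As a rough sketch (the channel name and connection string are made up for illustration), a dedicated LISTEN client might look like this:

var pg = require('pg');

// One long-lived client, created directly and never shared with the pool,
// so it can sit on the connection and receive NOTIFY messages.
var client = new pg.Client(process.env.DATABASE_URL);

client.connect(function(err) {
  if (err) return console.error('could not connect', err);
  client.query('LISTEN job_added');
});

client.on('notification', function(msg) {
  console.log('received NOTIFY on', msg.channel, 'payload:', msg.payload);
});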
One very helpful thing is to centralize all of your app's database access into a single file. Don't litter pg.connect calls or new clients throughout your code. Have a file like db.js that looks something like this:
var pg = require('pg');

module.exports = {
  // All queries in the app go through here, so pooling stays in one place.
  query: function(text, values, cb) {
    pg.connect(function(err, client, done) {
      if (err) return cb(err);
      client.query(text, values, function(err, result) {
        done(); // return the client to the pool
        cb(err, result);
      });
    });
  }
};
This way you can change out your implementation from pg.connect to a custom pool of clients or whatever and only have to change things in one place.
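For example, a route handler could then just require that file and call query (Express and the users table here are only assumptions for the sake of the example):

var express = require('express');
var db = require('./db');

var app = express();

app.get('/users/:id', function(req, res) {
  db.query('SELECT * FROM users WHERE id = $1', [req.params.id], function(err, result) {
    if (err) return res.status(500).send('database error');
    res.json(result.rows);
  });
});

app.listen(process.env.PORT || 3000);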
Have a look at the node-pg-query module that does just this.