I have a Python script running as a daemon. At startup, it spawns 5 processes, each of which connects to a Postgres database. Now, in order to reduce the number of DB connections, I'd like to share a single connection across all of the processes. Is that possible, and if not, how should I structure this instead?
You can't sanely share a DB connection across processes like that. You can sort of share a connection between threads, but only if you ensure the connection is used by just one thread at a time. That won't work between processes because there is client-side state for the connection stored in the client's address space.
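For the threads case, here's a minimal sketch of what "one thread at a time" means in practice, assuming the psycopg2 driver (the DSN is a placeholder): a single shared connection guarded by a lock.

    import threading

    import psycopg2  # assumption: psycopg2 driver; the DSN below is hypothetical

    conn = psycopg2.connect("dbname=mydb user=myuser")
    conn_lock = threading.Lock()

    def run_query(sql, params=None):
        # Serialize all access: only one thread may touch the shared
        # connection (and its cursors) at a time.
        with conn_lock:
            with conn.cursor() as cur:
                cur.execute(sql, params)
                return cur.fetchall()

Even then, a connection per thread is usually simpler; the lock only makes the sharing safe, not fast.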
If you need large numbers of concurrent workers, but they're not using the DB all the time, you should have a group of database worker processes that handle all database access and exchange data with your other worker processes. Each database worker process has a DB connection. The other processes only talk to the database via your database workers.
Python's multiprocessing queues, pipes, etc. offer appropriate messaging features for that.
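Here's a minimal sketch of that layout, again assuming psycopg2 (the DSN and the SELECT are placeholders): two database workers own the only DB connections and serve five ordinary workers over multiprocessing queues.

    import multiprocessing as mp

    import psycopg2  # assumption: psycopg2 driver; the DSN below is hypothetical

    def db_worker(request_q, response_qs):
        # Each database worker owns exactly one connection, opened after
        # the process starts, so no connection state crosses a process
        # boundary.
        conn = psycopg2.connect("dbname=mydb user=myuser")
        while True:
            job = request_q.get()
            if job is None:          # sentinel: shut down
                break
            worker_id, sql, params = job
            with conn.cursor() as cur:
                cur.execute(sql, params)
                response_qs[worker_id].put(cur.fetchall())
        conn.close()

    def app_worker(worker_id, request_q, response_q):
        # Ordinary workers never touch the database directly; they send
        # a request message and wait for the reply on their own queue.
        request_q.put((worker_id, "SELECT %s::int", (worker_id,)))
        print("worker", worker_id, "got", response_q.get())

    if __name__ == "__main__":
        request_q = mp.Queue()
        response_qs = {i: mp.Queue() for i in range(5)}

        db_procs = [mp.Process(target=db_worker, args=(request_q, response_qs))
                    for _ in range(2)]
        app_procs = [mp.Process(target=app_worker, args=(i, request_q, response_qs[i]))
                     for i in range(5)]
        for p in db_procs + app_procs:
            p.start()
        for p in app_procs:
            p.join()
        for _ in db_procs:
            request_q.put(None)      # one sentinel per database worker
        for p in db_procs:
            p.join()

With this shape, the number of open connections equals the number of database workers, not the number of app workers, so adding more app workers doesn't add connections.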