psycopg2

Is it required to close a Psycopg2 connection at the end of a script?

寵の児 submitted on 2019-12-06 18:40:39
Question: What are the consequences of not closing a psycopg2 connection at the end of a Python script? For example, consider the following snippet:

```python
import psycopg2
psycopg2.connect("dbname=test")
```

The script opens a connection, but does not close it at the end. Is the connection still open at the end of the execution? If so, is there an issue with not closing the connection?

Answer 1: Normally, when your Python program exits, all the sockets it owns are closed and any open transactions are aborted. But it's good
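The usual defensive pattern is to close the connection explicitly, e.g. in try/finally. A minimal sketch: shown with the stdlib sqlite3 so it runs without a Postgres server, but the same DB-API shape applies to `psycopg2.connect("dbname=test")`; the function name and DSN are illustrative.

```python
import sqlite3

def fetch_one(dsn=":memory:"):
    # With psycopg2 this line would be: conn = psycopg2.connect("dbname=test")
    conn = sqlite3.connect(dsn)
    try:
        cur = conn.cursor()
        cur.execute("SELECT 1")
        return cur.fetchone()[0]
    finally:
        conn.close()  # the connection is released even if the query raises

print(fetch_one())
```

Closing in `finally` guarantees the server-side session is released immediately, instead of relying on process exit or garbage collection.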

Python - Dynamic variable in Redshift SQL Query

蓝咒 submitted on 2019-12-06 15:08:12
I'm developing in a Python environment and I want to run an SQL query using psycopg2. Let's say I have the following UNLOAD command in an .sql file:

```sql
UNLOAD ('Some SQL Query') TO 's3://%PATH%' ...
```

In the SQL file, %PATH% should be explicit, like 'folder1/folder3/file_name'. But I want the Python program to set this %PATH% at runtime; that is, the .sql file contains something like %PATH%, which is set only at runtime. Any idea how to do it?

You simply specify a replacement field in your SQL file, and then use a format command. Create your file like this:

```sql
UNLOAD ('Some SQL
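A minimal sketch of the runtime substitution: the template text and S3 key below are illustrative assumptions, and the %PATH% marker is replaced with `str.replace` (a `{path}`-style field with `str.format` works the same way).

```python
# Illustrative UNLOAD template; in practice this would be read from the .sql file.
TEMPLATE = "UNLOAD ('SELECT * FROM my_table') TO 's3://%PATH%' ..."

def render_unload(template, path):
    # Substitute the %PATH% placeholder with the concrete S3 key at runtime.
    return template.replace("%PATH%", path)

sql = render_unload(TEMPLATE, "folder1/folder3/file_name")
print(sql)
```

The rendered string is then passed to `cursor.execute`. Note this is plain text templating of identifiers/paths, not value binding; query *values* should still go through psycopg2's parameter placeholders.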

Bulk update PostgreSQL from Python dict

南笙酒味 submitted on 2019-12-06 13:40:24
In an existing PostgreSQL table, I would like to UPDATE several existing columns with values from a dictionary lookup (see the dict below), somewhat as described in this nice blog post. However, I can't figure out how to do that with a Python dictionary. Here comes the terrible pseudo-code:

```python
d = {10: 'chair', 11: 'table', 12: 'lamp',
     20: 'english ivy', 21: 'peace lily', 22: 'spider plant'}
curs.execute("""
    UPDATE my_table t
    SET furniture = %(t.furniture)s,
    SET plant = %(t.plant)s""", d)
```

The original table would look somewhat like this:

```
gid | furniture | plant
-----------------------
  0 |        10 |    21
  1 |        11
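One way to do the lookup-based bulk update (my own sketch, not from the original thread) is one `executemany` batch per column, mapping each code to its label. Shown here with the stdlib sqlite3 so it runs without a server; with psycopg2 the same statements work with `%s` placeholders, and `psycopg2.extras.execute_values` is worth considering for large dicts. Table and column names follow the asker's example.

```python
import sqlite3

d = {10: 'chair', 11: 'table', 12: 'lamp',
     20: 'english ivy', 21: 'peace lily', 22: 'spider plant'}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (gid INTEGER, furniture, plant)")
conn.executemany("INSERT INTO my_table VALUES (?, ?, ?)",
                 [(0, 10, 21), (1, 11, 20)])

# Replace each integer code with its label, one batch per column.
pairs = [(label, code) for code, label in d.items()]
conn.executemany("UPDATE my_table SET furniture = ? WHERE furniture = ?", pairs)
conn.executemany("UPDATE my_table SET plant = ? WHERE plant = ?", pairs)

rows = conn.execute(
    "SELECT gid, furniture, plant FROM my_table ORDER BY gid").fetchall()
print(rows)
```

Each batch only touches rows whose current value matches a code, so the furniture codes (10-12) and plant codes (20-22) never collide.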

psycopg2: Will PostgreSQL store a copy of a table on disk if it has run out of memory

柔情痞子 submitted on 2019-12-06 08:33:07
I am running the following query on 489 million rows (102 GB) on a computer with 2 GB of memory:

```sql
select * from table order by x, y, z, h, j, l;
```

I am using psycopg2 with a server-side cursor ("cursor_unique_name") and fetch 30000 rows at a time. Obviously the result of the query cannot stay in memory, but my question is whether the following set of queries would be just as fast:

```sql
select * into temp_table from table order by x, y, z, h, j, l;
select * from temp_table;
```

This means that I would use a temp_table to store the ordered result and fetch data from that table instead. The reason for asking this
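The batched-fetch pattern the asker describes looks roughly like this. With psycopg2, passing a name to `conn.cursor("cursor_unique_name")` creates the server-side cursor so rows stream from the server in chunks; the `fetchmany` loop below is the generic DB-API part, demonstrated with the stdlib sqlite3 (and 100 toy rows in batches of 30) so it runs standalone.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(100)])

# With psycopg2 this would be a named (server-side) cursor:
#   cur = conn.cursor("cursor_unique_name")
cur = conn.cursor()
cur.execute("SELECT x FROM t ORDER BY x")

batches = 0
total = 0
while True:
    rows = cur.fetchmany(30)  # the asker fetches 30000 at a time
    if not rows:
        break
    batches += 1
    total += len(rows)
print(batches, total)
```

Only one batch is held in client memory at a time, which is what makes the 102 GB result workable on a 2 GB machine.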

Installing psycopg2 in virtualenv (Ubuntu 10.04, Python 2.5)

前提是你 submitted on 2019-12-06 06:51:37
Question: I had problems installing psycopg2 in a virtualenv. I tried the different things explained here: http://www.saltycrane.com/blog/2009/07/using-psycopg2-virtualenv-ubuntu-jaunty/ The last thing I tried was this:

1. I created a virtualenv with `-p python2.5 --no-site-packages`
2. I installed libpq-dev: `apt-get install libpq-dev`
3. In the virtualenv, I ran: `easy_install -i http://downloads.egenix.com/python/index/ucs4/ egenix-mx-base`

Then when I tried `pip install psycopg2==2.0.7`, I got this error:

Connection Error while connecting to PostgreSQL as postgres user?

送分小仙女□ submitted on 2019-12-06 06:25:20
I am not able to connect to PostgreSQL remotely using Python and psycopg2. Here is my code:

```python
>>> import psycopg2
>>> conn_string = "host='localhost' dbname='mydb' user='postgres'"
>>> print "Connecting to database\n ->%s" % (conn_string)
Connecting to database
 ->host='localhost' dbname='mydb' user='postgres'
>>> conn = psycopg2.connect(conn_string)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/tools/lib/python2.7/site-packages/psycopg2/__init__.py", line 164, in connect
    conn = _connect(dsn, connection_factory=connection_factory, async=async)
psycopg2

Convert Python list of dicts into Postgresql array of json

佐手、 submitted on 2019-12-06 06:10:51
I am trying to insert a Python (2.7) list of jsonb elements into a PostgreSQL (9.4) table with a column of datatype jsonb[]. Here's some code:

```python
import json

anArray = [{"name": "Joe", "age": 51, "yob": 1964, "gender": "male"},
           {"name": "George", "age": 41, "dob": 1974, "gender": "male"},
           {"name": "Nick", "age": 31, "dob": 1984, "gender": "male"}]
myArray = []

# here's what I have so far:
for e in anArray:
    myArray.append(json.dumps(e))

# this gives me:
# myArray = ['{"name":"Joe","age":51,"yob":1964,"gender":"male"}','{"name":"George","age":41,"dob":1974,"gender":"male"}','{"name":"Nick","age":31,"dob":1984,"gender":"male"}'
```
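The serialization side can be checked standalone; whether the target column holds one jsonb value or a jsonb[] array changes how it is bound. A sketch of both shapes (the psycopg2 calls in the comments show the usual `%s::jsonb` binding pattern as an assumption; table and column names are hypothetical):

```python
import json

an_array = [{"name": "Joe", "age": 51},
            {"name": "George", "age": 41},
            {"name": "Nick", "age": 31}]

# Shape 1: one jsonb value holding the whole list.
payload = json.dumps(an_array)
# With psycopg2 (assumption):
#   cur.execute("INSERT INTO people (data) VALUES (%s::jsonb)", (payload,))

# Shape 2: a jsonb[] column, with each element serialized separately.
elements = [json.dumps(e) for e in an_array]

print(len(elements), json.loads(payload)[0]["name"])
```

Round-tripping through `json.loads` is a quick sanity check that each element is valid JSON before it ever reaches the database.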

Psycopg / Postgres : Connections hang out randomly

断了今生、忘了曾经 submitted on 2019-12-06 05:34:52
I'm using psycopg2 for the CherryPy app I'm currently working on, and the CLI and phpPgAdmin to handle some operations manually. Here's the Python code:

```python
# One connection per thread
cherrypy.thread_data.pgconn = psycopg2.connect("...")
...
# Later, an object is created by a thread:
class dbobj(object):
    def __init__(self):
        self.connection = cherrypy.thread_data.pgconn
        self.curs = self.connection.cursor(cursor_factory=psycopg2.extras.DictCursor)
...
# Then,
try:
    blabla
    self.curs.execute(...)
    self.connection.commit()
except:
    self.connection.rollback()
    lalala
...
# Finally, the destructor is called:
def __del__
```
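A destructor is an unreliable place for cleanup, since `__del__` may run late or not at all; the commit/rollback pairing the snippet sketches is usually wrapped in a context manager instead. A minimal version of that pattern (my own sketch, shown with the stdlib sqlite3 so it runs without a server; the same shape works with a psycopg2 connection):

```python
import sqlite3
from contextlib import contextmanager

@contextmanager
def transaction(conn):
    # Commit on success, roll back on any error, always close the cursor.
    cur = conn.cursor()
    try:
        yield cur
        conn.commit()
    except Exception:
        conn.rollback()
        raise
    finally:
        cur.close()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (msg TEXT)")

with transaction(conn) as cur:
    cur.execute("INSERT INTO log VALUES ('ok')")

try:
    with transaction(conn) as cur:
        cur.execute("INSERT INTO log VALUES ('doomed')")
        raise RuntimeError("boom")
except RuntimeError:
    pass  # the 'doomed' row was rolled back

print(conn.execute("SELECT msg FROM log").fetchall())
```

Because cleanup happens deterministically when the `with` block exits, no connection is left holding an open transaction, which is a common cause of connections that appear to hang.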

Postgres - python multiple SSL connections

风格不统一 submitted on 2019-12-06 05:34:31
Question: I have trouble establishing two concurrent Postgres database connections (one to the master, one to the slave) using psycopg2 and SSL. Separately, both connections work, i.e.:

```python
import psycopg2
dsnMaster = 'dbname=... sslcert=path/to/master/cert'
psycopg2.connect(dsnMaster, connection_factory=None, async=False)
```

works, and so does

```python
import psycopg2
dsnSlave = 'dbname=... sslcert=path/to/slave/cert'
psycopg2.connect(dsnSlave, connection_factory=None, async=False)
```

But joining both

```python
import psycopg2
dsnMaster = 'dbname=

Django can't drop database: psycopg2.OperationalError: cannot drop the currently open database

大城市里の小女人 submitted on 2019-12-06 02:06:16
Question: Whenever I try to run my Django tests via manage.py, the tests run fine; however, at the end, when Django is destroying the database, the following error occurs:

```
Destroying test database for alias 'default'...
Traceback (most recent call last):
  File "/Users/dcgoss/Desktop/Pickle/PickleBackend/venv/lib/python3.4/site-packages/django/db/backends/utils.py", line 62, in execute
    return self.cursor.execute(sql)
psycopg2.OperationalError: cannot drop the currently open database
The above exception was