psycopg2

Python - Dynamic variable in Redshift SQL Query

Submitted by 冷眼眸甩不掉的悲伤 on 2019-12-08 07:52:35

Question: I'm developing in a Python environment and I want to run a SQL query using psycopg2. Let's say I have the following UNLOAD command in an .sql file: UNLOAD ( ' Some SQL Query ' ) TO 's3://%PATH%' ... In the .sql file the %PATH% should be explicit, like 'folder1/folder3/file_name' . But I want the Python program to set this %PATH% at runtime; that is, the .sql file contains something like %PATH% that gets filled in only at runtime. Any idea how to do it? Answer 1: You simply specify a
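A minimal sketch of what the (truncated) answer appears to suggest: read the template, substitute %PATH% at runtime, then hand the finished statement to psycopg2. The query text and path below are illustrative stand-ins.

```python
# Template as it would be read from the .sql file; the UNLOAD body is a
# stand-in for the real query.
sql_template = "UNLOAD ('Some SQL Query') TO 's3://%PATH%'"

def render_unload(template, path):
    # Plain string replacement; the placeholder is set by the program,
    # not by end users, so SQL parameter binding is not needed for it.
    return template.replace("%PATH%", path)

sql = render_unload(sql_template, "folder1/folder3/file_name")
# cursor.execute(sql)  # with a real psycopg2 cursor/connection
```

If the path ever came from user input, it would be safer to validate it or use psycopg2's parameter binding where Redshift allows it.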

Psycopg2 & Flask - tying connection to before_request & teardown_appcontext

Submitted by 大憨熊 on 2019-12-08 06:38:10

Question: Cheers guys, refactoring my Flask app I got stuck at tying the db connection to @app.before_request and closing it at @app.teardown_appcontext . I am using plain psycopg2 and the app factory pattern. First I created a function to call within the app factory so I could use @app, as suggested by Miguel Grinberg here: def create_app(test_config=None): app = Flask(__name__, instance_relative_config=True) -- from shop.db import connect_and_close_db connect_and_close_db(app) -- return app Then I
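One hedged sketch of how such a `connect_and_close_db` helper is commonly wired up: open a connection in a `before_request` handler, store it on `g`, and close it in `teardown_appcontext`. The DSN config key and the `g` attribute name are assumptions, not taken from the question.

```python
# Sketch of tying a psycopg2 connection to the request lifecycle with the
# app factory pattern. Imports are kept local so the module shape can be
# shown without Flask/psycopg2 installed; in a real module they go at the
# top of the file.
def connect_and_close_db(app):
    from flask import g
    import psycopg2

    @app.before_request
    def open_connection():
        # One connection per request, stored on the application context.
        g.db_conn = psycopg2.connect(app.config["DATABASE_DSN"])

    @app.teardown_appcontext
    def close_connection(exc):
        # Runs even when the request raised; pop so a missing connection
        # (e.g. before_request never ran) is not an error.
        conn = g.pop("db_conn", None)
        if conn is not None:
            conn.close()

def create_app(test_config=None):
    from flask import Flask
    app = Flask(__name__, instance_relative_config=True)
    connect_and_close_db(app)
    return app
```

Note that `teardown_appcontext` fires at the end of the application context, which for a normal request is after the response is sent.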

Inserting rows into db from a list of tuples using cursor.mogrify gives error

Submitted by 核能气质少年 on 2019-12-08 06:19:03

Question: I am trying to insert a large number of rows into Postgres with cursor.mogrify, following this psycopg2 recipe: insert multiple rows with one query. data is a list of tuples, where each tuple is a row to insert. cursor = conn.cursor() args_str = ','.join(cursor.mogrify("(%s,%s,%s,%s,%s,%s,%s,%s)", x) for x in data) cursor.execute( "insert into table1 (n, p, r, c, date, p1, a, id) values " + args_str) but I get this error: TypeError: sequence item 0: expected str instance, bytes found at line:
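The usual cause: on Python 3, `cursor.mogrify()` returns `bytes`, while `','.join(...)` here is called on a `str`. Decoding each fragment fixes the `TypeError`. Sketched below with stand-in byte strings in place of real mogrify output, so it runs without a database.

```python
# Stand-ins for what cursor.mogrify("(%s,%s)", x) would return (bytes).
fragments = [b"(1,'a')", b"(2,'b')"]

# Decode each fragment before joining with a str separator.
args_str = ",".join(f.decode("utf-8") for f in fragments)

# With a real cursor the pattern becomes:
# args_str = ",".join(cursor.mogrify("(%s,%s,%s,%s,%s,%s,%s,%s)", x).decode("utf-8")
#                     for x in data)
# cursor.execute("insert into table1 (n, p, r, c, date, p1, a, id) values " + args_str)
```

A cleaner alternative on recent psycopg2 is `psycopg2.extras.execute_values(cursor, "insert into table1 (...) values %s", data)`, which handles the batching itself.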

Psycopg / Postgres : Connections hang out randomly

Submitted by 不羁岁月 on 2019-12-08 02:41:39

Question: I'm using psycopg2 for the CherryPy app I'm currently working on, and the cli & phpPgAdmin to handle some operations manually. Here's the Python code: #One connection per thread cherrypy.thread_data.pgconn = psycopg2.connect("...") ... #Later, an object is created by a thread : class dbobj(object): def __init__(self): self.connection=cherrypy.thread_data.pgconn self.curs=self.connection.cursor(cursor_factory=psycopg2.extras.DictCursor) ... #Then, try: blabla self.curs.execute(...) self.connection
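The question's one-connection-per-thread pattern (via `cherrypy.thread_data`) can be expressed generically with the stdlib's `threading.local`. The sketch below uses a stub `connect()` so the wiring can run without a database; with psycopg2 you would call `psycopg2.connect(dsn)` there instead.

```python
import threading

_local = threading.local()

def connect():
    # Stand-in for psycopg2.connect("..."); returns a unique object so we
    # can observe that each thread got its own "connection".
    return object()

def get_connection():
    # Lazily create one connection per thread and cache it.
    if getattr(_local, "conn", None) is None:
        _local.conn = connect()
    return _local.conn

seen = {}

def worker(name):
    seen[name] = get_connection()

threads = [threading.Thread(target=worker, args=(i,)) for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

When connections hang or leak under this scheme, a `psycopg2.pool.ThreadedConnectionPool` (get a connection per operation, put it back in a `finally`) is often the more robust design, since stale per-thread connections are never reclaimed.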

Bulk update PostgreSQL from Python dict

Submitted by 自作多情 on 2019-12-08 02:20:02

Question: In an existing PostgreSQL table, I would like to UPDATE several existing columns with values from a dictionary lookup (see the dict below), somewhat as described in this nice blog post. However, I can't figure out how to do that with a Python dictionary. Here comes the terrible pseudo-code: d = {10:'chair', 11:'table', 12:'lamp', 20:'english ivy', 21:'peace lily', 22:'spider plant'} curs.execute(""" UPDATE my_table t SET furniture = %(t.furniture)s, SET plant = %(t.plant)s""", d) The original
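A common shape for dict-driven bulk updates (not necessarily the asker's exact schema, which is only partly visible): turn the dict into rows and join the table against a `VALUES` list. Only the SQL and parameters are built here, so the sketch needs no database; the table and column names are illustrative.

```python
# Dict of id -> new value, as in the question (furniture subset only).
d = {10: 'chair', 11: 'table', 12: 'lamp'}

rows = list(d.items())  # [(10, 'chair'), (11, 'table'), (12, 'lamp')]

sql = (
    "UPDATE my_table AS t SET furniture = v.furniture "
    "FROM (VALUES %s) AS v(id, furniture) "
    "WHERE t.id = v.id"
)

# With a real connection, psycopg2 expands the %s into the rows:
# from psycopg2.extras import execute_values
# execute_values(curs, sql, rows)
```

Updating two columns (furniture and plant) would mean either two such statements or a three-column `VALUES` list with both values per id.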

Connection Error while connecting to PostgreSQL as postgres user?

Submitted by 牧云@^-^@ on 2019-12-08 02:05:06

Question: I am not able to connect to PostgreSQL remotely using Python and psycopg2. Here is my code: >>> import psycopg2 >>> conn_string = "host='localhost' dbname='mydb' user='postgres'" >>> print "Connecting to database\n ->%s" % (conn_string) Connecting to database ->host='localhost' dbname='mydb' user='postgres' >>> conn = psycopg2.connect(conn_string) Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/home/tools/lib/python2.7/site-packages/psycopg2/__init__.py", line 164,
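Failures like this usually come from the server side (`pg_hba.conf` rules or `listen_addresses` in `postgresql.conf`) rather than from the DSN itself, but a quick sanity check of the keyword=value string can at least rule out typos. A minimal parser, ignoring libpq's quoting and escaping edge cases:

```python
def parse_dsn(dsn):
    # Split "key='value'" pairs on whitespace; strip surrounding quotes.
    # This is a diagnostic sketch, not a full libpq DSN parser.
    params = {}
    for part in dsn.split():
        key, _, value = part.partition("=")
        params[key] = value.strip("'")
    return params

conn_info = parse_dsn("host='localhost' dbname='mydb' user='postgres'")
```

Note that `host='localhost'` is not a remote connection at all; for remote access the host must be the server's address, and the server must both listen on it and allow the client in `pg_hba.conf`.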

psycopg2: Will PostgreSQL store a copy of a table on disk if it has run out of memory

Submitted by 烈酒焚心 on 2019-12-08 01:04:23

Question: I am running the following query on 489 million rows (102 GB) on a computer with 2 GB of memory: select * from table order by x, y, z, h, j, l; I am using psycopg2 with a server cursor ("cursor_unique_name") and fetch 30000 rows at a time. Obviously the result of the query cannot stay in memory, but my question is whether the following set of queries would be just as fast: select * into temp_table from table order by x, y, z, h, j, l; select * from temp_table This means that I would use a
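The chunked-fetch half of the question can be sketched driver-agnostically. With psycopg2, a server-side cursor is created as `conn.cursor(name="cursor_unique_name")`, which keeps the sorted result set on the server; the generator below then drains any cursor-like object in fixed-size batches. A stub cursor stands in for a real one so the sketch runs without PostgreSQL.

```python
def iter_chunks(cursor, size=30000):
    # Yield lists of up to `size` rows until the result set is exhausted.
    while True:
        rows = cursor.fetchmany(size)
        if not rows:
            break
        yield rows

class StubCursor:
    # Mimics fetchmany() over a small in-memory result set.
    def __init__(self, rows):
        self._rows = list(rows)

    def fetchmany(self, size):
        out, self._rows = self._rows[:size], self._rows[size:]
        return out

chunks = list(iter_chunks(StubCursor(range(7)), size=3))
```

As for the temp-table variant: both plans must perform the same big external sort, but `select into` additionally writes the full 102 GB result to disk before it can be read back, so it is unlikely to be faster than streaming from the named cursor.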

Convert Python list of dicts into Postgresql array of json

Submitted by 谁说我不能喝 on 2019-12-08 00:53:22

Question: I am trying to insert a Python (2.7) list of jsonb elements into a PostgreSQL (9.4) table with a column of datatype jsonb[]. Here's some code: import json anArray = [{"name":"Joe","age":51,"yob":1964,"gender":"male"},{"name":"George","age":41,"dob":1974,"gender":"male"},{"name":"Nick","age":31,"dob":1984,"gender":"male"}] myArray = [] #here's what I have so far: for e in anArray: myArray.append(json.dumps(e)) #this gives me myArray = ['{"name":"Joe","age":51,"yob":1964,"gender":"male"}','{
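Building on the asker's own `json.dumps` loop, one common approach is to let psycopg2 adapt the Python list to a text array and cast it to `jsonb[]` in the SQL. The table and column names and the `::jsonb[]` cast are illustrative assumptions; only the serialization step is exercised here.

```python
import json

anArray = [
    {"name": "Joe", "age": 51, "yob": 1964, "gender": "male"},
    {"name": "George", "age": 41, "dob": 1974, "gender": "male"},
    {"name": "Nick", "age": 31, "dob": 1984, "gender": "male"},
]

# Serialize each dict to a JSON string, as in the question.
myArray = [json.dumps(e) for e in anArray]

# With a real cursor, psycopg2 adapts the list to an array literal and
# the cast turns it into jsonb[]:
# cursor.execute("INSERT INTO people (profiles) VALUES (%s::jsonb[])",
#                (myArray,))
```

If a plain `jsonb` column holding a JSON array would also fit the schema, `json.dumps(anArray)` as a single value is usually simpler to work with than `jsonb[]`.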

Unable to connect with psycopg2, but can via command line

Submitted by 戏子无情 on 2019-12-07 23:59:51

Question: I'm attempting to connect to a remote Postgres database (in this case a Heroku Postgres instance). I have a Fabric command that does some work against the DB using psycopg2 . My Postgres connection occurs like so: # conf is a hash derived from the heroku config DATABASE_URL self.conn = psycopg2.connect( database=conf.get('database'), # supplied by heroku user=conf.get('username'), # is the database username supplied by heroku password=conf.get('password'), # supplied by heroku host=conf.get(
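Heroku exposes a single `DATABASE_URL` rather than separate fields, and the stdlib can split it into the pieces `psycopg2.connect()` wants. The URL below is a made-up stand-in with the same shape as a Heroku one:

```python
from urllib.parse import urlparse

# Stand-in for the value of the DATABASE_URL config var.
url = urlparse(
    "postgres://user123:passwd@ec2-1-2-3-4.compute-1.amazonaws.com:5432/dbname"
)

conn_kwargs = dict(
    database=url.path.lstrip("/"),  # path component minus leading slash
    user=url.username,
    password=url.password,
    host=url.hostname,
    port=url.port,
)

# Heroku Postgres typically requires SSL from outside its network:
# conn = psycopg2.connect(sslmode="require", **conn_kwargs)
```

A command-line `psql` connection succeeding while psycopg2 fails often comes down to exactly this SSL requirement or to connecting to a different host than expected.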

Error when importing CSV to postgres with python and psycopg2

Submitted by 偶尔善良 on 2019-12-07 17:27:52

Question: I am trying to COPY a CSV file from a folder into a Postgres table using Python and psycopg2, and I get the following error: Traceback (most recent call last): File "<stdin>", line 1, in <module> psycopg2.ProgrammingError: must be superuser to COPY to or from a file HINT: Anyone can COPY to stdout or from stdin. psql's \copy command also works for anyone. I also tried to run it through the python environment as: constr = "dbname='db_name' user='user' host='localhost' password='pass'" conn = psycopg2
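The hint in the error points at the fix: `COPY ... FROM '/path'` makes the *server* open the file, which requires superuser, while `COPY ... FROM STDIN` streams the file from the client and works for any role. psycopg2 exposes the latter as `cursor.copy_expert` (or the simpler `cursor.copy_from`). The table name and CSV options below are illustrative; for untrusted table names, `psycopg2.sql.Identifier` should be used instead of string formatting.

```python
def load_csv(conn, path, table):
    # Stream a local CSV file to the server via COPY ... FROM STDIN.
    with conn.cursor() as cur, open(path) as f:
        cur.copy_expert(
            "COPY {} FROM STDIN WITH (FORMAT csv, HEADER true)".format(table),
            f,
        )
    conn.commit()

# Usage with a real connection:
# conn = psycopg2.connect("dbname='db_name' user='user' host='localhost' password='pass'")
# load_csv(conn, "/path/to/data.csv", "my_table")
```
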