psycopg2

psql - how to flush database content without dropping table

Submitted by 纵然是瞬间 on 2019-12-10 13:46:37

Question: I have a table in my db called 'mytable'. I'd like to clear it so that I can continue to collect and analyze 'fresh data' from it. Something like

    conn = psycopg2.connect(database=mydb_name, host=mydb_server,
                            user=mydb_uname, password=mydb_pwd)
    cur = conn.cursor()
    cur.execute("DROP TABLE mytable;")

isn't going to work for me because, as far as I understand it, this destroys the table. I don't want to destroy and re-create the table, just flush all of its data. How can I work this out?

Answer 1: Truncate …
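The truncated answer points at TRUNCATE. A minimal sketch of that approach (the connection handling is omitted; the table name mytable comes from the question, and the helper name is an invention):

```python
def flush_table(conn, table="mytable"):
    """Empty a table without dropping it.

    TRUNCATE removes every row but keeps the table's definition,
    indexes, and permissions intact, unlike DROP TABLE.  The table
    name cannot be passed as a %s parameter, so it is interpolated
    here and must come from a trusted source.
    """
    with conn.cursor() as cur:
        cur.execute("TRUNCATE TABLE {};".format(table))
    conn.commit()  # TRUNCATE is transactional in PostgreSQL
```

With a real connection this would be called as `flush_table(psycopg2.connect(...))`, after which fresh data can be collected into the same table.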

Use psycopg2 to construct queries without connection

Submitted by 我们两清 on 2019-12-10 13:17:40

Question: There are several occasions where I want to collect data while in the field, in situations where I do not always have access to my postgres database. To keep things in sync, it would be excellent if I could use psycopg2 functions offline to generate queries that can be held back and, once I am able to connect to the database, process everything that was held back. One thing I am currently struggling with is that the psycopg2 cursor requires a connection to be constructed. My question is: Is …
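One workaround, sketched here as an assumption rather than a psycopg2 feature: psycopg2 only binds parameters when execute() runs, so plain (sql, params) pairs can be queued while offline and replayed once a connection exists. The OfflineQueue class and the readings table are invented for illustration:

```python
class OfflineQueue:
    """Collect parameterized statements while offline; replay them later."""

    def __init__(self):
        self.pending = []  # list of (sql, params) tuples

    def add(self, sql, params=()):
        # No connection needed here: the statement and its parameters
        # are stored as plain Python objects.
        self.pending.append((sql, params))

    def flush(self, conn):
        # Replay everything in one transaction once connected;
        # psycopg2 does the quoting at this point.
        with conn.cursor() as cur:
            for sql, params in self.pending:
                cur.execute(sql, params)
        conn.commit()
        self.pending.clear()

queue = OfflineQueue()
queue.add("INSERT INTO readings (site, value) VALUES (%s, %s)", ("A", 1.5))
```

The trade-off is that client-side quoting (mogrify) genuinely requires a connection, because quoting rules depend on the server's encoding; deferring the binding sidesteps that.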

Which Postgres value should I use in Django's DATABASE_ENGINE?

Submitted by 一曲冷凌霜 on 2019-12-10 12:28:21

Question: It's my first time using PostgreSQL 8.4.2 with Django (I have always used MySQL or sqlite3 in the past). Which value should I use for DATABASE_ENGINE in settings.py, postgresql_psycopg2 or postgresql? How do they differ from each other?

Answer 1: Update for Django 1.9: the django.db.backends.postgresql_psycopg2 backend has been renamed to django.db.backends.postgresql in Django 1.9. (The psycopg2 name can still be used for backwards compatibility.) Essentially, for Django ≥ 1.9, use django.db …
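In modern Django the standalone DATABASE_ENGINE setting is gone entirely; the engine lives inside the DATABASES dict. A sketch of the relevant part of settings.py (credentials are placeholders):

```python
# settings.py -- Django >= 1.9. Older 1.x versions spell the engine
# 'django.db.backends.postgresql_psycopg2', which still works as an alias.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "mydb",
        "USER": "myuser",
        "PASSWORD": "secret",
        "HOST": "localhost",
        "PORT": "5432",
    }
}
```

Both spellings use psycopg2 underneath; only the dotted path changed.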

python + psycopg2 = unknown types?

Submitted by ≡放荡痞女 on 2019-12-10 10:15:35

Question: It seems that when I use callproc(), psycopg2 isn't properly casting strings as text or character varying. For example:

    values = [pid, 4, 4, 'bureau ama', 0, 130, row['report_dte'],
              row['report_dte'], 1, 1, 1, None, None, 'published',
              row['report_dte']]
    cur.callproc('header', values)

yields:

    psycopg2.ProgrammingError: function header(integer, integer, integer,
    unknown, integer, integer, unknown, unknown, integer, integer, integer,
    unknown, unknown, unknown, unknown) does not exist
    LINE 1: SELECT * …
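The "unknown" entries in the error line up with the str and None values: psycopg2 sends them untyped, so PostgreSQL cannot resolve the function's signature. One possible workaround (a sketch; the header function is from the question, but the choice to cast everything to text is an assumption about its actual parameter types) is to skip callproc() and add explicit casts:

```python
def call_with_text_casts(cur, func_name, values):
    # Build 'SELECT * FROM func(%s, %s::text, ...)', casting every str
    # or None argument to text so the server can resolve the overload.
    # func_name is interpolated, so it must come from trusted code.
    placeholders = ", ".join(
        "%s::text" if v is None or isinstance(v, str) else "%s"
        for v in values
    )
    cur.execute(
        "SELECT * FROM {}({})".format(func_name, placeholders), list(values)
    )

# e.g. call_with_text_casts(cur, 'header', values)
```

If some parameters are dates rather than text, the same idea applies with ::date or ::timestamp casts in the right positions.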

How can I use server-side cursors with django and psycopg2?

Submitted by 别说谁变了你拦得住时间么 on 2019-12-10 03:21:06

Question: I'm trying to use a server-side cursor in psycopg2 as detailed in this blog post. In essence, this is achieved with:

    from django.db import connection

    if connection.connection is None:
        # This is required to populate the connection object properly
        cursor = connection.cursor()
    cursor = connection.connection.cursor(name='gigantic_cursor')

When I execute the query

    cursor.execute('SELECT * FROM %s WHERE foreign_id=%s' % (table_name, id))

I get a ProgrammingError: psycopg2.ProgrammingError: can't …
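One common cause of errors with that execute() call is mixing identifiers and values into a single Python %-format: the table name cannot be a bound parameter, but the id value should be one. A sketch of the combined pattern (stream_rows and the trusted-table assumption are mine; under Django, pg_conn below is connection.connection once connection.cursor() has been called at least once):

```python
def stream_rows(pg_conn, table_name, foreign_id, batch=1000):
    # A named cursor is server-side: rows are streamed in batches
    # instead of being loaded into client memory all at once.
    cursor = pg_conn.cursor(name="gigantic_cursor")
    # Identifier interpolated (assumed trusted); value bound with %s.
    cursor.execute(
        "SELECT * FROM {} WHERE foreign_id = %s".format(table_name),
        (foreign_id,),
    )
    while True:
        rows = cursor.fetchmany(batch)
        if not rows:
            break
        for row in rows:
            yield row
```

Note that a named cursor lives inside a transaction, so it must be consumed before the connection commits or rolls back.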

pgbouncer - closing because: unclean server on every connection

Submitted by 纵然是瞬间 on 2019-12-09 15:13:53

Question: I'm running Django 1.3 with PostgreSQL 9.1/PostGIS 1.5, psycopg2 2.4.2 and pgbouncer 1.4.2. On every single connection to the database I get a log entry in pgbouncer.log:

    2011-11-20 02:15:25.027 29538 LOG S-0x96c2200: app_db/postgres@192.168.171.185:5432 closing because: unclean server (age=0).

I can't find any solution to this problem; does anybody have an idea why? I've tried reconfiguring pgbouncer (session/transaction mode, different timeouts, etc.), but to no avail.

Answer 1: Ok, I think I've …
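An "unclean server" close generally means pgbouncer decided the server connection was handed back with leftover session state and threw it away rather than reuse it. One common mitigation, sketched below as a pgbouncer.ini fragment (whether it cures this particular Django 1.3 setup is not guaranteed; the truncated answer above may describe a different fix):

```ini
[pgbouncer]
; session pooling is the safest mode for ORMs that assume they own
; one connection per session
pool_mode = session
; run DISCARD ALL whenever a server connection returns to the pool,
; clearing temp tables, GUC settings and prepared statements so the
; connection counts as clean
server_reset_query = DISCARD ALL
```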

postgresql: out of shared memory?

Submitted by 橙三吉。 on 2019-12-09 00:50:26

Question: I'm running a bunch of queries using Python and psycopg2. I create one large temporary table with about 2 million rows, then I fetch 1000 rows at a time from it using cur.fetchmany(1000) and run more extensive queries involving those rows. The extensive queries are self-sufficient, though: once they are done, I don't need their results anymore when I move on to the next 1000. However, about 1,000,000 rows in, I got an exception from psycopg2:

    psycopg2.OperationalError: out of shared memory
    HINT …
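The HINT behind this error is typically max_locks_per_transaction: doing all two million rows' worth of work inside one transaction lets locks accumulate until the lock table overflows. One possible way out (a sketch only; big_temp and heavy_work are invented stand-ins for the question's temp table and "extensive queries") is to declare the big cursor WITH HOLD so it survives commits, then commit after each batch to release that batch's locks:

```python
def process_in_batches(conn, batch=1000):
    cur = conn.cursor()
    # WITH HOLD materializes the cursor at commit time, so it remains
    # usable across the per-batch commits below.
    cur.execute("DECLARE big CURSOR WITH HOLD FOR SELECT id FROM big_temp")
    conn.commit()
    while True:
        cur.execute("FETCH FORWARD %s FROM big", (batch,))
        rows = cur.fetchall()
        if not rows:
            break
        for (row_id,) in rows:
            cur.execute("SELECT heavy_work(%s)", (row_id,))  # hypothetical
        conn.commit()  # releases the locks this batch accumulated
    cur.execute("CLOSE big")
    conn.commit()
```

The blunter alternative is simply raising max_locks_per_transaction in postgresql.conf, at the cost of more shared memory.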

How do I use Psycopg2's LoggingConnection?

Submitted by 二次信任 on 2019-12-08 16:15:48

Question: I'd like to log the queries that psycopg2 is making, but the psycopg2 documentation doesn't really specify how LoggingConnection should be used.

    import logging
    from psycopg2.extras import LoggingConnection

    db_settings = {
        "user": "abcd",
        "password": "efgh",
        "host": "postgres.db",
        "database": "dev",
    }
    conn = LoggingConnection(**db_settings)

gives an error:

    LoggingConnection(**db_settings)
    TypeError: function takes at most 2 arguments (5 given)

Answer 1: Seems like setting the connection_factory …

psycopg2 fails on execute many statement with syntax error

Submitted by 北战南征 on 2019-12-08 12:37:13

Question: I have data coming from mongodb which looks like

    data = (
        {u'name': 'A', u'primary_key': 1},
        {u'name': 'B', u'primary_key': 2},
        {u'name': 'C', u'primary_key': 3},
    )

When I call the following:

    cur = conn.cursor()
    cur.executemany("""INSERT INTO dimension (id, name)
                       VALUES (%(primary_key)s, %(name)s)""", data)

it fails, saying:

    ProgrammingError: 'syntax error at or near """"INSERT INTO dimension (id, name) VALUES (1, E\'A\')""""\nLINE 1: """INSERT INTO dimension (id, name) VALUES (1, E\'A\n ^\n'
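The quadrupled quotes in the error message show that the triple quotes themselves reached the server as part of the statement, meaning the SQL string contained its own quoting (for instance because it was read from a file or wrapped twice). With a plain Python string, the named-placeholder form from the question works; a sketch (dimension is the question's table, the helper name is mine):

```python
data = (
    {"name": "A", "primary_key": 1},
    {"name": "B", "primary_key": 2},
    {"name": "C", "primary_key": 3},
)

def insert_dimension(conn, rows):
    # Triple quotes are ordinary Python string delimiters; they must
    # not survive into the SQL text that is sent to the server.
    sql = ("INSERT INTO dimension (id, name) "
           "VALUES (%(primary_key)s, %(name)s)")
    with conn.cursor() as cur:
        cur.executemany(sql, rows)
    conn.commit()
```

executemany() happily accepts a sequence of dicts like the mongodb documents above, matching each %(key)s placeholder by name.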

Importing JSON data file into PostgreSQL using Python and Psycopg2

Submitted by a 夏天 on 2019-12-08 07:54:57

Question: I am having trouble getting my query to work. I have a JSON file with over 80k lines of data. Since I have been having so many problems, I cut the document down to three lines just to see if I can get the data in before I attempt the full 80k lines:

    import psycopg2
    import io

    readTest1 = io.open(r"C:\Users\Samuel\Dropbox\Work\Python and Postgres\test1.json",
                        encoding="utf-8")
    readAll = readTest1.readlines()

I have seen online that using readlines is not the best method, but it is the only …