Importing JSON data file into PostgreSQL using Python and Psycopg2

Submitted by a 夏天 on 2019-12-08 07:54:57

Question


I am having trouble getting my query to work. I have a JSON file with over 80k lines of data. Since I have been having so many problems, I cut the file down to three lines just to see if I can get any data in before attempting the full 80k lines:

import psycopg2
import io
# Raw string so the backslashes in the Windows path are not treated as escapes
readTest1 = io.open(r"C:\Users\Samuel\Dropbox\Work\Python and Postgres\test1.json", encoding="utf-8")
readAll = readTest1.readlines()

I have seen online that using readlines is not the best method, but it is the only method I know. It reads the three lines in the file. I am not sure, but I expected this to produce an array as well.
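Assuming the file holds one JSON object per line (as the question implies), a minimal sketch of reading it without readlines is to iterate the file object directly; the sample data below is a stand-in, not the real file:

```python
import io

# Stand-in for the real file: one JSON object per line (assumed format).
sample = io.StringIO(u'{"id": 1}\n{"id": 2}\n{"id": 3}\n')

# Iterating the file object yields one line at a time without loading
# the whole file into memory; strip() removes the trailing "\n".
lines = [line.strip() for line in sample]
```

For an 80k-line file this streams the data instead of materializing every line up front, and the stripped lines avoid the stray `"\n"` mentioned below.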

conn = psycopg2.connect("dbname = python_trial user = postgres")
cur = conn.cursor()
cur.execute("CREATE TABLE test4 (data json);")

That creates a table with a single json column. Then I try to insert the lines:

cur.executemany("INSERT INTO test4 VALUES (%s)", readAll)

The error:

Traceback (most recent call last):
File "<pyshell#13>", line 1, in <module>
cur.executemany("INSERT INTO test4 VALUES (%s)", readAll)
TypeError: not all arguments converted during string formatting

I am not exactly sure what I am doing incorrectly. I also see "\n" when I print(readAll). I think that is caused by using the readlines method, and I am not sure whether it is breaking my query as well.
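The TypeError comes from the shape of the second argument: executemany expects a sequence of parameter sequences, one per row. A bare string is itself iterable, so psycopg2 treats its characters as parameters and finds more of them than the single %s placeholder. A sketch of the two shapes, using the question's readAll:

```python
# readAll as produced by readlines(): a flat list of strings.
readAll = ['{"id": 1}\n', '{"id": 2}\n']

# Wrong shape for executemany: each element is a string, which is
# iterated character by character -> "not all arguments converted".
wrong = readAll

# Right shape: each row's parameters wrapped in a one-element tuple.
params = [(line,) for line in readAll]
```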


Answer 1:


executemany expects a sequence of parameter tuples, one tuple per row, so wrap each line in a one-element tuple:

cur.executemany("INSERT INTO test4 VALUES (%s)", [(line,) for line in readAll])

For a single row, the same rule applies with execute: cur.execute("INSERT INTO test4 VALUES (%s)", (line,)). Avoid building the SQL with string formatting such as .format() — it breaks on quotes in the data and is open to SQL injection; always let psycopg2 do the parameter binding.

See: http://initd.org/psycopg/docs/usage.html#passing-parameters-to-sql-queries
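Putting it together, a sketch that validates each line with the json module and builds the parameter list; the database call is shown commented out since it needs a live connection, and the table name test4 comes from the question:

```python
import json

# Stand-in for readAll from the question: one JSON object per line.
readAll = ['{"id": 1}\n', '{"id": 2}\n', '{"id": 3}\n']

# json.loads raises ValueError on malformed input, so bad lines fail fast;
# json.dumps re-serializes cleanly, dropping the trailing newline.
params = [(json.dumps(json.loads(line)),) for line in readAll]

# With a real connection and cursor:
# cur.executemany("INSERT INTO test4 VALUES (%s)", params)
# conn.commit()
```

Validating up front means a syntax error on line 50,000 of the full file surfaces in Python, not as a PostgreSQL error mid-insert.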



Source: https://stackoverflow.com/questions/37667867/importing-json-data-file-into-postgresql-using-python-and-psycopg2
