I use PostgreSQL 9.4 for a model database. My table looks somewhat like this:
CREATE TABLE table1 (
    sid INTEGER PRIMARY KEY NOT NULL DEFAULT nextval('table1_sid_seq'::regclass),
    -- rest of the definition truncated in the original; a col3 JSONB column is implied by the UPDATE below
    col3 JSONB
);
My workaround is to slice putback into chunks with a simple function, as proposed here:
def chunk(l, n):
    # Split l into consecutive slices of at most n items each.
    n = max(1, n)
    return [l[i:i + n] for i in range(0, len(l), n)]
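For example (an illustrative check, not from the original post):

>>> chunk(list(range(7)), 3)
[[0, 1, 2], [3, 4, 5], [6]]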
and then
for batch in chunk(putback, 250000):
    curs.execute("""UPDATE table1
                    SET col3 = p.result
                    FROM unnest(%s) p(sid INT, result JSONB)
                    WHERE table1.sid = p.sid""", (batch,))
This works, i.e. it keeps the memory footprint in check, but it is not very elegant and it is slower than dumping all the data at once, as I usually do.
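One small refinement, sketched here as an idea rather than a tested fix: the chunking function can be written as a generator so that slices are produced lazily instead of materializing the full list of chunks up front (chunk_iter is a hypothetical name, not from the original):

def chunk_iter(l, n):
    # Yield consecutive slices of at most n items, one at a time,
    # rather than building the whole list of slices in memory.
    n = max(1, n)
    for i in range(0, len(l), n):
        yield l[i:i + n]

The update loop above works unchanged with chunk_iter(putback, 250000) in place of chunk(putback, 250000).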