Following on from this question, when I try to create a postgresql table from a dask.dataframe with more than one partition I get the following error:
Integr
I had the same error with PonyORM on PostgreSQL in Heroku. I solved it by holding a lock while executing the DB operation, so only one thread writes at a time. In my case:

    import threading

    lock = threading.Lock()
    with lock:
        PonyOrmEntity(name='my_name', description='description')
        PonyOrmEntity.get(lambda u: u.name == 'another_name')
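The same idea works with any DB-API driver. Here is a self-contained sketch of the technique, using the standard library's sqlite3 in place of PostgreSQL/PonyORM (the table and names are made up for illustration):

```python
import sqlite3
import threading

# One shared connection; check_same_thread=False lets several
# threads use it, and the lock below serializes their access.
db = sqlite3.connect(":memory:", check_same_thread=False)
db.execute("CREATE TABLE entity (name TEXT PRIMARY KEY)")

lock = threading.Lock()

def insert(name):
    # Only one thread at a time may touch the table,
    # so concurrent writers cannot collide.
    with lock:
        db.execute("INSERT INTO entity (name) VALUES (?)", (name,))
        db.commit()

threads = [threading.Thread(target=insert, args=(f"name-{i}",)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

count = db.execute("SELECT COUNT(*) FROM entity").fetchone()[0]
print(count)  # 5 rows inserted, no conflicts
```

Without the lock, two threads hitting the same connection at once can raise exactly the kind of integrity/concurrency error described above.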
I was reading this. It seems this error arises when you are creating/updating the same table with parallel processing. From what I understand (as explained in the Google Groups discussion), it depends on PostgreSQL itself and not on the connection driver or the module used for multiprocessing.
Actually, the only way I found to solve this was to create chunks big enough that the writing process is slower than the computation itself. With bigger chunks this error does not arise.