A good persistent synchronous queue in python

Submitted by 假如想象 on 2019-12-21 15:19:13

Question


I don't immediately care about FIFO or FILO ordering, but it might be nice in the future.

What I'm looking for is a nice, fast, simple way to store data on disk (at most a gig, or tens of millions of entries) that multiple processes can put to and get from. The entries are just simple 40-byte strings, not Python objects. I don't really need all the functionality of shelve.

I've seen this: http://code.activestate.com/lists/python-list/310105/. It looks simple, but it would need to be upgraded to the new Queue version.

I'm wondering if there's something better. I'm concerned that in the event of a power interruption, the entire pickled file would become corrupt instead of just one record.


Answer 1:


Try using Celery. It's not pure Python, as it uses RabbitMQ as a backend, but it's reliable, persistent, and distributed, and, all in all, far better than using files or a database in the long run.
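For illustration, a minimal sketch of this approach, assuming RabbitMQ is running locally and Celery is installed; the module name `tasks`, the broker URL, and the task body are assumptions, not part of the original answer:

    # tasks.py -- a minimal sketch: RabbitMQ holds each submitted entry
    # durably until a worker consumes it.
    from celery import Celery

    app = Celery("tasks", broker="amqp://guest@localhost//")

    @app.task
    def handle_entry(entry):
        # entry would be one of the 40-byte strings from the question
        print("processing", entry)

    if __name__ == "__main__":
        # Producer side: enqueue one entry. A worker started with
        # `celery -A tasks worker` will pick it up, even after a restart.
        handle_entry.delay("a" * 40)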




Answer 2:


I think PyBSDDB is what you want. You can choose a queue as the access type. PyBSDDB is a Python module based on Oracle Berkeley DB. It offers synchronized access, and the database can be opened by multiple processes, although I don't know whether that is exposed through the Python bindings. On multiple processes writing to the db, I found this thread.
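A minimal single-process sketch of the queue access type via the bsddb3 bindings; the file name is illustrative, and concurrent multi-process access would additionally need a shared DBEnv, which is omitted here:

    # Fixed-length persistent queue using Berkeley DB's DB_QUEUE access
    # method (requires `pip install bsddb3` and the Berkeley DB library).
    from bsddb3 import db

    q = db.DB()
    q.set_re_len(40)          # DB_QUEUE requires fixed-length records
    q.set_re_pad(0)           # pad shorter records with NUL bytes
    q.open("work.queue", dbtype=db.DB_QUEUE, flags=db.DB_CREATE)

    q.append(b"some-entry".ljust(40, b"\0"))   # enqueue one 40-byte record

    item = q.consume()        # dequeue; returns (record_number, data) or None
    if item is not None:
        recno, data = item
        print(recno, data.rstrip(b"\0"))

    q.close()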




Answer 3:


Do plain files not work for you?

Use a journaling file system to recover from power interruptions. That's what journaling is for.
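To make the file-based approach concrete, a minimal sketch of appending fixed-length records with an explicit fsync, so that after a power cut a journaling file system (ext4, NTFS, etc.) leaves at worst one truncated record rather than a corrupt pickle; the file name and helper are illustrative:

    # Append-only record log: each put() forces the record to disk.
    import os

    RECORD_LEN = 40

    def put(path, entry: bytes):
        assert len(entry) == RECORD_LEN
        with open(path, "ab") as f:
            f.write(entry)
            f.flush()
            os.fsync(f.fileno())   # ensure the bytes reach the disk

    put("queue.log", b"x" * RECORD_LEN)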



Source: https://stackoverflow.com/questions/8682249/a-good-persistent-synchronous-queue-in-python
