Share memory areas between celery workers on one machine

Submitted by 十年热恋 on 2019-12-21 12:06:30

Question


I want to share small pieces of information between my worker nodes (for example cached authorization tokens, statistics, ...) in Celery.

If I create a global inside my tasks file, it is unique per worker (my workers are processes and have a lifetime of one task/execution).

What is the best practice? Should I save the state externally (in a DB), or create old-fashioned shared memory (which could be difficult because of the different pool implementations in Celery)?

Thanks in advance!


Answer 1:


I finally found a decent solution - core Python's multiprocessing.Manager:

from multiprocessing import Manager

# Create the Manager (and its server process) before the worker
# processes fork, so every child gets proxies to the same shared objects.
manag = Manager()
serviceLock = manag.Lock()
serviceStatusDict = manag.dict()

This dict can be accessed from every process and is synchronized, but you have to hold the lock when accessing it concurrently (as with any other shared-memory implementation).
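A minimal, self-contained sketch of the pattern above (outside Celery, using plain multiprocessing.Process; the function and key names are illustrative, not from the original answer). Each child process increments a shared counter under the lock, so no updates are lost:

```python
from multiprocessing import Manager, Process

def record_visit(shared, lock, worker_id):
    # Read-modify-write on a Manager dict is NOT atomic,
    # so take the lock around the whole update.
    with lock:
        shared["visits"] = shared.get("visits", 0) + 1
        shared["worker-%d" % worker_id] = "done"

def main():
    manager = Manager()          # must exist before the children start
    shared = manager.dict()
    lock = manager.Lock()

    procs = [Process(target=record_visit, args=(shared, lock, i))
             for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    # Take a plain-dict snapshot; the proxies die with the Manager.
    return dict(shared)

if __name__ == "__main__":
    result = main()
    print(result["visits"])  # 4
```

Note that in a Celery deployment the Manager would have to be created at module import time (before the prefork pool spawns its children), and this only shares state between workers on one machine; for cross-machine sharing an external store is still needed.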



Source: https://stackoverflow.com/questions/9565542/share-memory-areas-between-celery-workers-on-one-machine
