Share a numpy array in gunicorn processes

Submitted by 我是研究僧i on 2019-11-30 23:37:49

Question


I have a big numpy array stored in Redis. This array acts as an index. I want to serve filtered results over HTTP from a Flask app running on gunicorn, and I want every worker spawned by gunicorn to have access to this numpy array. I don't want to go to Redis and deserialize the entire array into memory every time; instead, on startup I want to run some code that does this once, so that every forked gunicorn worker just gets a copy of the array. The problem is that I cannot find any examples of how to use gunicorn's server hooks (http://docs.gunicorn.org/en/latest/configure.html#server-hooks) to achieve this. Maybe server hooks are not the right way of doing it; has anyone else done something similar?
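The load-once-then-fork idea described above can be sketched as a gunicorn config file. This is only a sketch under assumptions not in the question: `myapp` and its `load_index()` helper are hypothetical names, and it relies on gunicorn's fork-based worker model, where memory pages loaded in the master before forking are shared copy-on-write with the workers.

```python
# gunicorn.conf.py -- sketch, not a tested deployment config.
# `myapp` and `load_index()` are hypothetical placeholders for your
# Flask module and whatever code deserializes the array from Redis.
import myapp

# Load the application (and anything at its module level) once in the
# master process before any worker is forked.
preload_app = True

def on_starting(server):
    # Server hook: runs once in the master, before workers are forked,
    # so every forked worker inherits a copy-on-write view of the array.
    myapp.INDEX = myapp.load_index()
```

Note that copy-on-write sharing only helps as long as the workers treat the array as read-only; writes (or anything that touches the pages, such as reference-count churn on Python objects) will fault private copies into each worker.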


Answer 1:


Create a Listener server in the main process and have your gunicorn children connect to it as Clients to fetch whatever data they need. That way the main process can modify the data as needed and the workers can request it from there, instead of going back to Redis to reload the entire dataset.

More info here: Multiprocessing - 16.6.2.10. Listeners and Clients.
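A minimal sketch of that Listener/Client pattern, assuming the standard-library `multiprocessing.connection` module; the address, auth key, and function names here are illustrative choices, not part of the answer:

```python
from multiprocessing import Process
from multiprocessing.connection import Client, Listener

import numpy as np

# Hypothetical address and auth key; in a real deployment pick an
# unused port (or a unix domain socket path) and a proper secret.
ADDRESS = ("localhost", 6123)
AUTHKEY = b"change-me"


def serve_array(array, n_requests):
    """Run in the main process: hold the array and answer index requests."""
    with Listener(ADDRESS, authkey=AUTHKEY) as listener:
        for _ in range(n_requests):
            with listener.accept() as conn:
                request = conn.recv()      # an index, slice, or boolean mask
                conn.send(array[request])  # only the filtered result travels


def fetch(request):
    """Run in a worker: ask the main process for a filtered view."""
    with Client(ADDRESS, authkey=AUTHKEY) as conn:
        conn.send(request)
        return conn.recv()
```

In a gunicorn setup, the `serve_array` process would be started from a server hook in the master, and each worker would call something like `fetch` instead of deserializing the whole array from Redis.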



Source: https://stackoverflow.com/questions/16366124/share-a-numpy-array-in-gunicorn-processes
