How can I automatically reload tasks modules with Celery daemon?

Submitted by 陌路散爱 on 2019-12-05 11:40:11

Question


I am using Fabric to deploy a Celery broker (running RabbitMQ) and multiple Celery workers with celeryd daemonized through supervisor. I cannot for the life of me figure out how to reload the tasks.py module short of rebooting the servers.


/etc/supervisor/conf.d/celeryd.conf

[program:celeryd]
directory=/fab-mrv/celeryd
environment=[RABBITMQ credentials here]
command=xvfb-run celeryd --loglevel=INFO --autoreload
autostart=true
autorestart=true

celeryconfig.py

import os

## Broker settings
BROKER_URL = "amqp://%s:%s@hostname" % (os.environ["RMQU"], os.environ["RMQP"])

# List of modules to import when celery starts.
CELERY_IMPORTS = ("tasks", )

## Using AMQP to store task state and results.
CELERY_RESULT_BACKEND = "amqp"

CELERYD_POOL_RESTARTS = True
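
The tasks module itself isn't shown here; a minimal tasks.py matching CELERY_IMPORTS would look something like the following (a hypothetical sketch, with add standing in for the real task functions):

tasks.py

from celery import task

# Stand-in task; the real module defines the actual work functions.
@task()
def add(x, y):
    return x + y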

Additional information

  • celery --version: 3.0.19 (Chiastic Slide)
  • python --version: 2.7.3
  • lsb_release -a: Ubuntu 12.04.2 LTS
  • rabbitmqctl status: ... 2.7.1 ...

Here are some things I have tried:

  • The celeryd --autoreload flag
  • sudo supervisorctl restart celeryd
  • celery.control.broadcast('pool_restart', arguments={'reload': True}) (see the sketch after this list)
  • ps auxww | grep celeryd | grep -v grep | awk '{print $2}' | xargs kill -HUP

And unfortunately, nothing causes the workers to reload the tasks.py module (e.g. after running git pull to update the file). The gist of the relevant fab functions is available here.
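For the pool_restart attempt specifically, the Celery 3.0 control docs also accept a modules argument naming which modules to reimport. A minimal sketch of the full call (the app setup here is assumed; it is not shown in the question):

from celery import Celery

# Assumed setup; configure the app from the same celeryconfig module.
celery = Celery()
celery.config_from_object("celeryconfig")

# pool_restart requires CELERYD_POOL_RESTARTS = True on the workers
# (set in celeryconfig.py above). 'modules' names what to reimport,
# and 'reload' forces a reload even if the module is already loaded.
celery.control.broadcast(
    "pool_restart",
    arguments={"modules": ["tasks"], "reload": True},
    reply=True,
)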

The brokers/workers run fine after a reboot.


Answer 1:


Just a shot in the dark: with the celeryd --autoreload option, did you make sure you have one of the file system notification backends installed? On Linux it relies on inotify via the pyinotify library, so I'd start by making sure that's installed; without it, autoreload falls back to slow stat-based polling.
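
A quick way to verify that from Python on a worker host (a minimal sketch; the messages are illustrative):

try:
    import pyinotify  # provided by: pip install pyinotify
except ImportError:
    print("pyinotify missing - autoreload falls back to stat() polling")
else:
    print("pyinotify found at " + pyinotify.__file__)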



Source: https://stackoverflow.com/questions/16929264/how-can-i-automatically-reload-tasks-modules-with-celery-daemon
