Celery daemon - how to configure it to run multiple tasks from multiple Flask applications?

Submitted by 流过昼夜 on 2020-02-23 07:24:49

Question


I have a Flask app myapp_A that uses Celery to run some asynchronous tasks, and I have configured Celery to run as a daemon process. Here are the daemon configuration files.

/etc/default/celery:

# Name of nodes to start
CELERYD_NODES="w1"

# Absolute or relative path to the 'celery' command:
CELERY_BIN="/var/www/myapp_A.com/public_html/venv/bin/celery"

# App instance to use
CELERY_APP="myapp_A.celery"

# Where to chdir at start.
CELERYD_CHDIR="/var/www/myapp_A.com/public_html/"

# Extra command-line arguments to the worker
CELERYD_OPTS="--time-limit=300 --concurrency=8"

# %n will be replaced with the first part of the nodename.
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"
CELERYD_LOG_LEVEL="INFO"

# Workers should run as an unprivileged user.
CELERYD_USER="myuser"
CELERYD_GROUP="www-data"

# If enabled pid and log directories will be created if missing,
# and owned by the userid/group configured.
CELERY_CREATE_DIRS=1

/etc/init.d/celeryd:

The generic celeryd init script from the Celery documentation.


Now I have another Flask app myapp_B that requires Celery to run tasks as well.

  • How should I configure Celery for this?
  • Should I create another daemon process under a different name?
  • How should I configure my message broker (RabbitMQ) for multiple Celery processes?

Answer 1:


You can use a single daemon process to serve both applications. One way is to use different queue names for different applications. Here is the config I'm using:

celery worker -A init_celery --quiet --loglevel=$WORKER_LOG_LEVEL --concurrency=4 --queues=que1,que2
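
As a sketch of how this maps onto the daemonized setup from the question, the queue list can go into CELERYD_OPTS in /etc/default/celery (the queue names que1 and que2 are assumed from the command above, not from the original post):

# Extra command-line arguments to the worker.
# --queues makes this single worker consume both applications' queues.
CELERYD_OPTS="--time-limit=300 --concurrency=8 --queues=que1,que2"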

Then specify the queue name in each application using:

CELERY_DEFAULT_QUEUE = 'que1'
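
For illustration, here is a minimal sketch of the per-application setup; the task name, broker URL, and app structure are assumptions, not from the original post. myapp_B would mirror this with que2:

from flask import Flask
from celery import Celery

app = Flask(__name__)

# Assumed RabbitMQ broker URL; both applications can point at the same broker,
# since the separate queues keep their tasks apart.
celery = Celery(app.import_name, broker='amqp://guest@localhost//')

# CELERY_DEFAULT_QUEUE is the old-style setting name used in this answer;
# on Celery 4+ the lowercase equivalent is task_default_queue.
celery.conf.update(CELERY_DEFAULT_QUEUE='que1')

@celery.task
def example_task(x, y):
    # Hypothetical task for myapp_A; it is routed to que1 by default.
    return x + y

No special RabbitMQ setup should be needed beyond sharing the broker URL: Celery declares the queues on the broker automatically, and the worker started with --queues=que1,que2 consumes tasks from both.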


Source: https://stackoverflow.com/questions/59466401/celery-daemon-how-to-configure-it-to-run-multiple-tasks-from-multiple-flask-ap
