How to keep multiple independent celery queues?

不思量自难忘° 2020-12-07 20:34

I'm trying to keep multiple Celery queues with different tasks and workers in the same Redis database. Really just a convenience issue of only wanting one Redis server rather…

2 Answers
  • 2020-12-07 21:18

    By default, everything goes into a default queue named celery (and this is what celery worker will process if no queue is specified).

    So say you have your do_work task function in django_project_root/myapp/tasks.py.

    You could configure the do_work task to live in its own queue like so:

    CELERY_ROUTES = {
        'myapp.tasks.do_work': {'queue': 'red'},
    }
    

    Then run a worker using celery worker -Q red, and it will only process things in that queue (another worker invoked with plain celery worker will only pick up things in the default queue).

    The task routing section in the documentation should explain all.
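    The lookup behind this is essentially a name-to-queue mapping. As a minimal pure-Python sketch (not Celery's actual router, which also supports patterns and router classes), resolving a task name against a CELERY_ROUTES-style dict might look like this; the resolve_queue helper is hypothetical, not part of Celery's API:

    ```python
    # Hypothetical sketch of CELERY_ROUTES-style resolution.
    CELERY_ROUTES = {
        'myapp.tasks.do_work': {'queue': 'red'},
    }

    DEFAULT_QUEUE = 'celery'  # Celery's built-in default queue name


    def resolve_queue(task_name, routes=CELERY_ROUTES, default=DEFAULT_QUEUE):
        """Return the queue a task name routes to, falling back to the default."""
        route = routes.get(task_name)
        if route and 'queue' in route:
            return route['queue']
        return default


    print(resolve_queue('myapp.tasks.do_work'))  # routed task -> 'red'
    print(resolve_queue('myapp.tasks.other'))    # unrouted task -> 'celery'
    ```

    A worker started with -Q red would then consume only the first task; the second stays on the default queue.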

  • 2020-12-07 21:33

    To route tasks to different queues dynamically, follow the steps below:

    1) Specify the name of the queue with the queue argument:

    celery.send_task('job1', args=[], kwargs={}, queue='queue_name_1')
    celery.send_task('job1', args=[], kwargs={}, queue='queue_name_2')
    

    (Here the same task is sent to two different queues.)

    2) Add the following entry in the configuration file

    CELERY_CREATE_MISSING_QUEUES = True
    

    3) While starting the worker, use -Q to specify the name of the queue from which jobs should be consumed:

    celery -A proj worker -l info -Q queue1 
    celery -A proj worker -l info -Q queue2
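
    The steps above can be sketched without a live broker. The pick_queue helper below is a hypothetical illustration of choosing a queue name at call time, and send_task_stub stands in for celery.send_task (which in real code would need a running Celery app and broker):

    ```python
    # Illustrative sketch of dynamic queue selection; queue names mirror
    # the answer above and are assumptions, not a Celery convention.
    def pick_queue(priority):
        """Map a priority flag to a queue name (assumed naming scheme)."""
        return 'queue_name_1' if priority == 'high' else 'queue_name_2'


    def send_task_stub(name, queue):
        """Stand-in for celery.send_task(name, queue=queue): just records
        where the job would go, since no broker is running here."""
        return {'task': name, 'queue': queue}


    dispatched = [
        send_task_stub('job1', queue=pick_queue('high')),
        send_task_stub('job1', queue=pick_queue('low')),
    ]
    print(dispatched)
    ```

    With CELERY_CREATE_MISSING_QUEUES = True, any queue name chosen this way is created on first use, so workers started with -Q queue_name_1 or -Q queue_name_2 each see only their own jobs.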
    