celery how to implement single queue with multiple workers executing in parallel


Question


I am currently running celery 4.0.2 with a single worker like this:

celery.py:

from celery import Celery

# Celery app configured with a RabbitMQ broker and the RPC result backend
app = Celery('project',
             broker='amqp://jimmy:jimmy123@localhost/jimmy_vhost',
             backend='rpc://',
             include=['project.tasks'])

if __name__ == '__main__':
    app.start()

tasks.py:

from .celery import app
from celery.schedules import schedule
from time import sleep, strftime

app.conf.beat_schedule = {
    'planner_1': {
        'task': 'project.tasks.call_orders',
        'schedule': 1800,  # interval in seconds: every 30 minutes
    },
    'planner_2': {
        'task': 'project.tasks.call_inventory',
        'schedule': 900,  # interval in seconds: every 15 minutes
    },
}
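The task bodies themselves are not shown above; for illustration only, minimal stubs matching the task paths in the schedule might look like this (the bodies are placeholders, not my real implementation):

@app.task
def call_orders():
    # placeholder body; the real logic is not shown here
    print(strftime('%H:%M:%S'), 'running call_orders')

@app.task
def call_inventory():
    # placeholder body; the real logic is not shown here
    print(strftime('%H:%M:%S'), 'running call_inventory')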

I used the following command to run with beat:

 celery -A project worker -l info --concurrency=3 --beat -E

Right now there is only a single queue with one worker running.

My question is: how do I run Celery with multiple workers and a single queue, so that tasks are executed in parallel using multiprocessing without being duplicated?
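Concretely, something like the following sketch is what I have in mind (worker names and concurrency values are arbitrary): several workers all consuming from the same default queue, with beat running as a single separate process so that each scheduled task is sent only once. Is that the right approach?

 celery -A project worker -l info -c 3 -n worker1@%h
 celery -A project worker -l info -c 3 -n worker2@%h
 celery -A project beat -l info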

I looked up on the internet how to run Celery with multiprocessing. According to this article:

celery worker -l info -P processes -c 16 will result in a single message consumer delegating work to 16 OS-level pool processes. Each OS-level process can be assigned to different CPU in a multicore environment, and as such it will process tasks in parallel, but it will not consume messages in parallel.

Can using the -P processes argument solve my problem? Also, what is meant by "it will process tasks in parallel, but it will not consume messages in parallel"?
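(For context, I believe that in Celery 4 the processes pool is called prefork, so the equivalent of the quoted command on my version would presumably be:)

 celery -A project worker -l info -P prefork -c 16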

Source: https://stackoverflow.com/questions/42806765/celery-how-to-implement-single-queue-with-multiple-workers-executing-in-parallel
