Celery with RabbitMQ creates multiple results queues


Question


I have installed Celery with RabbitMQ. The problem is that for every result that is returned, Celery creates a queue in RabbitMQ named after the task's ID, bound to the celeryresults exchange.

I still want to keep the results, but in ONE queue.

My celeryconfig:

from datetime import timedelta
BROKER_URL = 'amqp://'
CELERY_RESULT_BACKEND = 'amqp'
#CELERY_IGNORE_RESULT = True
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json', 'application/json']
CELERY_TIMEZONE = 'Europe/Oslo'
CELERY_ENABLE_UTC = True

from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    'every-minute': {
        'task': 'tasks.remote',
        'schedule': timedelta(seconds=30),
        'args': (),
    },
}

Is that possible? How?

Thanks!


Answer 1:


The amqp backend creates a new queue for each task. Alternatively, there is a newer rpc backend, which keeps results in a single queue.

http://docs.celeryproject.org/en/master/whatsnew-3.1.html#new-rpc-result-backend
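A minimal sketch of the switch, reusing the asker's old-style setting names (the 'rpc://' backend URL and CELERY_RESULT_PERSISTENT are both from the Celery 3.1 docs):

BROKER_URL = 'amqp://'
# One queue per client instead of one queue per task result
CELERY_RESULT_BACKEND = 'rpc://'
# Optional: keep result messages across broker restarts
#CELERY_RESULT_PERSISTENT = True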




Answer 2:


Nothing unusual.

That is how Celery works when you use amqp as the result backend: it creates a new temporary queue for every result, one per task the worker consumes.

If you are not interested in the results, you can set CELERY_IGNORE_RESULT = True.

If you do want to store the results, then I would recommend using a different result backend, such as Redis.
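For example, a minimal Redis-backed config, assuming a Redis server on localhost at the default port (adjust host, port, and database number to your setup):

BROKER_URL = 'amqp://'
# Results are stored as Redis keys (one per task id), so no extra queues appear
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'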




Answer 3:


You say you want Celery to keep the result on one queue. Now, to answer your question, let me ask you one:

How do you expect each producer to check for its relevant result without reading every single message off the queue to find the one it needs/wants?

In essence, what you want is a database of key-value pairs so that the lookup is O(1). The only way to do that with a queue broker is to create one queue for each "pair".

I understand that having many GUID queues is not neat or pretty, but it's conceptually the only way to do it on a messaging broker.
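To make that concrete: with a key-value backend, the producer looks its result up directly by task id. A minimal sketch, assuming app is your Celery instance and task_id was saved from an earlier .delay() call:

from celery.result import AsyncResult

# The task id is the key; the backend resolves it in O(1),
# no queue scanning required.
result = AsyncResult(task_id, app=app)
value = result.get(timeout=10)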




Answer 4:


This solution won't keep all the results in ONE queue, but it will at least clean up the extra queues as soon as you're done with them.

If you use Redis as your backend, then when you're done with a result that has created an errant queue, run result.forget(). This will cause both the result and its queue to disappear, which helps you manage the number of queues and prevents out-of-memory issues.
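A minimal usage sketch (tasks.remote is the task name from the asker's config; the import path is assumed):

from tasks import remote  # hypothetical import matching 'tasks.remote'

res = remote.delay()
print(res.get(timeout=10))  # wait for and read the result
res.forget()                # discard the stored result once you're done with it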



Source: https://stackoverflow.com/questions/20998658/celery-with-rabbitmq-creates-results-multiple-queues
