Why does Celery add thousands of queues to RabbitMQ that seem to persist long after the tasks complete?

萌比男神i 2020-12-28 17:29

I am using Celery with a RabbitMQ result backend. It is producing thousands of queues in RabbitMQ, each containing 0 or 1 items, like this:

$ sudo rabbitmqctl list_queues
2 Answers
  • 2020-12-28 17:48

    Use CELERY_TASK_RESULT_EXPIRES (or, on Celery 4.1+, result_expires) to have a periodic cleanup task remove old result data from RabbitMQ.

    http://docs.celeryproject.org/en/master/userguide/configuration.html#std:setting-result_expires
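
    For example, a minimal configuration sketch (the lowercase setting name is the Celery 4+ style; the broker URL, app name, and the one-hour value are placeholders, not from the original answer):

        # Expire stored results so their queues get cleaned up.
        # On Celery < 4, set CELERY_TASK_RESULT_EXPIRES instead.
        from celery import Celery

        app = Celery('demo',
                     broker='amqp://guest@localhost//',  # placeholder
                     backend='rpc://')

        # Results older than one hour are expired; with an AMQP-based
        # result backend this is applied as a TTL on the result queues.
        app.conf.result_expires = 3600  # seconds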

  • 2020-12-28 17:54

    Celery with the AMQP result backend stores task tombstones (results) in an AMQP queue named after the ID of the task that produced the result. These queues persist even after the results have been drained.
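
    To see this concretely, here is a hedged sketch assuming the legacy amqp result backend (deprecated and removed in newer Celery releases; broker URL and task are placeholders):

        # Each stored result ends up in its own queue derived from the
        # task ID, which then lingers in RabbitMQ.
        from celery import Celery

        app = Celery('demo',
                     broker='amqp://guest@localhost//',  # placeholder
                     backend='amqp')  # legacy AMQP result backend

        @app.task
        def add(x, y):
            return x + y

        if __name__ == '__main__':
            result = add.delay(2, 2)
            # A queue named after this ID now shows up in
            # `sudo rabbitmqctl list_queues` and persists after the
            # result is fetched, until it expires or is deleted.
            print(result.id)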

    A couple recommendations:

    • Apply ignore_result=True to every task you can. Don't depend on results from other tasks.
    • Switch to a different backend (perhaps Redis -- it's more efficient anyway): http://docs.celeryproject.org/en/latest/userguide/tasks.html
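
    A minimal sketch of both suggestions (the Redis URL, app name, and task are placeholders, not part of the original answer):

        # Ignore results where possible and keep result storage off RabbitMQ.
        from celery import Celery

        app = Celery('demo',
                     broker='amqp://guest@localhost//',    # placeholder
                     backend='redis://localhost:6379/0')   # Redis result backend

        @app.task(ignore_result=True)   # no result is stored for this task
        def send_email(address):
            ...

        # Or disable result storage globally and opt back in per task:
        app.conf.task_ignore_result = True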