Temporary queue made in Celery

既然无缘 2020-12-14 08:53

I am using Celery with RabbitMQ. Lately, I have noticed that a large number of temporary queues are getting made.

So, I experimented and found that whenever a task fails, a temporary queue with a randomly generated name is created and left behind. Why is this happening, and how can I avoid it?

5 Answers
  •  一个人的身影
    2020-12-14 09:24

    It sounds like you're using amqp as the result backend. From the docs, here are the pitfalls of that particular setup:

    • Every new task creates a new queue on the server. With thousands of tasks, the broker may be overloaded with queues and this will negatively affect performance. If you're using RabbitMQ, each queue will be a separate Erlang process, so if you're planning to keep many results simultaneously you may have to increase the Erlang process limit and the maximum number of file descriptors your OS allows.
    • Old results will not be cleaned automatically, so you must make sure to consume the results or else the number of queues will eventually go out of control. If you’re running RabbitMQ 2.1.1 or higher you can take advantage of the x-expires argument to queues, which will expire queues after a certain time limit after they are unused. The queue expiry can be set (in seconds) by the CELERY_AMQP_TASK_RESULT_EXPIRES setting (not enabled by default).
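    The second point above can be addressed in configuration. Here is a minimal sketch of an old-style (Celery 2.x) config module using the setting the docs mention; the broker URL is a placeholder, and the setting was later renamed `CELERY_TASK_RESULT_EXPIRES` / `result_expires` in newer Celery versions:

    ```python
    # celeryconfig.py -- sketch assuming Celery 2.x-era settings and RabbitMQ >= 2.1.1
    BROKER_URL = "amqp://guest:guest@localhost:5672//"  # placeholder broker

    # amqp result backend: one queue is created per task result
    CELERY_RESULT_BACKEND = "amqp"

    # Ask RabbitMQ to set x-expires on result queues so unused ones
    # are deleted automatically after one hour (value is in seconds).
    CELERY_AMQP_TASK_RESULT_EXPIRES = 3600
    ```

    With this in place, result queues that nobody consumes are dropped by the broker after the expiry window instead of accumulating forever.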

    From what I've read in the changelog, this is no longer the default backend in versions >=2.3.0 because users were getting bitten by this behavior. I'd suggest changing the result backend if this is not the functionality you need.
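    Switching backends is a one-line change when creating the app. A sketch assuming a locally running Redis instance (the URLs are placeholders; any supported backend such as a database would work the same way):

    ```python
    from celery import Celery

    # Keep RabbitMQ as the broker, but store results in Redis so no
    # per-result AMQP queues are created on the broker.
    app = Celery(
        "tasks",
        broker="amqp://guest:guest@localhost:5672//",  # placeholder broker URL
        backend="redis://localhost:6379/0",            # placeholder Redis URL
    )

    @app.task
    def add(x, y):
        return x + y
    ```

    Results then live as Redis keys with their own TTL, so a failed task no longer leaves a stray queue behind in RabbitMQ.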
