Question
Using Celery (3.1.8+) with Django 1.6, all tasks are defined to ignore results (is this the correct syntax?):
@shared_task(ignore_result=True)
def somefunc():
    pass
When I look at the RabbitMQ queues, I see more and more queues created by Celery with names like:
19926fa9965e40c19ed9640c2b42ce1e
each containing one message similar to the following:
correlation_id: 19926fa9-965e-40c1-9ed9-640c2b42ce1e
priority: 0
delivery_mode: 2
headers:
content_encoding: binary
content_type: application/x-python-serialize
Payload: 118 bytes
Encoding: base64
gAJ9cQEoVQZzdGF0dXNxAlUHU1VDQ0VTU3EDVQl0cmFjZWJhY2txBE5VBnJlc3VsdHEFTlUHdGFza19pZHEGVSQxOTkyNmZhOS05NjVlLTQwYzEtOWVkOS02NDBjMmI0MmNlMWVxB1UIY2hpbGRyZW5xCF11Lg==
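As a side note, the payload above can be inspected with nothing but the standard library. A minimal sketch (the base64 string is copied verbatim from the message above, with the line-break whitespace removed) that unpickles it:

```python
import base64
import pickle

# Base64 payload copied from the RabbitMQ message shown above.
payload = (
    "gAJ9cQEoVQZzdGF0dXNxAlUHU1VDQ0VTU3EDVQl0cmFjZWJhY2txBE5VBnJlc3VsdHEF"
    "TlUHdGFza19pZHEGVSQxOTkyNmZhOS05NjVlLTQwYzEtOWVkOS02NDBjMmI0MmNlMWVx"
    "B1UIY2hpbGRyZW5xCF11Lg=="
)

# application/x-python-serialize means pickle; decode then unpickle.
result = pickle.loads(base64.b64decode(payload))
print(result)
```

The unpickled object is a dict holding a stored task result (status SUCCESS, result None), and its task_id with the hyphens stripped is exactly the queue name shown earlier, so each of these queues appears to hold the result of a single task.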
If I set celery to "always eager" mode then the problem is solved, but this is obviously not a good solution for a production server.
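For reference, "always eager" mode in Celery 3.1 with Django is a single settings flag (a config fragment; CELERY_ALWAYS_EAGER is the 3.x setting name):

```python
# settings.py
# Runs tasks synchronously in-process, bypassing the broker entirely
# (which is why the stray queues disappear). Not suitable for production.
CELERY_ALWAYS_EAGER = True
```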
Any clues? Is this connected to the ignore_result option? Is there some missing task somewhere? Something else?
Thanks for the help
Answer 1:
I'm not sure how Django or your code defines the @shared_task decorator, but have you tried explicitly setting the queue within it?
@shared_task(ignore_result=True, queue="myexamplequeue")
This would route the message to myexamplequeue whenever you call .delay() or .apply_async() on somefunc.
Source: https://stackoverflow.com/questions/24910613/celery-keeps-creating-rabbitmq-queues-pilling-all-over