Where does Django-celery/RabbitMQ store task results?


Question


My Celery result backend settings are:

CELERY_RESULT_BACKEND = "database"
CELERY_RESULT_DBURI = "mysqlite.db"

I am using RabbitMQ as my message broker.

It doesn't seem like any results are being stored in the database, and yet I can read the results after a task completes. Are they in memory, or in a RabbitMQ cache?

I haven't tried reading the same result multiple times, so maybe it's read once and then poof!
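One quick way to narrow this down is to check which result backend Celery actually resolved from the settings. A minimal sketch (assumes it runs inside the same environment that loads your Django/Celery configuration; the old-style uppercase setting name matches the question's Celery 2.x-era config):

from celery import current_app

# The backend instance Celery built from your settings; its class name tells you
# where results really go (e.g. a database backend vs. the AMQP backend).
print(current_app.backend)
print(current_app.conf.CELERY_RESULT_BACKEND)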


Answer 1:


CELERY_RESULT_DBURI is for the SQLAlchemy result backend, not the Django one. The Django backend always uses the default database configured in the DATABASES setting (or the DATABASE_* settings on older Django versions).
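To make the distinction concrete, here is a minimal sketch of the two configurations (setting names as used in Celery 2.x / django-celery; the SQLite paths are just examples):

# Option A: SQLAlchemy result backend -- the one that reads CELERY_RESULT_DBURI.
# Note it expects a full SQLAlchemy URI, not a bare filename.
CELERY_RESULT_BACKEND = "database"
CELERY_RESULT_DBURI = "sqlite:///mysqlite.db"

# Option B: django-celery's Django ORM backend -- CELERY_RESULT_DBURI is ignored;
# results are written through whatever Django's DATABASES setting points at.
import djcelery
djcelery.setup_loader()

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": "mysqlite.db",
    },
}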




Answer 2:


My Celery daemons work just fine, but I'm having difficulty collecting task results: task_result.get() leads to a timeout, and task.state is always PENDING even though the jobs complete. I tried separate SQLite databases and a single Postgres database shared by the workers, but I still can't get results. CELERY_RESULT_DBURI seems useless to me (on Celery 2.5); I think it's a newer configuration option. Any suggestions are welcome...


EDIT: it was all my fault: I was passing extra options to my tasks through the decorator, and the ignore_result=True parameter caused this problem. I removed that key and it works like a charm :)
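For anyone hitting the same symptom: with ignore_result=True the worker never stores a return value, so the AsyncResult stays PENDING and .get() times out. A minimal sketch (modern Celery app style; the names and broker/backend URLs are illustrative):

from celery import Celery

app = Celery("proj",
             broker="amqp://guest@localhost//",
             backend="db+sqlite:///results.db")

@app.task(ignore_result=True)
def fire_and_forget(x, y):
    return x + y      # return value is discarded; .get() on this would time out

@app.task             # default ignore_result=False: the result is stored
def tracked(x, y):
    return x + y

# tracked.delay(2, 2).get(timeout=10)          -> 4
# fire_and_forget.delay(2, 2).get(timeout=10)  -> times out, state stays PENDING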



Source: https://stackoverflow.com/questions/10238477/where-does-django-celery-rabbitmq-store-task-results
