Question
My Celery database backend settings are:
CELERY_RESULT_BACKEND = "database"
CELERY_RESULT_DBURI = "mysqlite.db"
I am using RabbitMQ as my message broker.
It doesn't seem like any results are getting stored in the database, and yet I can read the results after the task is complete. Are they in memory, or in some RabbitMQ cache?
I haven't tried reading the same result multiple times, so maybe it's read once and then poof!
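One quick way to test that "read once" theory (a minimal sketch; the add task and the tasks module are hypothetical) is to fetch the same result twice through fresh AsyncResult handles. With a working database backend both reads succeed; with the old AMQP result backend the second read can block, because each result message is consumed only once:

from celery.result import AsyncResult
from tasks import add  # hypothetical task module

task_id = add.delay(2, 2).task_id
print(AsyncResult(task_id).get(timeout=10))  # first read
print(AsyncResult(task_id).get(timeout=10))  # second read: may hang on the AMQP backend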
Answer 1:
CELERY_RESULT_DBURI is used by the SQLAlchemy result backend, not the Django one. The Django backend always uses the default database configured in the DATABASES setting (or the DATABASE_* settings on older Django versions).
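Concretely, a minimal sketch of the two configurations (the sqlite3 engine and file name here just mirror the question; adjust for your setup). With django-celery, results go through the Django ORM into the default database, and CELERY_RESULT_DBURI is ignored:

# settings.py with django-celery: results are stored via the Django ORM
CELERY_RESULT_BACKEND = "database"
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": "mysqlite.db",  # results land in the celery_taskmeta table here
    }
}

# Standalone Celery with the SQLAlchemy result backend instead:
CELERY_RESULT_BACKEND = "database"
CELERY_RESULT_DBURI = "sqlite:///mysqlite.db"  # a full SQLAlchemy URL, not a bare file name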
Answer 2:
My Celery daemons work just fine, but I'm having difficulty collecting task results: task_result.get() leads to a timeout, and task.state is always PENDING (even though the jobs are completed). I tried separate SQLite databases and a single PostgreSQL database shared by the workers, but I still can't get results. CELERY_RESULT_DBURI seems useless to me (on Celery 2.5; I think it's a newer setting). Any suggestions are welcome...
EDIT: it's all my fault: I was passing extra parameters to my tasks via decorators, and the ignore_result=True parameter caused this problem. I deleted that key and it works like a charm :)
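For reference, a minimal sketch of that pitfall (Celery 2.x-style decorator; the task names are hypothetical): with ignore_result=True the worker discards the return value, so get() times out and the state stays PENDING:

from celery.task import task

@task(ignore_result=True)  # result is never written to the backend -> get() times out
def broken_add(x, y):
    return x + y

@task  # default ignore_result=False: result is stored and retrievable
def working_add(x, y):
    return x + y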
Source: https://stackoverflow.com/questions/10238477/where-does-django-celery-rabbitmq-store-task-results