Celery task results not persisted with rpc


Question


I have been trying to route Celery task results to another process by persisting them to a queue that the other process can consume from. I configured Celery with CELERY_RESULT_BACKEND = 'rpc', but the value returned by the Python function is still not persisted to any queue.

I am not sure whether any other configuration or code change is required. Please help.

Here is the code example:

celery.py

from __future__ import absolute_import

from celery import Celery

app = Celery('proj',
             broker='amqp://',
             backend='rpc://',
             include=['proj.tasks'])

# Optional configuration, see the application user guide.
app.conf.update(
    CELERY_RESULT_BACKEND = 'rpc',
    CELERY_RESULT_PERSISTENT = True,
    CELERY_TASK_SERIALIZER = 'json',
    CELERY_RESULT_SERIALIZER = 'json'
)

if __name__ == '__main__':
    app.start()

tasks.py

from proj.celery import app

@app.task
def add(x, y):
    return x + y

Running Celery as

celery worker --app=proj -l info --pool=eventlet -c 4
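
For reference, with the rpc:// backend the worker sends each result as a message to a reply queue addressed to the client that submitted the task, so only that submitting client can fetch it; another process cannot simply read the result from a shared queue. A minimal sketch of how the submitting client would retrieve a result (not part of the question; it assumes the worker above and a local RabbitMQ broker are running):

caller.py

from proj.tasks import add

# Send the task; the worker publishes the return value to a reply
# queue addressed back to this client process.
result = add.delay(4, 4)

# Block until the reply arrives (or the timeout expires).
print(result.get(timeout=10))  # prints 8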

Answer 1:


Solved by using Pika (a Python implementation of the AMQP 0-9-1 protocol, https://pika.readthedocs.org) to post results back to the celeryresults channel.
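
A minimal sketch of what that can look like with Pika is shown below (the broker address is an assumption; the celeryresults queue name is taken from the answer):

publish_result.py

import json

import pika

# Connect to the local RabbitMQ broker (assumed address).
connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()

# Declare a durable queue so published results survive a broker restart.
channel.queue_declare(queue='celeryresults', durable=True)

# Publish the task result as a persistent JSON message.
channel.basic_publish(
    exchange='',
    routing_key='celeryresults',
    body=json.dumps({'task': 'add', 'result': 8}),
    properties=pika.BasicProperties(delivery_mode=2),
)

connection.close()

The other process can then consume from celeryresults with channel.basic_get or channel.basic_consume.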



Source: https://stackoverflow.com/questions/34541282/celery-task-results-not-persisted-with-rpc
