Add n tasks to celery queue and wait for the results

Posted by 社会主义新天地 on 2019-11-27 01:32:51

Question


I would like to add multiple tasks to the celery queue and wait for the results. I have various ideas for how I could achieve this using some form of shared storage (memcached, redis, a db, etc.); however, I would have thought this is something Celery can handle automatically, but I can't find any resources online.

Code example

def do_tasks(b):
    for a in b:
        c.delay(a)

    return c.all_results_some_how()

Answer 1:


For Celery >= 3.0, TaskSet is deprecated in favour of group.

from celery import group
from tasks import add

job = group([
    add.s(2, 2),
    add.s(4, 4),
    add.s(8, 8),
    add.s(16, 16),
    add.s(32, 32),
])

Start the group in the background:

result = job.apply_async()

Wait:

result.join()
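join() blocks until every task in the group has finished and returns the results in the order the signatures were listed. As a local sketch (calling the task body eagerly, without a worker or broker), the group above computes:

```python
def add(x, y):
    # body of the hypothetical `add` task imported from tasks.py
    return x + y

# the same argument pairs passed to add.s(...) above
pairs = [(2, 2), (4, 4), (8, 8), (16, 16), (32, 32)]

# join() would return these results, in submission order
results = [add(x, y) for x, y in pairs]
print(results)  # [4, 8, 16, 32, 64]
```

Note that join() is a blocking call; a timeout can be passed (e.g. `result.join(timeout=10)`) if you don't want to wait indefinitely.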



Answer 2:


Task.delay returns an AsyncResult. Use AsyncResult.get to get the result of each task.

To do that, you need to keep references to the tasks.

def do_tasks(b):
    tasks = []
    for a in b:
        tasks.append(c.delay(a))
    return [t.get() for t in tasks]
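The same keep-references-then-collect pattern can be sketched locally with the standard library's concurrent.futures (a plain-Python analogy, not Celery's API; `square` here is a stand-in task):

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    # stand-in for a Celery task body
    return x * x

def do_tasks(values):
    with ThreadPoolExecutor() as pool:
        # submit() returns a Future, much like delay() returns an AsyncResult
        futures = [pool.submit(square, v) for v in values]
        # result() blocks until each task finishes, like AsyncResult.get()
        return [f.result() for f in futures]

print(do_tasks([1, 2, 3]))  # [1, 4, 9]
```

As with the Celery version, results come back in the order the tasks were submitted, not the order they finished.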

Or you can use ResultSet:

UPDATE: ResultSet is deprecated, please see @laffuste's answer.

from celery.result import ResultSet

def do_tasks(b):
    rs = ResultSet([])
    for a in b:
        rs.add(c.delay(a))
    return rs.get()



Answer 3:


I have a hunch you are not really after delay itself but rather Celery's async feature.

I think you really want a TaskSet (note: TaskSet was deprecated in Celery 3.0 in favour of group, as described in answer 1, and removed in later releases):

from celery.task.sets import TaskSet
from someapp.tasks import sometask

def do_tasks(b):
    job = TaskSet([sometask.subtask((a,)) for a in b])
    result = job.apply_async()
    # might want to handle result.successful() == False
    return result.join()


Source: https://stackoverflow.com/questions/26686850/add-n-tasks-to-celery-queue-and-wait-for-the-results
