Python+Celery: Chaining jobs?

刺人心 2020-12-07 18:07

The Celery documentation suggests that it's a bad idea to have tasks wait on the results of other tasks… But the suggested solution (see the "good" heading) leaves something to be desired.

1 Answer
  • 2020-12-07 18:20

    You can do it with a celery chain. See https://celery.readthedocs.org/en/latest/userguide/canvas.html#chains

    import time
    from celery import task  # legacy decorator; newer Celery uses @app.task (see the sketch below)

    @task()
    def add(a, b):
        time.sleep(5)  # simulate long-running work
        return a + b
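
    For newer Celery versions, tasks are registered on an application instance rather than with the bare @task() decorator. A minimal sketch of the same task in that style, assuming a local Redis broker and result backend (the URLs are an assumption, not part of the original answer):

    import time

    from celery import Celery

    # hypothetical broker/backend; replace with your own configuration
    app = Celery("tasks",
                 broker="redis://localhost:6379/0",
                 backend="redis://localhost:6379/0")

    @app.task
    def add(a, b):
        time.sleep(5)  # simulate long-running work
        return a + b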
    

    Chaining jobs:

    # import chain
    from celery import chain

    # the result of the first add job will be 
    # the first argument of the second add job
    ret = chain(add.s(1, 2), add.s(3)).apply_async()
    
    # another way to express a chain using pipes
    ret2 = (add.s(1, 2) | add.s(3)).apply_async()
    
    ...
    
    # check ret status to get result
    if ret.status == 'SUCCESS':
        print("result:", ret.get())
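
    Note that ret refers to the last task in the chain, so ret.get() returns the final result and blocks until the whole chain has finished. A minimal sketch of a longer chain and of an immutable signature, using the same add task and assuming a running worker plus a configured result backend (not shown in the original answer):

    from celery import chain

    # a three-step chain: ((1 + 2) + 3) + 4; each result is prepended
    # to the arguments of the next add call
    ret = chain(add.s(1, 2), add.s(3), add.s(4)).apply_async()
    print(ret.get(timeout=60))  # -> 10

    # an immutable signature (.si) ignores the previous result instead of
    # prepending it, so the second step computes 10 + 20, not 3 + 10 + 20
    ret2 = chain(add.s(1, 2), add.si(10, 20)).apply_async()
    print(ret2.get(timeout=60))  # -> 30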
    