Question
You can use Celery to call a task by name that is registered in a different process (or even on a different machine):
celery.send_task(task_name, args=args, kwargs=kwargs)
(http://celery.readthedocs.org/en/latest/reference/celery.html#celery.Celery.send_task)
I would now like to add a callback that is executed as soon as the task finishes, and that runs within the process that called the task.
My Setup
I have a server A that runs a Django-powered website, using a basic Celery setup as described here. I don't run a Celery worker on server A.
Then there is server B, which runs (several) Celery workers.
So far, this setup seems to work pretty well. I can send tasks from server A and they get executed on the remote server B.
The Problem
The only problem is that I'm not able to add a callback function.
The docs say you can add a callback by providing a follow-up task, so I could do something like this:
@celery.task
def result_handler(result):
    print("YEAH")

celery.send_task(task_name, args=args, kwargs=kwargs, link=result_handler.s())
This, however, means I have to start a worker on server A that registers the task result_handler. And even if I do that, the handler will be called in the process spawned by the worker, not in the Django process that called the task.
The only solution I was able to come up with is an endless loop that checks every 2 seconds whether the task is ready, but I think there should be a simpler solution.
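For reference, the polling workaround I mean can be sketched as a small helper that wraps the AsyncResult returned by send_task. The helper name and the timeout parameter are my own invention, not Celery API; it only relies on AsyncResult exposing ready() and get():

```python
import time


def wait_for_result(async_result, poll_interval=2.0, timeout=None):
    """Poll anything with .ready() and .get() (e.g. a Celery AsyncResult)
    until the task finishes, then return its result in *this* process.

    Raises TimeoutError if the task does not finish within `timeout` seconds.
    """
    waited = 0.0
    while not async_result.ready():
        if timeout is not None and waited >= timeout:
            raise TimeoutError("task did not finish in time")
        time.sleep(poll_interval)
        waited += poll_interval
    return async_result.get()
```

Used with send_task it would look something like this (task name is just an example):

```python
result = celery.send_task("tasks.add", args=(2, 3))
value = wait_for_result(result, poll_interval=2.0)
print("YEAH", value)  # runs in the calling (Django) process
```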
Source: https://stackoverflow.com/questions/24367254/celery-how-to-add-a-callback-function-when-calling-a-remote-task-with-send-tas