Make a Celery task that waits for a signal?

Posted by 强颜欢笑 on 2021-01-27 12:55:16

Question


Is it possible to create Celery task that just waits for a signal? I have this scenario:

  • Scrapyd in one virtualenv on remote machine A
  • Django project with Celery worker node in another virtualenv on remote machine A
  • The same Django project with Celery, but in another virtualenv on local machine B

How I use this setup:

  1. I would send a task chain from Django on machine B
  2. Let the task chain be consumed by the worker node on machine A.
  3. In the first subtask of the task chain, I would schedule a crawl using Scrapyd's JSON over HTTP API, and pass the Celery task ID to the crawler as an HTTP request parameter.
  4. I then want this first subtask to just wait for some kind of signal.
  5. Scrapyd does its thing and runs the spider.
  6. Once the spider is done crawling, I want it to send a signal, maybe by JSON over HTTP or by a Django management command, to the subtask that has been waiting for the signal.
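For step 3, the crawl can be scheduled through Scrapyd's `schedule.json` endpoint, with the Celery task ID passed along as an extra spider argument. A minimal sketch of building that request (the endpoint URL, project/spider names, and the `celery_task_id` argument name are all assumptions for illustration):

```python
from urllib.parse import urlencode
from urllib.request import Request

# Assumption: Scrapyd running on machine A with its default port.
SCRAPYD_URL = "http://machine-a:6800/schedule.json"

def build_schedule_request(project, spider, celery_task_id):
    """Build the POST request that schedules a crawl and forwards
    the Celery task ID to the spider as a request parameter."""
    payload = urlencode({
        "project": project,
        "spider": spider,
        # Hypothetical argument name; Scrapyd passes extra POST
        # fields to the spider as keyword arguments.
        "celery_task_id": celery_task_id,
    }).encode()
    return Request(SCRAPYD_URL, data=payload, method="POST")
```

The spider would then receive `celery_task_id` in its constructor and use it later to address the waiting subtask.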

Is this doable?

I would just need code snippets to show me how to wait for a signal in a subtask, and how to restore a task from the task ID and send a signal to it.
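Celery has no built-in primitive for a task to block on an external signal, so a common workaround is to poll a shared key/value store (Redis, Django's cache, etc.) keyed by the task ID. A minimal sketch, with a plain dict standing in for the shared store (the key format, poll interval, and timeout are assumptions):

```python
import time

# Stand-in for a cross-process store such as Redis or Django's cache.
signals = {}

def send_signal(task_id):
    """Called on the spider's behalf (e.g. from an HTTP view or a
    Django management command) once crawling has finished."""
    signals[f"crawl-done-{task_id}"] = True

def wait_for_signal(task_id, poll_interval=1.0, timeout=600.0):
    """Block inside the Celery subtask until the spider signals
    completion, or raise TimeoutError after `timeout` seconds."""
    key = f"crawl-done-{task_id}"
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if signals.pop(key, False):
            return True
        time.sleep(poll_interval)
    raise TimeoutError(f"no signal received for task {task_id}")
```

Inside the subtask, `self.request.id` (with `bind=True`) supplies the task ID to wait on, so there is no need to "restore" the task object on the signaling side: the sender only needs the ID string to write the matching key.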

Source: https://stackoverflow.com/questions/20061858/make-a-celery-task-that-waits-for-a-signal
