Question
I'm new to Airflow and Celery. I have finished writing my DAG, but now I want to run its tasks on two computers that are in the same subnet. I would like to know how to modify airflow.cfg for this; some examples would be appreciated. Thanks for any answers.
Answer 1:
The Airflow documentation covers this quite nicely:
First, you will need a Celery backend. This can be, for example, Redis or RabbitMQ. Then, the executor parameter in your airflow.cfg should be set to CeleryExecutor.
Next, in the celery section of airflow.cfg, set broker_url to point to your Celery backend (e.g. redis://your_redis_host:your_redis_port/1), and point celery_result_backend to a SQL database (you can use the same database as your main Airflow DB).
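For illustration, a minimal sketch of the relevant airflow.cfg entries might look like the following. The host names, ports, and database credentials here are placeholders, not values from the question:

    [core]
    # Use the Celery executor so tasks can be distributed to remote workers
    executor = CeleryExecutor

    [celery]
    # Point the broker at your Redis (or RabbitMQ) instance
    broker_url = redis://your_redis_host:6379/1
    # Store task results in a SQL database; the main Airflow metadata DB works too
    celery_result_backend = db+postgresql://airflow_user:airflow_pass@your_db_host:5432/airflow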
Then, on your worker machines simply kick off airflow worker and your jobs should start on the two machines.
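As a usage note, on each worker machine (assuming an Airflow 1.x release, as in the original answer) starting a worker is just:

    airflow worker

In Airflow 2.x the equivalent command is airflow celery worker. As long as both machines share the same broker_url and result backend, the scheduler will queue tasks through the broker and either worker can pick them up.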
Source: https://stackoverflow.com/questions/45129192/how-to-use-airflow-with-celery