Airflow + celery or dask. For what, when?

Submitted on 2019-12-04 17:44:17

Question


I read in the official Airflow documentation the following:

What does this mean exactly? What do the authors mean by scaling out? That is, when is it not enough to use Airflow on its own, and when would anyone use Airflow in combination with something like Celery? (The same question applies to Dask.)


Answer 1:


In Airflow terminology, an "Executor" is the component responsible for running your tasks. The LocalExecutor does this by spawning subprocesses on the machine Airflow runs on and letting each subprocess execute a task.
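For reference, the executor is selected in `airflow.cfg`. A minimal sketch; the `parallelism` value here is an arbitrary example, not a recommendation:

```ini
[core]
# Run each task as a subprocess on the machine the scheduler runs on
executor = LocalExecutor
# Upper bound on task instances running concurrently (example value)
parallelism = 16
```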

Naturally your capacity is then limited by the available resources on the local machine. The CeleryExecutor distributes the load to several machines. The executor itself publishes a request to execute a task to a queue, and one of several worker nodes picks up the request and executes it. You can now scale the cluster of worker nodes to increase overall capacity.
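Switching to the CeleryExecutor means pointing Airflow at a message broker and a result backend. A sketch, assuming a local Redis broker and a local Postgres database (both URLs are placeholder assumptions):

```ini
[core]
executor = CeleryExecutor

[celery]
# Queue the executor publishes task requests to (assumed local Redis)
broker_url = redis://localhost:6379/0
# Where workers record task state (assumed local Postgres)
result_backend = db+postgresql://airflow:airflow@localhost/airflow
```

Each additional worker machine then runs `airflow worker` (the Airflow 1.x CLI command) against the same broker, so overall capacity grows by simply adding nodes.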

Finally, there's a KubernetesExecutor in the works (link), not ready yet, which will run tasks on a Kubernetes cluster. This will not only give your tasks complete isolation, since they run in containers; you can also leverage existing Kubernetes capabilities, for instance autoscaling your cluster so that you always have an optimal amount of resources available.
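Once released, selecting it should look much like the other executors, plus a section describing the pods tasks run in. A sketch under that assumption; the image name is hypothetical:

```ini
[core]
executor = KubernetesExecutor

[kubernetes]
# Image each per-task pod is launched from (hypothetical image name)
worker_container_repository = my-registry/airflow
worker_container_tag = latest
```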




Answer 2:


You may enjoy reading this comparison of Dask to Celery/Airflow task managers: http://matthewrocklin.com/blog/work/2016/09/13/dask-and-celery

Since you are not asking a specific question, general reading like that should be informative, and maybe you can clarify what you are after.



Source: https://stackoverflow.com/questions/49310136/airflow-celery-or-dask-for-what-when
