How to dynamically create subdags in Airflow

Submitted by ♀尐吖头ヾ on 2020-02-20 10:52:25

Question


I have a main DAG which retrieves a file and splits the data in this file into separate CSV files. There is another set of tasks that must be done for each of these CSV files, e.g. uploading to GCS and inserting into BigQuery. How can I generate a SubDAG for each file dynamically, based on the number of files? (The SubDAG would define tasks such as uploading to GCS, inserting into BigQuery, and deleting the CSV file.)

So right now, this is what it looks like:

main_dag = DAG(....)
download_operator = SFTPOperator(dag = main_dag, ...)  # downloads file
transform_operator = PythonOperator(dag = main_dag, ...) # Splits data and writes csv files

def subdag_factory(): # Will return a subdag with tasks for uploading to GCS, inserting to BigQuery.
    ...
    ...

How can I call the subdag_factory for each file generated in transform_operator?


Answer 1:


I tried creating subdags dynamically as follows:

# imports assumed for this snippet (Airflow 1.x module paths)
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.operators.subdag_operator import SubDagOperator
import airflow.utils.helpers

# create and return a DAG
def create_subdag(dag_parent, dag_id_child_prefix, db_name):
    # dag params
    dag_id_child = '%s.%s' % (dag_parent.dag_id, dag_id_child_prefix + db_name)
    default_args_copy = default_args.copy()

    # dag
    dag = DAG(dag_id=dag_id_child,
              default_args=default_args_copy,
              schedule_interval='@once')

    # operators
    tid_check = 'check2_db_' + db_name
    py_op_check = PythonOperator(task_id=tid_check, dag=dag,
                                 python_callable=check_sync_enabled,
                                 op_args=[db_name])

    tid_spark = 'spark2_submit_' + db_name
    py_op_spark = PythonOperator(task_id=tid_spark, dag=dag,
                                 python_callable=spark_submit,
                                 op_args=[db_name])

    py_op_check >> py_op_spark
    return dag

# wrap DAG into SubDagOperator
def create_subdag_operator(dag_parent, db_name):
    tid_subdag = 'subdag_' + db_name
    subdag = create_subdag(dag_parent, tid_prefix_subdag, db_name)
    sd_op = SubDagOperator(task_id=tid_subdag, dag=dag_parent, subdag=subdag)
    return sd_op

# create SubDagOperator for each db in db_names
def create_all_subdag_operators(dag_parent, db_names):
    subdags = [create_subdag_operator(dag_parent, db_name) for db_name in db_names]
    # chain subdag-operators together
    airflow.utils.helpers.chain(*subdags)
    return subdags


# (top-level) DAG & operators
dag = DAG(dag_id=dag_id_parent,
          default_args=default_args,
          schedule_interval=None)

subdag_ops = create_all_subdag_operators(dag, db_names)

Note that the list of inputs for which subdags are created (here, db_names) can either be declared statically in the Python file or read from an external source.
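As a minimal sketch of the "external source" option (the config file name and JSON layout here are hypothetical, not part of the answer), db_names could be loaded like this at DAG-definition time:

```python
import json
import os
import tempfile

# Hypothetical external source: a JSON config file. In a real deployment this
# might live next to the DAG file or in an Airflow Variable.
config_path = os.path.join(tempfile.gettempdir(), "db_names.json")
with open(config_path, "w") as f:
    json.dump({"db_names": ["sales", "inventory", "users"]}, f)

def load_db_names(path):
    """Return the list of database names the SubDagOperators are built from."""
    with open(path) as f:
        return json.load(f)["db_names"]

db_names = load_db_names(config_path)
print(db_names)  # each entry yields one SubDagOperator
```

Since the scheduler re-parses the DAG file regularly, changes to the config file are picked up without editing the DAG code; the structure only changes when the list itself changes.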

The resulting DAG looks like this:

(screenshot of the top-level DAG omitted)

Diving into the SubDAG(s):

(screenshot of a SubDAG's tasks omitted)




Answer 2:


Airflow deals with DAGs in two different ways.

  1. One way is to define your DAG in a Python file and put it into the dags_folder. The fewer changes to the structure of the DAG, the better. At first I did this wrong: I generated a DAG for each file in a directory, so on every heartbeat of the Airflow scheduler my code went through all the files and regenerated the DAGs. Pros: not too many. Cons: a lot. If the file that a DAG corresponded to was deleted, you had no control over that DAG; you couldn't see its steps, you couldn't restart it, and so on.

  2. So at some point I came up with another solution: static DAGs (they are still dynamic in that a script generates them, but their structure and IDs do not change). Instead of one script that walks a directory and generates DAGs, you define two static DAGs: one monitors the directory periodically (e.g. on a */10 * * * * schedule), the other is triggered by the first. When a new file (or files) appears, the first DAG triggers the second one with a conf argument. The following code has to be executed for every file in the directory:

from datetime import datetime
import logging

from airflow import settings
from airflow.models import DagRun

# dag_to_be_triggered, uuid_run_id and path_to_the_file come from the
# monitoring DAG's context
session = settings.Session()
dr = DagRun(
    dag_id=dag_to_be_triggered,
    run_id=uuid_run_id,
    conf={'file_path': path_to_the_file},
    execution_date=datetime.now(),
    start_date=datetime.now(),
    external_trigger=True)
logging.info("Creating DagRun {}".format(dr))
session.add(dr)
session.commit()
session.close()

The triggered DAG can receive the conf argument and finish all the required tasks for the particular file. To access the conf parameter, use this:

    def work_with_the_file(**context):
        path_to_file = context['dag_run'].conf['file_path'] \
            if 'file_path' in context['dag_run'].conf else None

        if not path_to_file:
            raise Exception('path_to_file must be provided')
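Since dag_run.conf is an ordinary dict, the same lookup can be written and exercised in isolation, without a live Airflow context; dict.get makes the membership check implicit (the helper name extract_file_path is illustrative, not from the answer):

```python
def extract_file_path(conf):
    """Return the 'file_path' entry from a dag_run.conf dict, or raise.

    conf is a plain dict, exactly what dag_run.conf holds at runtime.
    """
    path_to_file = conf.get('file_path')
    if not path_to_file:
        raise ValueError('file_path must be provided')
    return path_to_file

print(extract_file_path({'file_path': '/data/incoming/a.csv'}))
```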

Pros: all the flexibility and functionality of Airflow.

Cons: the monitor DAG can be spammy.



Source: https://stackoverflow.com/questions/48947867/how-to-dynamically-create-subdags-in-airflow
