How to create a conditional task in Airflow

萌比男神i 2020-11-28 23:43

I would like to create a conditional task in Airflow as described in the schema below. The expected scenario is the following:

  • Task 1 executes
  • If Task 1 succeeds, then execute Task 2a
  • Else, if Task 1 fails, then execute Task 2b
  • Finally, execute Task 3
2 Answers
  •  旧时难觅i
    2020-11-29 00:02

    You have to use Airflow trigger rules.

    All operators have a trigger_rule argument which defines the rule by which the generated task gets triggered.

    The trigger rule possibilities:

    ALL_SUCCESS = 'all_success'
    ALL_FAILED = 'all_failed'
    ALL_DONE = 'all_done'
    ONE_SUCCESS = 'one_success'
    ONE_FAILED = 'one_failed'
    DUMMY = 'dummy'
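
    A trigger rule is set per task, as a plain keyword argument on the operator. As a minimal sketch (assuming a `dag` object already exists; the cleanup task here is purely illustrative), a task that should fire as soon as any of its upstream tasks fails could look like this:

    from airflow.operators.dummy_operator import DummyOperator
    from airflow.utils.trigger_rule import TriggerRule

    cleanup = DummyOperator(
            task_id='cleanup',
            trigger_rule=TriggerRule.ONE_FAILED,  # fire if at least one upstream task failed
            dag=dag)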
    

    Here is the idea to solve your problem:

    from airflow.operators.ssh_execute_operator import SSHExecuteOperator
    from airflow.utils.trigger_rule import TriggerRule
    from airflow.contrib.hooks import SSHHook
    
    # assumes a `dag` object is defined elsewhere and that an SSH
    # connection has been created in the Airflow UI
    sshHook = SSHHook(conn_id='<your connection id>')
    
    task_1 = SSHExecuteOperator(
            task_id='task_1',
            bash_command='<your command>',
            ssh_hook=sshHook,
            dag=dag)
    
    task_2 = SSHExecuteOperator(
            task_id='conditional_task',
            bash_command='<your command>',
            ssh_hook=sshHook,
            dag=dag)
    
    # runs only if the conditional task succeeded
    task_2a = SSHExecuteOperator(
            task_id='task_2a',
            bash_command='<your command>',
            trigger_rule=TriggerRule.ALL_SUCCESS,
            ssh_hook=sshHook,
            dag=dag)
    
    # runs only if the conditional task failed
    task_2b = SSHExecuteOperator(
            task_id='task_2b',
            bash_command='<your command>',
            trigger_rule=TriggerRule.ALL_FAILED,
            ssh_hook=sshHook,
            dag=dag)
    
    # runs once either branch has succeeded
    task_3 = SSHExecuteOperator(
            task_id='task_3',
            bash_command='<your command>',
            trigger_rule=TriggerRule.ONE_SUCCESS,
            ssh_hook=sshHook,
            dag=dag)
    
    
    task_2.set_upstream(task_1)
    task_2a.set_upstream(task_2)
    task_2b.set_upstream(task_2)
    task_3.set_upstream(task_2a)
    task_3.set_upstream(task_2b)
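
    As a sketch of an equivalent alternative, newer Airflow versions let you declare the same dependencies with bitshift composition, which reads closer to the shape of the DAG:

    task_1 >> task_2
    task_2 >> task_2a
    task_2 >> task_2b
    task_2a >> task_3
    task_2b >> task_3

    Only one of task_2a / task_2b will actually succeed (depending on whether the conditional task succeeded or failed), so TriggerRule.ONE_SUCCESS on task_3 lets it run as soon as whichever branch ran completes successfully.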
    
