How to mark an Airflow DAG run as failed if any task fails?

2021-02-19 15:48

Is it possible to make an Airflow DAG fail if any task fails?

I usually have some cleanup tasks at the end of a DAG, and as it is now, whenever the last cleanup task succeeds the whole DAG run is marked as a success, even if an earlier task failed.

2 Answers
  • 2021-02-19 16:13

    Another solution is to add a final PythonOperator that checks the status of all tasks in this run:

    from airflow.operators.python_operator import PythonOperator  # airflow.operators.python on 2.x
    from airflow.utils.state import State
    from airflow.utils.trigger_rule import TriggerRule

    def final_status(**kwargs):
        # Inspect every task instance in this DAG run; fail the run if any
        # task other than this one did not finish with SUCCESS.
        for task_instance in kwargs['dag_run'].get_task_instances():
            if task_instance.current_state() != State.SUCCESS and \
                    task_instance.task_id != kwargs['task_instance'].task_id:
                raise Exception("Task {} failed. Failing this DAG run".format(task_instance.task_id))

    final_status = PythonOperator(
        task_id='final_status',
        provide_context=True,  # required on Airflow 1.10.x; 2.x passes context automatically
        python_callable=final_status,
        trigger_rule=TriggerRule.ALL_DONE,  # ensures this task runs even if upstream tasks fail
        dag=dag,
    )
    
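    For this to work, final_status must be downstream of every other task in the DAG; otherwise some task instances may not have reached a final state when it checks them. A minimal wiring sketch, where the upstream tasks work and cleanup are hypothetical:

    from airflow.operators.bash_operator import BashOperator  # airflow.operators.bash on 2.x

    # Hypothetical tasks; ids and commands are illustrative only
    work = BashOperator(task_id='work', bash_command='echo work', dag=dag)
    cleanup = BashOperator(
        task_id='cleanup', bash_command='echo cleanup',
        trigger_rule=TriggerRule.ALL_DONE,  # cleanup runs even if work fails
        dag=dag,
    )

    work >> cleanup >> final_status
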
  • 2021-02-19 16:32

    I'm facing a similar problem. It is not a bug, but it would be a nice feature to add this property to the DAG.

    As a workaround, you can push an XCom variable during the task that is allowed to fail, and in the downstream tasks do something like:

    if ti.xcom_pull(key='state', task_ids=task_allowed_to_fail_id) == 'FAILED':
        raise ValueError('Force failure because upstream task has failed')
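
    A more complete sketch of that pattern, assuming Airflow 2.x; the task ids task_allowed_to_fail and check_upstream, and the do_work helper, are hypothetical:

    from airflow.operators.python import PythonOperator

    def work_allowed_to_fail(ti, **kwargs):
        # Hypothetical task body: catch the error, record the outcome in
        # XCom, and let the task itself succeed so the DAG continues.
        try:
            do_work()  # hypothetical helper that may raise
            ti.xcom_push(key='state', value='SUCCESS')
        except Exception:
            ti.xcom_push(key='state', value='FAILED')

    def check_upstream(ti, **kwargs):
        # Fail this task (and hence the DAG run) if the upstream
        # task recorded a failure in XCom.
        if ti.xcom_pull(key='state', task_ids='task_allowed_to_fail') == 'FAILED':
            raise ValueError('Force failure because upstream task has failed')

    allowed_to_fail = PythonOperator(
        task_id='task_allowed_to_fail', python_callable=work_allowed_to_fail, dag=dag)
    check = PythonOperator(
        task_id='check_upstream', python_callable=check_upstream, dag=dag)
    allowed_to_fail >> check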
