How to mark an Airflow DAG run as failed if any task fails?

Submitted by 喜夏-厌秋 on 2019-12-12 10:36:39

Question


Is it possible to make an Airflow DAG fail if any task fails?

I usually have some cleanup tasks at the end of a DAG, and as it is now, whenever the last task succeeds the whole DAG run is marked as a success.


Answer 1:


Another solution is to add a final PythonOperator that checks the state of all the other tasks in this run:

from airflow.operators.python_operator import PythonOperator
from airflow.utils.state import State
from airflow.utils.trigger_rule import TriggerRule

def final_status(**kwargs):
    # Fail this task, and with it the DAG run, if any other task
    # in the current run did not succeed.
    for task_instance in kwargs['dag_run'].get_task_instances():
        if task_instance.current_state() != State.SUCCESS and \
                task_instance.task_id != kwargs['task_instance'].task_id:
            raise Exception("Task {} failed. Failing this DAG run".format(task_instance.task_id))

final_status_op = PythonOperator(
    task_id='final_status',
    provide_context=True,
    python_callable=final_status,  # the callable must be defined before the operator
    trigger_rule=TriggerRule.ALL_DONE,  # ensures this task runs even if upstream fails
    dag=dag,
)
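
For the check to see every task, final_status must be the last task in the DAG, downstream of everything else including the cleanup tasks. A minimal wiring sketch, assuming two hypothetical tasks named extract and cleanup in the same DAG:

from airflow.operators.dummy_operator import DummyOperator

extract = DummyOperator(task_id='extract', dag=dag)
cleanup = DummyOperator(task_id='cleanup', trigger_rule=TriggerRule.ALL_DONE, dag=dag)

# final_status runs once everything else, including cleanup, is done
extract >> cleanup >> final_status_op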



Answer 2:


I'm facing a similar problem. It is not a bug, but it would be a nice feature to add this as a property on the DAG.

As a workaround, you can push an XCom variable from the task that is allowed to fail, and then in the downstream tasks do something like:

if ti.xcom_pull(key='state', task_ids=task_allowed_to_fail_id) == 'FAILED':
    raise ValueError('Force failure because upstream task has failed')
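
For completeness, a sketch of both sides of this pattern, using hypothetical task IDs may_fail and check_upstream: the task that is allowed to fail catches its own error and records the outcome in XCom instead of failing outright, and the downstream task pulls that value and fails the run.

def may_fail(**kwargs):
    try:
        run_fragile_step()  # hypothetical function doing the real work
        kwargs['ti'].xcom_push(key='state', value='SUCCESS')
    except Exception:
        # Record the failure for downstream tasks instead of raising
        kwargs['ti'].xcom_push(key='state', value='FAILED')

def check_upstream(**kwargs):
    ti = kwargs['ti']
    if ti.xcom_pull(key='state', task_ids='may_fail') == 'FAILED':
        raise ValueError('Force failure because upstream task has failed')

Each callable would be wrapped in a PythonOperator with provide_context=True, with check_upstream set downstream of may_fail.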



Source: https://stackoverflow.com/questions/50055582/how-to-mark-an-airflow-dag-run-as-failed-if-any-task-fails
