Airflow kills my tasks after 1 minute

Submitted by 只谈情不闲聊 on 2019-12-23 09:57:15

Question


I have a very simple DAG with two tasks, like following:

import datetime as dt

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

default_args = {
    'owner': 'me',
    'start_date': dt.datetime.today(),
    'retries': 0,
    'retry_delay': dt.timedelta(minutes=1)
}

dag = DAG(
    'test DAG',
    default_args=default_args,
    schedule_interval=None
)

t0 = PythonOperator(
    task_id="task 1",
    python_callable=run_task_1,
    op_args=[arg_1, args_2, args_3],
    dag=dag,
    execution_timeout=dt.timedelta(minutes=60)
)

t1 = PythonOperator(
    task_id="task 2",
    python_callable=run_task_2,
    dag=dag,
    execution_timeout=dt.timedelta(minutes=60)
)

t1.set_upstream(t0)

However, when I run it, I see the following in the logs:

[2017-10-17 16:18:35,519] {jobs.py:2083} INFO - Task exited with return code -9

There are no other useful error logs. Has anyone seen this before? Did I define my DAG wrongly? Any help appreciated!


Answer 1:


If the container running the task doesn't have enough memory, the task process gets killed and exits with return code -9 (i.e. SIGKILL, typically sent by the Linux OOM killer). See https://www.astronomer.io/guides/dag-best-practices/
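A negative return code from a subprocess means the process died from that signal; -9 corresponds to SIGKILL, which is what the OOM killer sends. A minimal sketch (not Airflow-specific) showing how a SIGKILL-ed child surfaces as return code -9:

```python
import signal
import subprocess
import sys

# Spawn a child process that kills itself with SIGKILL, mimicking what
# the OOM killer does to a memory-starved task process.
proc = subprocess.run(
    [sys.executable, "-c", "import os, signal; os.kill(os.getpid(), signal.SIGKILL)"]
)

# On POSIX, death-by-signal is reported as a negative return code.
print(proc.returncode)  # -9
```

This is why the Airflow log line "Task exited with return code -9" usually points at memory exhaustion rather than a bug in the DAG definition.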




Answer 2:


Which version of Airflow are you using?
Since 1.8, the scheduler is less forgiving about a dynamic start_date: https://github.com/apache/incubator-airflow/blob/master/UPDATING.md#less-forgiving-scheduler-on-dynamic-start_date.
Try giving it a specific, fixed date instead.
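For example, the dynamic start_date in the question's default_args could be replaced with a fixed datetime (the particular date below is illustrative):

```python
import datetime as dt

default_args = {
    'owner': 'me',
    # A fixed date instead of dt.datetime.today(), which re-evaluates
    # every time the DAG file is parsed and confuses the scheduler.
    'start_date': dt.datetime(2017, 10, 1),
    'retries': 0,
    'retry_delay': dt.timedelta(minutes=1),
}
```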



Source: https://stackoverflow.com/questions/46794860/airflow-kills-my-tasks-after-1-minute
