Question
Hi, I have installed Airflow on Docker. What I see now is that a callback set via default_args doesn't work: when a job fails, the function mentioned there is never called. I have to add the callback to each and every task, and then it works. Do you recognise this issue? What is the solution?
default_args['on_failure_callback'] = slack_failed_task_callback
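For context, the callback follows the usual failure-callback signature and takes the Airflow context dict. A minimal sketch (the real function posts to Slack; the body here is a stand-in):

def slack_failed_task_callback(context):
    # Airflow invokes failure callbacks with the task context dict
    failed_task = context['task_instance'].task_id
    print('Task failed: %s' % failed_task)  # the real callback posts this message to Slack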
Further, I noticed that Bash environment variables I have set are not inherited by the PythonOperator (PythonOperator is derived from BashOperator, if I am not mistaken).
import os
from datetime import datetime, timedelta

bashEnv = {
    'EXECUTION_DATE': "{{ ds }}"  # a Jinja template; only rendered in templated fields
}
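For comparison, the same dict behaves as expected on a BashOperator, since env is one of its templated fields. A minimal sketch, assuming the dag object defined in this file:

from airflow.operators.bash_operator import BashOperator

run_bash_task = BashOperator(
    task_id='run_bash',
    bash_command='echo $EXECUTION_DATE',  # prints the rendered value of {{ ds }}
    env=bashEnv,
    dag=dag)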
default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2015, 6, 1),
    'email': ['datalabs@datlinq.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 0,
    'retry_delay': timedelta(minutes=5),
    'concurrency': 16,           # DAG-level setting, not a task argument
    'max_active_runs': 1,        # DAG-level setting, not a task argument
    'env': bashEnv,              # only consumed by BashOperator, not PythonOperator
    'provide_context': True,     # PythonOperator argument
    'xcom_push': True            # BashOperator argument
}
Python operator function
def dummy_func(**kwargs):
    # print(dag_run['dag_id'], dag_run['conf'])
    print(os.environ['EXECUTION_DATE'])  # raises KeyError: the variable is never exported to this process
    return None
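For contrast, reading the execution date through the task context does work, because provide_context=True passes ds (among other keys) into the callable. A minimal sketch:

def dummy_func_ctx(**kwargs):
    # 'ds' arrives via the Airflow task context, not the OS environment
    print(kwargs['ds'])
    return None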
Python operator
from airflow.operators.python_operator import PythonOperator

run_python_task = PythonOperator(
    task_id='run_python',
    execution_timeout=timedelta(minutes=30),
    python_callable=dummy_func,
    provide_context=True,
    on_failure_callback=slack_failed_task_callback,  # works when attached per task
    xcom_push=True,
    dag=dag)
Is the behaviour of Airflow on Docker different?
Source: https://stackoverflow.com/questions/60351512/billing-on-bigquery