Billing on bigquery

Submitted by 霸气de小男生 on 2020-03-05 03:24:11

Question


Hi, I have installed Airflow on Docker. What I see now is that the callback set in the defaults doesn't work: when a job fails, it doesn't call the function I specified. I have to add the callback to each and every task, and then it works. Do you recognise this issue? What is the solution?

default_args['on_failure_callback'] = slack_failed_task_callback
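For context, Airflow is expected to merge DAG-level `default_args` into each task's keyword arguments, with explicit task arguments taking precedence. A stdlib-only sketch of that inheritance mechanism (the `merge_task_args` helper and the callback body are illustrative, not Airflow API):

```python
# Sketch of how default_args propagate to tasks: task-level kwargs
# override DAG-level defaults; anything not set on the task is inherited.
def merge_task_args(default_args, task_args):
    merged = dict(default_args)
    merged.update(task_args)
    return merged

def slack_failed_task_callback(context):
    # Hypothetical failure callback; Airflow calls it with the task context.
    print("task failed:", context.get("task_id"))

default_args = {"on_failure_callback": slack_failed_task_callback}
task_args = merge_task_args(default_args, {"task_id": "run_python"})

# The task inherits the callback because it did not override it:
assert task_args["on_failure_callback"] is slack_failed_task_callback
```

If the callback only fires when attached to tasks directly, it suggests the `default_args` dict was mutated after the tasks were constructed, or passed to the wrong object.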

Further, I noticed that the Bash environment variables I have set are not inherited by the PythonOperator (PythonOperator is derived from BashOperator, if I am not wrong).

bashEnv = {
    'EXECUTION_DATE': "{{ ds }}"
}
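Note that `"{{ ds }}"` is a Jinja template: BashOperator renders templated fields such as `env` before launching the shell, which is why the resolved date reaches the subprocess there. A minimal illustration of that rendering step using plain Jinja2, with the context value supplied by hand:

```python
from jinja2 import Template

bashEnv = {'EXECUTION_DATE': "{{ ds }}"}

# Render each value the way a templated field gets resolved,
# here with a hand-supplied context instead of Airflow's:
rendered = {k: Template(v).render(ds="2015-06-01") for k, v in bashEnv.items()}
print(rendered)  # {'EXECUTION_DATE': '2015-06-01'}
```

PythonOperator never launches a shell, so an `env` dict is simply ignored by it rather than exported.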

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2015, 6, 1),
    'email': ['datalabs@datlinq.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 0,
    'retry_delay': timedelta(minutes=5),
    'concurrency': 16,
    'max_active_runs': 1,
    'env': bashEnv,
    'provide_context': True,
    'xcom_push': True
}
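As an aside, `concurrency` and `max_active_runs` are DAG-level arguments in Airflow, not per-task defaults, so keeping them in `default_args` has no effect on tasks. A sketch of splitting the dict accordingly (plain dicts here; in a real DAG file the first would go to the `DAG(...)` constructor):

```python
from datetime import datetime, timedelta

# DAG constructor arguments (sketch): these control the DAG as a whole.
dag_kwargs = {
    'concurrency': 16,
    'max_active_runs': 1,
}

# Per-task defaults: only keys the operator actually accepts are applied.
default_args = {
    'owner': 'airflow',
    'start_date': datetime(2015, 6, 1),
    'retries': 0,
    'retry_delay': timedelta(minutes=5),
}
```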

Python operator function

import os

def dummy_func(**kwargs):
    # print(kwargs['dag_run'].dag_id, kwargs['dag_run'].conf)
    print(os.environ['EXECUTION_DATE'])
    return None
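With `provide_context=True`, the execution date is already passed to the callable as the `ds` keyword argument, so the callable can read it from the context instead of `os.environ`. A stdlib-only sketch (the function name is illustrative, and the context dict is hand-built rather than supplied by Airflow):

```python
import os

def read_execution_date(**kwargs):
    # Prefer the Airflow context ('ds' arrives as a kwarg when
    # provide_context=True); fall back to the environment if set.
    execution_date = kwargs.get('ds') or os.environ.get('EXECUTION_DATE')
    print(execution_date)
    return execution_date

# Simulated context, as Airflow would supply for a 2015-06-01 run:
read_execution_date(ds='2015-06-01')  # prints 2015-06-01
```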

Python operator

run_python_task = PythonOperator(
    task_id='run_python',
    execution_timeout=timedelta(minutes=30),
    python_callable=dummy_func,
    provide_context=True, 
    on_failure_callback=slack_failed_task_callback,
    xcom_push=True,
    dag=dag)

Is the behaviour of Airflow on Docker different?

Source: https://stackoverflow.com/questions/60351512/billing-on-bigquery
