Passing arguments to SQL template from Airflow operator

Submitted by 雨燕双飞 on 2020-01-30 08:58:35

Question


If I am using a BigQueryOperator with a SQL template, how can I pass an argument to the SQL?

File: ./sql/query.sql

SELECT * FROM `dataset.{{ task_instance.variable_for_execution }}`

File: dag.py

BigQueryOperator(
    task_id='compare_tables',
    sql='./sql/query.sql',
    use_legacy_sql=False,
    dag=dag,
)

Answer 1:


You can pass an argument via the params parameter, which can then be referenced in any templated field, as follows:

BigQueryOperator(
    task_id='compare_tables',
    sql='SELECT * FROM `dataset.{{ params.param1 }}`',
    params={
        'param1': 'value1',
        'param2': 'value2'
    },
    use_legacy_sql=False,
    dag=dag
)
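When the task runs, Jinja templating substitutes the values from the params dictionary before the query is sent to BigQuery, so with the dictionary above the statement renders as SELECT * FROM `dataset.value1`.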

Or you can keep the SQL in a separate file:

File: ./sql/query.sql

SELECT * FROM `dataset.{{ params.param1 }}`

The params argument must be a dictionary. In general, any operator in Airflow accepts this params argument, not just BigQueryOperator (see the sketch below).
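As a rough illustration of that last point, here is a minimal sketch of the same params mechanism applied to a different operator. The DAG id, task_id, and parameter name below are illustrative only and are not from the original answer; the import path assumes Airflow 1.x.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # Airflow 1.x import path

dag = DAG(
    dag_id='params_example',          # illustrative DAG id
    start_date=datetime(2020, 1, 1),
    schedule_interval=None,
)

# bash_command is a templated field, so {{ params.name }} is rendered
# from the params dictionary when the task runs.
greet = BashOperator(
    task_id='greet',
    bash_command='echo "Hello {{ params.name }}"',
    params={'name': 'world'},
    dag=dag,
)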



Source: https://stackoverflow.com/questions/52103717/passing-arguments-to-sql-template-from-airflow-operator
