I am trying to run an HQL file stored in Cloud Storage using an Airflow script. There are two parameters through which we can pass the path to DataprocHiveOperator: query and query_uri.
That's because you are using both query and query_uri.
If you are querying from a file, you must use query_uri and set query=None (or simply omit query altogether).
If you are running an inline query, use query instead; a sketch of that variant follows the file example below.
Here is a sample for querying through a file:
from airflow.contrib.operators.dataproc_operator import DataProcHiveOperator

# Runs the HQL file from Cloud Storage; note the parameter is query_uri
# (not queri_ury) and query is omitted entirely.
HiveInsertingTable = DataProcHiveOperator(task_id='HiveInsertingTable',
                                          gcp_conn_id='google_cloud_default',
                                          query_uri="gs://us-central1-bucket/data/sample_hql.sql",
                                          cluster_name='cluster-name',
                                          region='us-central1',
                                          dag=dag)
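And for the inline case, a minimal sketch, assuming the same Airflow 1.x contrib import path; the task_id and HQL string here are just illustrative:

from airflow.contrib.operators.dataproc_operator import DataProcHiveOperator

# Inline variant: pass the HQL text via query and leave query_uri unset.
HiveInlineQuery = DataProcHiveOperator(task_id='HiveInlineQuery',
                                       gcp_conn_id='google_cloud_default',
                                       query="SHOW DATABASES;",  # illustrative HQL
                                       cluster_name='cluster-name',
                                       region='us-central1',
                                       dag=dag)

Either way, pass exactly one of query or query_uri; supplying both is what triggers the error you are seeing.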