google-cloud-composer

Airflow (Google Composer) TypeError: can't pickle _thread.RLock objects

狂风中的少年 submitted on 2021-02-11 14:29:54
Problem: I'm using Airflow (on Google Cloud Composer) and ran into the exception below: TypeError: can't pickle _thread.RLock objects, followed by Airflow's "Ooops" ASCII-art error page.
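A common cause of this error (hedging: not necessarily this asker's) is an unpicklable object such as a client, connection, or lock being created at DAG parse time and handed to a task, which the executor then tries to pickle. A minimal sketch of the usual fix, with illustrative names (fetch_rows, pickle_safe_example):

from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from datetime import datetime

def fetch_rows():
    # Build unpicklable objects inside the callable (clients hold RLocks),
    # not at module level, so nothing unpicklable is serialized with the task.
    from google.cloud import bigquery
    client = bigquery.Client()
    # Return plain picklable data (a list of dicts) for XCom.
    return [dict(row) for row in client.query("SELECT 1 AS x").result()]

with DAG("pickle_safe_example", start_date=datetime(2021, 1, 1),
         schedule_interval=None) as dag:
    PythonOperator(task_id="fetch_rows", python_callable=fetch_rows)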

How to pass a query parameter to a SQL file using the BigQuery operator

房东的猫 submitted on 2021-02-09 11:11:26
Problem: I need to access the parameter passed by BigQueryOperator in a SQL file, but I am getting the error ERROR - queryParameters argument must have a type <class 'dict'> not <class 'list'>. I am using the code below: t2 = bigquery_operator.BigQueryOperator( task_id='bq_from_source_to_clean', sql='prepare.sql', use_legacy_sql=False, allow_large_results=True, query_params=[{ 'name': 'threshold_date', 'parameterType': { 'type': 'STRING' }, 'parameterValue': { 'value': '2020-01-01' } }], destination_dataset_table="{
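For context, a hedged sketch: newer Airflow releases accept query_params as a list of BigQuery parameter dicts, while the older BigQuery hook bundled with some Composer images expected a plain dict, which matches the error above, so upgrading the environment is the commonly suggested fix. Assuming a version where the list form works, the parameter is referenced in the SQL file with standard-SQL named-parameter syntax (@threshold_date); prepare.sql is the asker's own file name:

from airflow.contrib.operators import bigquery_operator

t2 = bigquery_operator.BigQueryOperator(
    task_id='bq_from_source_to_clean',
    # prepare.sql would contain e.g.: SELECT * FROM src WHERE dt >= @threshold_date
    sql='prepare.sql',
    use_legacy_sql=False,  # named query parameters require standard SQL
    allow_large_results=True,
    query_params=[{
        'name': 'threshold_date',
        'parameterType': {'type': 'STRING'},
        'parameterValue': {'value': '2020-01-01'},
    }],
)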

Unable to delete gcloud composer environment

与世无争的帅哥 submitted on 2021-02-08 11:33:33
Problem: I'm trying to delete two Cloud Composer environments. One did not create successfully (it has no associated Airflow instance or bucket) and one did. When I attempt to delete, I get an error message (after a really long time) of RPC Skipped due to required preoperation not finished yet. The logs don't provide any valuable information, and I wasn't able to find anything wrong in the cluster. The only solution I have found so far is to delete the entire project, but I would prefer not to. Any suggestions would be greatly appreciated.
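For reference, a hedged sketch of the relevant commands (environment, location, cluster, and zone names below are placeholders): the standard delete call, plus a workaround sometimes suggested for half-created environments, which is to delete the leftover GKE cluster backing the broken environment first so the delete's preoperation can finish.

# Standard environment delete.
gcloud composer environments delete my-environment --location us-east1

# If it stalls, remove the orphaned GKE cluster, then retry the delete above.
gcloud container clusters delete my-composer-cluster --zone us-east1-b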

How to set/get Airflow variables in JSON format from the command line

≡放荡痞女 submitted on 2021-02-07 04:35:17
Problem: I can't edit the values of Airflow variables stored in JSON format through Cloud Shell. I am using Cloud Shell to access my Airflow variable params (in JSON format), and it gives me the complete JSON when I use the following command: gcloud composer environments run composer001 --location us-east1 variables --get params However, I want to edit one of the values inside the JSON; how do I access that? I referred to the documentation and various other links on Google, but could only find how to set variables that
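A hedged sketch of one workaround, assuming the Airflow 1.x variables CLI that Composer shipped at the time: there is no flag that edits a single key inside a JSON variable, so the usual pattern is to fetch the whole object, edit it locally, and write the whole thing back with --set. The JSON keys below are illustrative, and newer gcloud releases expect a -- separator before the Airflow-level flags:

# Fetch the current JSON, as the asker already does.
gcloud composer environments run composer001 --location us-east1 \
    variables -- --get params

# Overwrite the whole variable with the locally edited JSON.
gcloud composer environments run composer001 --location us-east1 \
    variables -- --set params '{"threshold_date": "2020-02-01"}'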

Cloud Composer + Airflow: Setting up DAGs to trigger on HTTP (or should I use Cloud Functions?)

时间秒杀一切 submitted on 2021-01-29 19:02:36
Problem: Ultimately, what I want to do is have a Python script that runs dynamically whenever an HTTP request comes in. It'd be like: App 1 runs and sends out a webhook, the Python script catches the webhook immediately and does whatever it does. I saw that you could do this in GCP with Composer and Airflow. But I'm having several issues following these instructions https://cloud.google.com/composer/docs/how-to/using/triggering-with-gcf: Running this in Cloud Shell to grant blob signing permissions:
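For context, a hedged reconstruction of the truncated step, based on the linked guide (the project ID is a placeholder): the guide grants the App Engine default service account the Service Account Token Creator role so the Cloud Function can sign IAP tokens when calling the Airflow API.

gcloud projects add-iam-policy-binding your-project-id \
    --member=serviceAccount:your-project-id@appspot.gserviceaccount.com \
    --role=roles/iam.serviceAccountTokenCreator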

Can someone provide me with the schema to recreate the dag_run table in airflow-db?

亡梦爱人 submitted on 2021-01-28 12:37:11
Problem: I have a Google Cloud Composer environment on GCP and I accidentally deleted the dag_run table, due to which the airflow-scheduler kept crashing and the Airflow web server would not come up. I was able to re-create the dag_run table in airflow-db, which stopped the crashing, but I think I did not get the schema right, as I get the error below when I manually trigger a DAG on the Airflow web server, followed by Airflow's "Ooops" ASCII-art error page.
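A hedged sketch of one way to recover the exact schema without hand-writing DDL, assuming access to an Airflow 1.10 installation pointed at the same metadata database: let Airflow's own SQLAlchemy model create the table, so the columns and indexes match the installed version rather than a guessed schema.

from airflow import settings
from airflow.models import DagRun

# Create dag_run with exactly the columns/indexes the running Airflow
# version expects; checkfirst skips creation if the table already exists.
DagRun.__table__.create(settings.engine, checkfirst=True)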