Airflow

Airflow DockerOperator: connect sock.connect(self.unix_socket) FileNotFoundError: [Errno 2] No such file or directory

Submitted by 此生再无相见时 on 2020-12-12 11:58:08
Question: I am trying to get DockerOperator to work with Airflow on my Mac. I am running Airflow based on Puckel with small modifications. The Dockerfile, built as puckel-airflow-with-docker-inside:

    FROM puckel/docker-airflow:latest
    USER root
    RUN groupadd --gid 999 docker \
        && usermod -aG docker airflow
    USER airflow

docker-compose-CeleryExecutor.yml:

    version: '2.1'
    services:
      redis:
        image: 'redis:5.0.5'
      postgres:
        image: postgres:9.6
        environment:
          - POSTGRES_USER=airflow
          - POSTGRES_PASSWORD=airflow
          - POSTGRES_DB
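The FileNotFoundError on sock.connect(self.unix_socket) means the Docker daemon's Unix socket is not reachable inside the Airflow containers: DockerOperator defaults to unix://var/run/docker.sock, and that path only exists in a container if the host socket is mounted into it. A minimal sketch of the usual fix, assuming the host socket lives at /var/run/docker.sock (the service name below is illustrative, not from the original post):

    # docker-compose snippet (assumption: a 'worker' service runs the tasks);
    # mount the host Docker socket so DockerOperator can reach the daemon
    worker:
      image: puckel-airflow-with-docker-inside
      volumes:
        - /var/run/docker.sock:/var/run/docker.sock

On the DAG side, the operator then talks to the mounted socket (task parameters here are illustrative):

    # Airflow 1.10 import path, as shipped in the Puckel image
    from airflow.operators.docker_operator import DockerOperator

    hello = DockerOperator(
        task_id='docker_hello',                   # hypothetical task
        image='alpine:3.12',
        command='echo hello',
        docker_url='unix://var/run/docker.sock',  # the default; must exist inside the worker
        dag=dag,
    )

Note that on a Mac the docker group GID inside Docker Desktop's VM is not necessarily 999, so the group added in the Dockerfile may need a different GID (or the socket's permissions may need relaxing) before the airflow user can use it.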

How to add an Airflow Pool via environment variable?

Submitted by 假装没事ソ on 2020-12-11 12:08:33
Question: Just as it is possible to set connections via an environment variable following the naming scheme AIRFLOW_CONN_{conn_id}, is there a way to set pools? This is so I can set up a local Docker test environment with all configurations populated. Source: https://stackoverflow.com/questions/58136365/how-to-add-an-airflow-pool-via-environment-variable
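Airflow (through the 1.10 line that the Puckel image ships) has no AIRFLOW_POOL_{pool_id}-style variable: pools live in the metadata database, not in the config. A common workaround is to create them from the CLI when the container starts. A minimal sketch, assuming an entrypoint script you control (the pool name, slot count, and description are illustrative):

    #!/usr/bin/env bash
    # entrypoint.sh (hypothetical): create pools before starting the scheduler.
    # `airflow pool -s` is the 1.10 syntax; Airflow 2.x uses `airflow pools set`.
    airflow pool -s my_pool 4 "pool created at container start"
    exec airflow scheduler

On Airflow 2.x, pools can also be bulk-loaded from a JSON file with `airflow pools import`, which fits the "populate everything for a local test environment" goal.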

Airflow ExternalTaskSensor execution timeout

Submitted by 本小妞迷上赌 on 2020-12-09 04:28:33
Question: I'm using airflow.operators.sensors.ExternalTaskSensor to make one DAG wait for another.

    dag = DAG(
        'dag2',
        default_args={
            'owner': 'Me',
            'depends_on_past': False,
            'start_date': start_datetime,
            'email': ['me@example.com'],
            'email_on_failure': True,
            'email_on_retry': False,
            'retries': 2,
            'retry_delay': timedelta(minutes=10),
        },
        template_searchpath="%s/me/resources/" % DAGS_FOLDER,
        schedule_interval="{} {} * * *".format(minute, hour),
        max_active_runs=1
    )
    wait_for_dag1 = ExternalTaskSensor(
        task
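ExternalTaskSensor pokes for a dag1 task instance whose execution date matches dag2's execution date shifted by execution_delta, and its timeout argument (inherited from BaseSensorOperator) bounds how long it keeps poking. A minimal sketch with those knobs set, assuming dag1 is scheduled 30 minutes before dag2 (the external task ID and the delta are illustrative, not from the original post):

    from datetime import timedelta
    from airflow.operators.sensors import ExternalTaskSensor  # 1.x import path

    wait_for_dag1 = ExternalTaskSensor(
        task_id='wait_for_dag1',
        external_dag_id='dag1',
        external_task_id='final_task',          # hypothetical last task of dag1
        execution_delta=timedelta(minutes=30),  # dag1 runs 30 min earlier (assumption)
        timeout=60 * 60,                        # stop poking after one hour
        poke_interval=60,                       # check once a minute
        dag=dag,
    )

If the delta does not match the two schedules, the sensor pokes for a dag1 run that never existed, which is the usual reason it sits until its timeout.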

No module named unusual_prefix_*

Submitted by 三世轮回 on 2020-12-08 14:08:20
Question: I tried to run the PythonOperator example in my Airflow installation. The installation has the webserver, scheduler, and worker deployed on the same machine and runs with no complaints for all non-PythonOperator tasks. The task fails, complaining that the module "unusual_prefix_*" could not be imported, where * is the name of the file containing the DAG. The full stack trace:

    ['/usr/bin/airflow', 'run', 'tutorialpy', 'print_the_context', '2016-08-23T10:00:00', '--pickle', '90', '--local']
    [2016-08
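The '--pickle 90' in that command line is the clue: the scheduler is shipping a pickled DAG to the worker. Airflow loads DAG files under a synthetic module name of the form unusual_prefix_<filename>, and when the worker unpickles the DAG it cannot import that module, so the PythonOperator's callable fails to load. A common fix is to stop shipping pickled DAGs so each component parses the DAG file itself; a minimal sketch of the relevant setting (assuming the default airflow.cfg layout, and that the scheduler is not started with its pickle flag):

    # airflow.cfg
    [core]
    # parse DAG files locally on the worker instead of unpickling them,
    # avoiding the synthetic unusual_prefix_* module name
    donot_pickle = True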

Airflow xcom pull only returns string

Submitted by 爷，独闯天下 on 2020-12-06 12:34:49
Question: I have an Airflow pipeline where I need to get a filename from a Pub/Sub subscription and then import that file into a Cloud SQL instance. I use the CloudSqlInstanceImportOperator to import the CSV file. This operator needs a body, which contains the filename and other parameters. Since I read that filename at runtime, I also have to define the body at runtime. This all works. But when I pull the body from XCom, it returns a string instead of a Python dictionary. So the
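In Airflow 1.x, Jinja-templated operator fields always render to strings, so a dict pushed to XCom and pulled with {{ ti.xcom_pull(...) }} arrives as its string representation. Two common ways around this: build the body inside a Python callable, where xcom_pull returns the native object, or, on Airflow 2.1+, create the DAG with render_template_as_native_obj=True so templates render to native Python types. A minimal sketch of the first approach (the task ID, bucket, and database names are illustrative, not from the original post):

    def make_import_body(**context):
        # inside a callable, xcom_pull returns the stored object, not a string
        filename = context['ti'].xcom_pull(task_ids='read_pubsub_message')
        return {
            'importContext': {
                'fileType': 'CSV',
                'uri': 'gs://my-bucket/%s' % filename,  # hypothetical bucket
                'database': 'my_database',              # hypothetical database
            }
        }

The returned dict is itself pushed to XCom, so any downstream Python callable that pulls it gets a real dictionary; a templated operator field, by contrast, would stringify it again in 1.x.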
