airflow

Unable to execute Airflow KubernetesExecutor

我是研究僧i Submitted on 2020-12-31 13:39:59
Question: Following the project from here, I am trying to integrate the Airflow KubernetesExecutor using an NFS server as the backing storage for a PV. I have a PV airflow-pv which is linked to the NFS server. The Airflow webserver and scheduler use a PVC airflow-pvc which is bound to airflow-pv. I've placed my DAG files on the NFS server under /var/nfs/airflow/development/<dags/logs>. I can see newly added DAGs in the webserver UI as well. However, when I execute a DAG from the UI, the scheduler fires a new POD for that task BUT the
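The setup described (an NFS-backed PV bound to a PVC shared by the webserver and scheduler) might look roughly like the sketch below. The names airflow-pv and airflow-pvc and the NFS path come from the question; the NFS server address and capacity are assumptions:

```yaml
apiVersion: v1
kind: PersistentVolume
metadata:
  name: airflow-pv
spec:
  capacity:
    storage: 5Gi              # assumed size
  accessModes:
    - ReadWriteMany           # lets webserver, scheduler, and task pods share it
  nfs:
    server: 10.0.0.10         # hypothetical NFS server address
    path: /var/nfs/airflow/development
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: airflow-pvc
spec:
  accessModes:
    - ReadWriteMany
  resources:
    requests:
      storage: 5Gi
  volumeName: airflow-pv
```

One frequently missed piece in this kind of setup: the task pods spawned by the KubernetesExecutor also need the claim mounted, typically via `dags_volume_claim` (and `logs_volume_claim`) in the `[kubernetes]` section of airflow.cfg, otherwise the new pods cannot find the DAG files.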

Airflow 1.10 Config Core hostname_callable - How To Set?

六月ゝ 毕业季﹏ Submitted on 2020-12-31 07:37:02
Question: In regard to my previous Stack Overflow post here: I've finally upgraded from Airflow version 1.9 to 1.10 since it's now released on PyPI. Using their release guide here I got Airflow 1.10 working. I then inspected their updates to 1.10 to see how they addressed the bug discovered in Airflow version 1.9 when run on an AWS EC2 instance, and found that they replaced all functions that got the server's IP address with a call to this new Airflow class's function get_hostname https://github.com
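For context, in Airflow 1.10 the `[core] hostname_callable` option takes a `module:attribute` path that the scheduler and webserver call to resolve the machine's hostname. A minimal sketch of pointing it at a custom callable follows; the module name and the EC2 behavior are assumptions for illustration, not from the post:

```python
# my_hostname.py -- hypothetical module, referenced from airflow.cfg as:
#   [core]
#   hostname_callable = my_hostname:get_hostname
import socket

def get_hostname():
    # getfqdn() typically resolves to the EC2 instance's private DNS name,
    # giving the scheduler and workers a consistent hostname to compare.
    return socket.getfqdn()
```

The default in 1.10 is `socket:getfqdn`; a custom callable is only needed if that resolution misbehaves on the instance.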

Efficient way to deploy dag files on airflow

只愿长相守 Submitted on 2020-12-27 07:38:22
Question: Are there any best practices for deploying new DAGs to Airflow? I saw a couple of comments on the Google forum stating that the DAGs are stored in a Git repository and periodically synced to the local location on the Airflow cluster. Regarding this approach, I had a couple of questions: Do we maintain separate DAG files for separate environments (testing, production)? How do we handle rollback of an ETL to an older version in case the new version has a bug? Any
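For the Git-sync pattern mentioned above, the deploy step can be as simple as replacing the live dags folder with a fresh checkout; rollback then becomes checking out an older tag before syncing. A naive sketch (function and paths are hypothetical; real deployments often use a git-sync sidecar or a CI pipeline instead):

```python
import os
import shutil

def deploy_dags(repo_dags_dir: str, airflow_dags_dir: str) -> None:
    """Replace the live dags folder with the repo copy.

    Stages the new copy next to the target first, so the live folder is
    only ever swapped wholesale, never left half-written.
    """
    staging = airflow_dags_dir + ".new"
    if os.path.exists(staging):
        shutil.rmtree(staging)
    shutil.copytree(repo_dags_dir, staging)
    if os.path.exists(airflow_dags_dir):
        shutil.rmtree(airflow_dags_dir)
    os.rename(staging, airflow_dags_dir)
```

Per-environment separation is usually handled with branches or separate repos rather than separate DAG files, so the same file flows from testing to production unchanged.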

Running `airflow scheduler` launches 33 scheduler processes

﹥>﹥吖頭↗ Submitted on 2020-12-25 02:32:21
Question: When using LocalExecutor with a MySQL backend, running airflow scheduler on my CentOS 6 box creates 33 scheduler processes, e.g. deploy 55362 13.5 1.8 574224 73272 ? Sl 18:59 7:42 /usr/local/bin/python2.7 /usr/local/bin/airflow scheduler deploy 55372 0.0 1.5 567928 60552 ? Sl 18:59 0:00 /usr/local/bin/python2.7 /usr/local/bin/airflow scheduler deploy 55373 0.0 1.5 567928 60540 ? Sl 18:59 0:00 /usr/local/bin/python2.7 /usr/local/bin/airflow scheduler ... These are distinct from Executor
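The count of 33 is consistent with the LocalExecutor's defaults: the scheduler forks one worker process per slot of `[core] parallelism` (default 32), and with the parent process that gives 33. If that is the cause here, the count can be reduced in airflow.cfg (at the cost of task concurrency):

```ini
[core]
# Number of task slots; with LocalExecutor each slot is a forked worker
# process that inherits the "airflow scheduler" command line in ps output.
# Default is 32.
parallelism = 4
```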

Weird Error in kubernetes: “starting container process caused "exec: \"/usr/bin/dumb-init\": stat /usr/bin/dumb-init: no such file or directory"”

江枫思渺然 Submitted on 2020-12-13 03:24:26
Question: I built a customised Docker image of Airflow following this document: "https://github.com/puckel/docker-airflow". I built and ran it in my local VM; everything was successful and Airflow was up. I then pushed the image to ACR (Azure Container Registry) and launched it in AKS via the stable Helm chart (see "https://github.com/helm/charts/tree/master/stable/airflow"). Now suddenly in Kubernetes the pods are not up and fail with the error below. Error: failed to start container "airflow
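A `stat /usr/bin/dumb-init: no such file or directory` error generally means the container's entrypoint references a binary that is not present in the image actually being run, e.g. because the Helm chart's entrypoint assumes an image layout the custom image no longer has. If the custom Dockerfile switched away from the puckel base (which ships dumb-init), one fix is to install it explicitly; a sketch for a Debian-based image (package name assumed from Debian's repositories):

```dockerfile
# Ensure the entrypoint binary expected by the chart exists in the image.
RUN apt-get update \
    && apt-get install -y --no-install-recommends dumb-init \
    && rm -rf /var/lib/apt/lists/*
```

It is also worth confirming that the image tag pushed to ACR is the one the chart's values actually reference, since a stale or default tag produces the same symptom.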

Kubernetes Ingress + Apache airflow

耗尽温柔 Submitted on 2020-12-13 03:07:38
Question: Can you please help me? I'm trying to start Apache Airflow in Kubernetes (AWS), in a VPC, using Helm chart stable/airflow 7.1.1. Everything starts OK, but to get access to the web interface I need to expose it via an ingress ELB. I have this setup; the rule for Airflow looks like this: apiVersion: v1 items: - apiVersion: extensions/v1beta1 kind: Ingress metadata: annotations: kubernetes.io/ingress.class: nginx nginx.ingress.kubernetes.io/connection-proxy-header: upgrade nginx.ingress.kubernetes.io
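For reference, a cleaned-up sketch of the kind of nginx Ingress rule the excerpt describes, using the `extensions/v1beta1` API current at the time; the resource name, host, backend service name, and port are assumptions, not from the post:

```yaml
apiVersion: extensions/v1beta1
kind: Ingress
metadata:
  name: airflow-web                  # hypothetical name
  annotations:
    kubernetes.io/ingress.class: nginx
    nginx.ingress.kubernetes.io/connection-proxy-header: upgrade
spec:
  rules:
    - host: airflow.example.com      # assumed external host
      http:
        paths:
          - path: /
            backend:
              serviceName: airflow-web   # assumed service from the chart
              servicePort: 8080
```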

Airflow DockerOperator: connect sock.connect(self.unix_socket) FileNotFoundError: [Errno 2] No such file or directory

隐身守侯 Submitted on 2020-12-12 11:58:14
Question: I am trying to get DockerOperator to work with Airflow on my Mac. I am running Airflow based on Puckel with small modifications. The Dockerfile, built as puckel-airflow-with-docker-inside: FROM puckel/docker-airflow:latest USER root RUN groupadd --gid 999 docker \ && usermod -aG docker airflow USER airflow docker-compose-CeleryExecutor.yml: version: '2.1' services: redis: image: 'redis:5.0.5' postgres: image: postgres:9.6 environment: - POSTGRES_USER=airflow - POSTGRES_PASSWORD=airflow - POSTGRES_DB
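The `FileNotFoundError` on `sock.connect(self.unix_socket)` usually means `/var/run/docker.sock` is not visible inside the container where the DockerOperator runs. A common fix is bind-mounting the host's Docker socket into the service that executes tasks; a fragment sketch for the compose file (the service name is an assumption):

```yaml
# Fragment for docker-compose-CeleryExecutor.yml:
services:
  worker:
    image: puckel-airflow-with-docker-inside
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock   # expose the host's Docker daemon
```

The `groupadd --gid 999 docker` step in the Dockerfile only helps if 999 matches the socket's actual group id inside the container; on Docker Desktop for Mac the socket's ownership inside containers can differ from the host's, which is worth checking if the mount alone does not resolve it.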