airflow-scheduler

Dynamic schedule_interval in Airflow

北城余情 · Submitted on 2021-02-08 07:21:02
Question: I am trying to use a dynamic schedule_interval in Airflow, as below, but the DAG only runs when I trigger it manually. Could the dynamic schedule_interval be the reason the DAG does not run automatically, or is there some other cause?

    if datetime.today().day == 1:
        schedule_interval = '00 07 * * *'
    else:
        schedule_interval = '00 07 * * 1'

Thank you!

Answer 1: You shouldn't set the schedule_interval to be dynamic like this, as it can lead to unexpected results (as …
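The snippet above runs at DAG-file parse time, so the cron string can flip whenever the scheduler re-parses the file. A steadier pattern is to keep one fixed daily schedule and decide at run time whether to proceed. The sketch below assumes the intent is "run at 07:00 on the 1st of the month and on Mondays"; the dag_id and task_id are hypothetical, and the import path is the Airflow 1.10.x one:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python_operator import ShortCircuitOperator  # 1.10.x path

    def _should_run(execution_date, **_):
        # Assumed intent: proceed on the 1st of the month or on a Monday.
        return execution_date.day == 1 or execution_date.weekday() == 0

    dag = DAG(
        dag_id='conditionally_skipped_daily',  # hypothetical
        start_date=datetime(2021, 1, 1),
        schedule_interval='00 07 * * *',       # always daily at 07:00
    )

    gate = ShortCircuitOperator(
        task_id='gate',
        python_callable=_should_run,
        provide_context=True,  # 1.10.x: pass execution_date into the callable
        dag=dag,
    )
    # Real work goes downstream of the gate: gate >> actual_task

When _should_run returns False, ShortCircuitOperator skips everything downstream, so the DAG still gets a run record every day but only does work on the matching days.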

Problem with start date and scheduled date in Apache airflow

情到浓时终转凉″ · Submitted on 2021-02-05 09:15:46
Question: I am working with Apache Airflow and I have a problem with the schedule date and the start date. I want a DAG to run every day at 8:00 AM UTC, so this is what I did:

    default_args = {
        'owner': 'airflow',
        'depends_on_past': False,
        'start_date': datetime(2020, 12, 7, 10, 0, 0),
        'email': ['example@emaiil.com'],
        'email_on_failure': True,
        'email_on_retry': False,
        'retries': 1,
        'retry_delay': timedelta(hours=5)
    }

    # never runs
    dag = DAG(dag_id='id', default_args=default_args, schedule_interval='0 8 * * *' …
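The surprise here is Airflow's interval semantics: the run for an interval is only triggered at the end of that interval. With start_date at 2020-12-07 10:00 and schedule '0 8 * * *', the first tick after the start date is 2020-12-08 08:00, and the run covering that interval only executes at 2020-12-09 08:00. A minimal sketch (hypothetical dag_id) that anchors start_date a full interval before the first run you want:

    from datetime import datetime, timedelta

    from airflow import DAG

    default_args = {
        'owner': 'airflow',
        'depends_on_past': False,
        # Midnight before the first desired run: the 2020-12-07 08:00 ->
        # 2020-12-08 08:00 interval is triggered at 2020-12-08 08:00.
        'start_date': datetime(2020, 12, 7),
        'retries': 1,
        'retry_delay': timedelta(hours=5),
    }

    dag = DAG(
        dag_id='daily_8am_utc',          # hypothetical
        default_args=default_args,
        schedule_interval='0 8 * * *',
        catchup=False,                   # don't backfill missed intervals
    )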

Airflow Cluster Policy is not getting invoked

廉价感情. · Submitted on 2021-01-28 11:15:53
Question: I am trying to set up and understand custom cluster policies. I am not sure what I am doing wrong; following this does not work.

Airflow version: 1.10.10
Expected result: it should throw an exception if I try to run a DAG with default_owner.
Actual result: no such exception.

/root/airflow/config/airflow_local_settings.py:

    class PolicyError(Exception):
        pass

    def cluster_policy(task):
        print("task_instance_mutation_hook")
        raise PolicyError

    def task_instance_mutation_hook(ti):
        print("task_instance_mutation_hook")
        …
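On the 1.10 line, airflow_local_settings is only consulted for specific names: the cluster policy hook must be called policy (it became task_policy in Airflow 2.0), and task_instance_mutation_hook was only added in a later 1.10 release (1.10.12, if memory serves), so on 1.10.10 neither cluster_policy nor the mutation hook above is ever invoked. A hedged sketch of a policy that 1.10.10 should pick up; the owner check mirrors the question's intent, assuming the shipped default_owner of 'airflow':

    # /root/airflow/config/airflow_local_settings.py

    class PolicyError(Exception):
        pass

    def policy(task):
        # Applied by Airflow 1.10.x to every task when the DAG is loaded.
        if task.owner == 'airflow':  # the shipped [operators] default_owner
            raise PolicyError(
                "task %s must declare an explicit owner" % task.task_id)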

How to pass parameters to Airflow on_success_callback and on_failure_callback

纵然是瞬间 · Submitted on 2021-01-02 07:59:36
Question: I have implemented email alerts on success and failure using on_success_callback and on_failure_callback. According to the Airflow documentation, a context dictionary is passed as the single parameter to this function. How can I pass another parameter to these callback methods? Here is my code:

    from airflow.utils.email import send_email_smtp

    def task_success_alert(context):
        subject = "[Airflow] DAG {0} - Task {1}: Success".format(
            context['task_instance_key_str'].split('__')[0],
            context['task …
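A common way to do this (a sketch, not the accepted answer verbatim) is to bind the extra argument with functools.partial, so Airflow can keep calling the callback with just the context dict. The recipients parameter and the address below are hypothetical:

    from functools import partial

    from airflow.utils.email import send_email_smtp

    def task_success_alert(context, recipients):
        # context comes from Airflow; recipients is our bound extra argument.
        dag_id = context['task_instance_key_str'].split('__')[0]
        subject = "[Airflow] DAG {0}: Success".format(dag_id)
        send_email_smtp(recipients, subject, subject)

    # partial(...) is still a one-argument callable from Airflow's side.
    success_alert = partial(task_success_alert, recipients=['team@example.com'])
    # then: DAG(..., on_success_callback=success_alert)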

Unable to execute Airflow KubernetesExecutor

被刻印的时光 ゝ · Submitted on 2020-12-31 13:48:07
Question: Following the project from here, I am trying to integrate the Airflow KubernetesExecutor using an NFS server as the backing storage for a PV. I have a PV, airflow-pv, backed by the NFS server. The Airflow webserver and scheduler use a PVC, airflow-pvc, which is bound to airflow-pv. I have placed my DAG files on the NFS server under /var/nfs/airflow/development/<dags/logs>, and I can see the newly added DAGs in the webserver UI as well. However, when I execute a DAG from the UI, the scheduler fires a new pod for that task, BUT the …
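The question is truncated above, but a frequent failure mode with this setup is that the worker pods the executor spawns never mount the DAG volume and so cannot find the task's DAG file. A hedged sketch of the relevant airflow.cfg keys for the 1.10.x [kubernetes] section, assuming the claim and subpaths from the question:

    [kubernetes]
    # Mount the same NFS-backed claim into every worker pod
    dags_volume_claim = airflow-pvc
    dags_volume_subpath = development/dags
    logs_volume_claim = airflow-pvc
    logs_volume_subpath = development/logs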
