I am trying to use Airflow to execute a simple Python task.
from __future__ import print_function
from airflow.operators.python_operator import PythonOperator
f
While packaging your dags into a zip as covered in the docs is the only supported solution I have seen, you can also import modules that live inside the dags folder. This is useful if you sync the dags folder automatically with other tools like Puppet and Git.
I am not clear on your directory structure from the question, so here is an example dags folder based on a typical Python project structure:
└── airflow/dags # root airflow dags folder where all dags live
└── my_dags # git repo project root
├── my_dags # python src root (usually named same as project)
│ ├── my_test_globals.py # file I want to import
│ ├── dag_in_package.py
│ └── dags
│ └── dag_in_subpackage.py
├── README.md # also setup.py, LICENSE, etc here
└── dag_in_project_root.py
I have left out the (required [1]) __init__.py files. Note the locations of the three example dags. You would almost certainly use only one of these locations for all your dags; I include all three here for the sake of example because it shouldn't matter for the import. To import my_test_globals from any of them:
from my_dags.my_dags import my_test_globals
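You can verify why this import resolves without running Airflow at all. The sketch below (module names and the SHARED_SETTING constant are illustrative, not from the original post) builds a throwaway dags folder with the same layout, puts it on sys.path the way the scheduler effectively does, and performs the same absolute import:

```python
import os
import sys
import tempfile

# Build a throwaway "dags" folder mirroring the layout above.
dags_folder = tempfile.mkdtemp()
pkg = os.path.join(dags_folder, "my_dags", "my_dags")
os.makedirs(pkg)

# The required __init__.py package markers at each level.
for d in (os.path.join(dags_folder, "my_dags"), pkg):
    open(os.path.join(d, "__init__.py"), "w").close()

# The file we want to import (contents are illustrative).
with open(os.path.join(pkg, "my_test_globals.py"), "w") as f:
    f.write("SHARED_SETTING = 'hello'\n")

# Airflow effectively adds the dags folder to the Python path.
sys.path.insert(0, dags_folder)

from my_dags.my_dags import my_test_globals

print(my_test_globals.SHARED_SETTING)
```

Because both levels of `my_dags` are packages under the dags folder, the two-level absolute import works from any of the three example dag locations.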
I believe this means that Airflow runs with the Python path set to the dags directory, so each subdirectory of the dags folder can be treated as a Python package. In my case it was the additional intermediate project-root directory getting in the way of a typical intra-package absolute import. Thus, we could restructure this Airflow project like this:
└── airflow/dags # root airflow dags folder where all dags live
└── my_dags # git repo project root & python src root
├── my_test_globals.py # file I want to import
├── dag_in_package.py
├── dags
│ └── dag_in_subpackage.py
├── README.md # also setup.py, LICENSE, etc here
└── dag_in_project_root.py
So that imports look as we expect them to:
from my_dags import my_test_globals
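The same standalone check works for the flattened layout (again, module names and the constant are illustrative): with `my_dags` sitting directly under the dags folder, the one-level import resolves.

```python
import os
import sys
import tempfile

# Throwaway "dags" folder with my_dags directly at the top level.
dags_folder = tempfile.mkdtemp()
pkg = os.path.join(dags_folder, "my_dags")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()  # required package marker

with open(os.path.join(pkg, "my_test_globals.py"), "w") as f:
    f.write("SHARED_SETTING = 42\n")

# Simulate Airflow putting the dags folder on the Python path.
sys.path.insert(0, dags_folder)

from my_dags import my_test_globals

print(my_test_globals.SHARED_SETTING)
```

The only change from the earlier layout is dropping the intermediate project-root directory, which is exactly what shortens the import to one level.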