Question:
I am using Airflow 1.7.1.3 installed using pip
I would like to limit logging to the ERROR level for the workflows executed by the scheduler. I could not find anything beyond setting the log file location in the settings.py file.
The online resources also led me to a Google Groups discussion, but there was not much info there either.
Any idea how to control logging in Airflow?
Answer 1:
The logging functionality and its configuration will be changed in version 1.9 with this commit.
Answer 2:
I tried the workaround below, and it seems to work for setting LOGGING_LEVEL outside of settings.py:

1. Update settings.py: remove or comment out the line

       LOGGING_LEVEL = logging.INFO

   and add the following line (the original answer wrapped the value in os.path.expanduser, which is unnecessary for a log level and leaves a string where a numeric level is expected; logging.getLevelName converts the name to its numeric value):

       LOGGING_LEVEL = logging.getLevelName(conf.get('core', 'LOGGING_LEVEL'))

2. Update the airflow.cfg configuration file: add the following line under the [core] section:

       logging_level = WARN

3. Restart the webserver and scheduler services.

Alternatively, use the environment variable AIRFLOW__CORE__LOGGING_LEVEL=WARN.
See the official docs for details.
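The environment-variable route from the last step can be sketched as a shell session. This assumes Airflow's AIRFLOW__&lt;SECTION&gt;__&lt;KEY&gt; override convention; the scheduler invocation is shown commented out since it depends on a running installation:

```shell
# Override [core] logging_level without touching airflow.cfg.
# Exported variables take precedence over the config file.
export AIRFLOW__CORE__LOGGING_LEVEL=WARN

# airflow scheduler   # would now log at WARN level (not run here)

# Confirm what the Airflow process would see:
echo "$AIRFLOW__CORE__LOGGING_LEVEL"
```

Setting the variable in the service's environment (rather than an interactive shell) keeps the override in place across restarts.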
Answer 3:
The only solution I am aware of is changing LOGGING_LEVEL in the settings.py file. The default level is set to INFO:
AIRFLOW_HOME = os.path.expanduser(conf.get('core', 'AIRFLOW_HOME'))
SQL_ALCHEMY_CONN = conf.get('core', 'SQL_ALCHEMY_CONN')
LOGGING_LEVEL = logging.INFO
DAGS_FOLDER = os.path.expanduser(conf.get('core', 'DAGS_FOLDER'))
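Since airflow.cfg stores the level as a string (e.g. "WARN") while settings.py assigns a numeric logging constant, converting between the two is the crux of the workaround in answer 2. A minimal sketch of that conversion — parse_logging_level is a hypothetical helper, not part of Airflow:

```python
import logging

def parse_logging_level(name, default=logging.INFO):
    """Map a level name from the config file to its numeric value.

    logging.getLevelName returns the numeric level for a recognised
    name, and a placeholder string for an unknown one, so we fall
    back to the default in the latter case.
    """
    level = logging.getLevelName(name.upper())
    return level if isinstance(level, int) else default

print(parse_logging_level("WARN"))   # -> 30 (WARN is an alias for WARNING)
print(parse_logging_level("bogus"))  # -> 20 (falls back to INFO)
```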
Answer 4:
As @Dimo Boyadzhiev pointed out the change, adding the path to the relevant options for more information.
File: $AIRFLOW_HOME/airflow.cfg
# Logging level
logging_level = INFO
fab_logging_level = WARN
Source: https://stackoverflow.com/questions/42173554/apache-airflow-control-over-logging-disable-adjust-logging-level