Airflow: Log file isn't local, Unsupported remote log location

Submitted by 强颜欢笑 on 2019-11-30 12:13:37
Jaguar

I also faced the same problem.

Setting the variables below in airflow.cfg worked for me. Use the machine's FQDN for {hostname} instead of localhost.

endpoint_url = http://{hostname}:8080

base_url = http://{hostname}:8080
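
For context, a minimal sketch of where these two settings typically sit in airflow.cfg; the section names below assume a 1.x-era config layout, so check your own file:

[webserver]
base_url = http://{hostname}:8080

[cli]
endpoint_url = http://{hostname}:8080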

Best of luck!

As you can see in image-1, there is a timestamp; make sure your logs directory contains a folder/file whose name matches that timestamp.

You are looking at the UI, so first make sure the log files have actually been created in the log directory. In my case, my log folder looks like this:

(AIRFLOW-ENV) [cloudera@quickstart dags]$ ll /home/cloudera/workspace/python/airflow_home/logs/my_test_dag/my_sensor_task 
total 8
-rw-rw-rw- 1 cloudera cloudera 3215 Nov 14 08:45 2017-11-12T12:00:00
-rw-rw-rw- 1 cloudera cloudera 2694 Nov 14 08:45 2017-11-14T08:36:06.920727
(AIRFLOW-ENV) [cloudera@quickstart dags]$ 

So my log URL is

http://localhost:8080/admin/airflow/log?task_id=my_sensor_task&dag_id=my_test_dag&execution_date=2017-11-14T08:36:06.920727
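
The pieces of that URL map onto the log path on disk; in this old layout it is roughly:

{base_log_folder}/{dag_id}/{task_id}/{execution_date}

where base_log_folder comes from airflow.cfg (under [core] in 1.x-era configs; newer versions use a [logging] section and add a {try_number}.log level). The path below is just my local example, so substitute your own:

[core]
base_log_folder = /home/cloudera/workspace/python/airflow_home/logs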

When you go to your DAG and select the Graph View, you will see a dropdown next to "Run". Select the appropriate run, then in the graph view below select the appropriate task/operator and choose "View Log".

I ran into this as well, and had to unpause the DAG. I also set new DAGs to default to unpaused in my airflow.cfg:

dags_are_paused_at_creation = False
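
For reference, a minimal sketch of both steps; my_test_dag is just an example DAG id, and the CLI form assumes a pre-2.0 Airflow (in 2.x it becomes airflow dags unpause):

# unpause an existing DAG
airflow unpause my_test_dag

# in airflow.cfg, under [core], so newly created DAGs start unpaused
dags_are_paused_at_creation = False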