Airflow Remote logging not working

Submitted by 心已入冬 on 2019-12-08 01:38:23

Question


I have an up-and-running Apache Airflow 1.8.1 instance.

I have a working connection (and its ID) for writing to Google Cloud Storage, and my Airflow user has permission to write to the bucket.

I am trying to use the remote log storage functionality by adding

remote_base_log_folder = 'gs://my-bucket/log'

remote_log_conn_id = 'my_working_conn_id'

And that's all (I didn't touch any configuration other than that).

I restarted all the services, but the logs aren't being uploaded to GCS (my bucket is still empty) and my filesystem space keeps decreasing.

Have you successfully enabled remote logging with GCS? If so, what did you change / do?


Answer 1:


I managed to get remote logging to GCS working. First, you need to give the service account permission to write to the GCS bucket.
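As a minimal sketch (the service account e-mail and project here are placeholders, not taken from the answer), the permission can be granted with gsutil, for example:

# grant the service account write access to the log bucket (placeholder names)
gsutil iam ch serviceAccount:my-airflow-sa@my-project.iam.gserviceaccount.com:objectAdmin gs://my-backup

Any role that allows creating objects in the bucket (e.g. Storage Object Admin) is enough for uploading log files.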

This is my GCP connection setup:
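As a sketch, a "Google Cloud Platform" connection in the Airflow UI has roughly these fields (all values below are placeholders):

Conn Id: my_gcp_conn
Conn Type: Google Cloud Platform
Project Id: my-project
Keyfile Path: /path/to/service-account-key.json
Scopes: https://www.googleapis.com/auth/devstorage.read_write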

Then, edit the airflow.cfg file:

remote_base_log_folder = gs://my-backup/airflow_logs
remote_log_conn_id = my_gcp_conn
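For context, a minimal sketch of how these two lines sit in the [core] section of airflow.cfg on Airflow 1.8 (the base_log_folder path is an assumed example, not from the answer):

[core]
# local log directory; task logs are copied from here to GCS when a task finishes
base_log_folder = /home/airflow/airflow/logs
# remote copy of the task logs
remote_base_log_folder = gs://my-backup/airflow_logs
remote_log_conn_id = my_gcp_conn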

After editing the config file, you need to re-initialize and restart Airflow:

airflow initdb

# start the web server, default port is 8080
airflow webserver -p 8080
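The remote copy of a task log is written by the process that runs the task, so it also helps to restart the scheduler (and any workers) so they pick up the new settings; a sketch, assuming the scheduler runs in the foreground:

# restart the scheduler so new task runs use the updated logging config
airflow scheduler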

Test by turning on the "tutorial" DAG; once it has run, you should see the logs both locally and remotely in GCS:
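A sketch of one way to check this from the command line (bucket path as configured above, gsutil assumed to be installed):

# unpause the example DAG so the scheduler starts running it
airflow unpause tutorial

# after a task instance finishes, its log should appear under the remote folder
gsutil ls gs://my-backup/airflow_logs/tutorial/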



Source: https://stackoverflow.com/questions/46293020/airflow-remote-logging-not-working
