Spark File Logger in Yarn Mode

*爱你&永不变心* submitted on 2019-12-06 01:01:03

I was able to use a custom logger to append to a local file when running on YARN in cluster mode.

First of all, I made the log4j file available in the same directory on every cluster worker node (e.g. /home/myUser/log4j.custom.properties), and on those same nodes I created a folder under my user path to hold the logs (e.g. /home/myUser/sparkLogs).
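
For reference, here is a minimal sketch of what log4j.custom.properties could contain, assuming a log4j 1.x RollingFileAppender (the logging backend shipped with Spark 2) writing into the folder above; the driver.log file name, size limit, and backup count are illustrative assumptions, not part of the original answer:

# Minimal sketch, assuming log4j 1.x (Spark 2's logging backend).
# File name, size, and backup count are illustrative assumptions.
log4j.rootLogger=INFO, file

# Rolling file appender writing into the folder created on each node
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/home/myUser/sparkLogs/driver.log
log4j.appender.file.MaxFileSize=10MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n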

After that, when submitting, I pass that file to the driver's JVM with --driver-java-options, and this does the trick. I use this submit command (the log4j file is the same as before):

/usr/bin/spark2-submit \
--driver-java-options "-Dlog4j.configuration=file:///home/myUser/log4j.custom.properties" \
--master yarn --deploy-mode client --driver-memory nG --executor-memory nG \
--executor-cores n /home/myUser/sparkScripts/myCode.py
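
Once the driver JVM is using the custom configuration, the Python script can write through that same log4j logger via the py4j gateway. Below is a minimal sketch of a hypothetical myCode.py; the app name, logger name, and messages are assumptions for illustration:

# Hypothetical myCode.py: writes through the driver JVM's log4j,
# so messages land in the file appender configured above.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("fileLoggerDemo").getOrCreate()

# Grab the JVM-side log4j logger; this runs on the driver only.
log4j = spark._jvm.org.apache.log4j
logger = log4j.LogManager.getLogger("myCode")

logger.info("Driver started, logging to the custom file appender")
# ... your job here ...
logger.info("Driver finished")

spark.stop()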