How to configure the log level of a specific logger using log4j in pyspark?


Question


From this StackOverflow thread, I know how to obtain and use the log4j logger in pyspark like so:

from pyspark import SparkContext

sc = SparkContext()

# Reach into the JVM through the py4j gateway to get the log4j package
log4jLogger = sc._jvm.org.apache.log4j
LOGGER = log4jLogger.LogManager.getLogger('MYLOGGER')
LOGGER.info("pyspark script logger initialized")

This works fine when the script is run with spark-submit.

My question is: how do I configure the log level for this particular logger, either in the log4j.properties file or dynamically at runtime?


Answer 1:


There are other answers on how to configure log4j via the log4j.properties file, but I haven't seen anyone mention how to do it dynamically, so:

from pyspark import SparkContext

sc = SparkContext()
log4jLogger = sc._jvm.org.apache.log4j
LOGGER = log4jLogger.LogManager.getLogger('MYLOGGER')

# Same call as you'd make in Java, just going through the py4j gateway
LOGGER.setLevel(log4jLogger.Level.WARN)

# INFO is below the WARN threshold, so this no longer prints
LOGGER.info("pyspark script logger initialized")
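
For completeness, the static approach: with the Log4j 1.x setup that Spark shipped before version 3.3, a per-logger level can be set in conf/log4j.properties (copy it from log4j.properties.template if it doesn't exist). A minimal sketch, assuming the stock console appender from Spark's template:

# Leave Spark's own logging at its default level
log4j.rootCategory=INFO, console

# Raise the threshold for the custom logger only
log4j.logger.MYLOGGER=WARN

Note that Spark 3.3+ moved to Log4j 2, where the equivalent configuration goes in conf/log4j2.properties with a different syntax (logger.my.name = MYLOGGER, logger.my.level = warn).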


Source: https://stackoverflow.com/questions/41740750/how-to-configure-the-log-level-of-a-specific-logger-using-log4j-in-pyspark
