How to change the log level in Spark?

Submitted by 我是研究僧i on 2020-02-16 10:40:10

Question


I tried all of these methods and nothing works:

In the log4j file:

log4j.logger.org=OFF

log4j.rootCategory=ERROR, console
log4j.rootCategory=OFF, console

In code:

import org.apache.log4j.{Level, Logger}

// Option 1
Logger.getLogger("org.apache.spark").setLevel(Level.OFF)

// Option 2
sparkContext.setLogLevel("OFF")

// Option 3
val rootLogger: Logger = Logger.getRootLogger()
rootLogger.setLevel(Level.OFF)

I also tried placing these calls both before and after creating the SparkContext. Nothing seems to work.
What am I missing? Is there another way to set the log levels?


Answer 1:


You should be able to do it with something like this:

import org.apache.spark.sql.SparkSession;

SparkSession spark = SparkSession.builder().getOrCreate();
spark.sparkContext().setLogLevel("OFF");

https://spark.apache.org/docs/2.3.0/api/java/org/apache/spark/SparkContext.html#setLogLevel-java.lang.String-

Can you share the rest of the code and where you're running it?
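For context, here is a minimal, self-contained Scala equivalent of that call (the appName and master values are placeholders for illustration, not part of the original answer):

import org.apache.spark.sql.SparkSession

object QuietSpark {
  def main(args: Array[String]): Unit = {
    // Placeholder appName/master; drop .master(...) when submitting to a real cluster.
    val spark = SparkSession.builder()
      .appName("quiet-spark")
      .master("local[*]")
      .getOrCreate()

    // Silences Spark's own logging from this point on; messages emitted
    // during startup, before this call runs, still appear on the console.
    spark.sparkContext.setLogLevel("OFF")

    spark.range(5).show()
    spark.stop()
  }
}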




Answer 2:


This should change your log level to OFF, provided you declare it before creating the SparkSession object:

import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.SparkSession

Logger.getLogger("org").setLevel(Level.OFF)

val spark = SparkSession.builder().appName("test").master("local[*]").getOrCreate()
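If other noisy loggers remain (for example Akka in older Spark versions), the same pattern applies; the extra logger name below is an assumption about what is chatty in a given setup, not something this answer prescribes:

import org.apache.log4j.{Level, Logger}

// Silence additional frameworks alongside the "org" logger above,
// placed before the SparkSession is created.
Logger.getLogger("akka").setLevel(Level.OFF)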

Hope this helps!



Source: https://stackoverflow.com/questions/59848344/how-to-change-log-level-in-spark
