Question
I tried all of these methods and nothing works:
In the log4j.properties file:
log4j.logger.org=OFF
log4j.rootCategory=ERROR, console
log4j.rootCategory=OFF, console
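(For reference, a minimal conf/log4j.properties along the lines of Spark's bundled log4j.properties.template, with the console appender actually defined, would look roughly like this:)
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
log4j.logger.org.apache.spark=OFF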
In code:
import org.apache.log4j.{Level, Logger}

// option 1
Logger.getLogger("org.apache.spark").setLevel(Level.OFF)
// option 2
sparkContext.setLogLevel("OFF")
// option 3
val rootLogger: Logger = Logger.getRootLogger()
rootLogger.setLevel(Level.OFF)
And yes, I also tried placing these calls both before and after the SparkContext object is created. Nothing seems to work.
What am I missing? Or is there another way to set the log levels?
Answer 1:
You should be able to do it with something like this:
SparkSession spark = SparkSession.builder().getOrCreate();
spark.sparkContext().setLogLevel("OFF");
https://spark.apache.org/docs/2.3.0/api/java/org/apache/spark/SparkContext.html#setLogLevel-java.lang.String-
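If you're in Scala, as the question is, the equivalent sketch with the same SparkSession/SparkContext API would be:
val spark = SparkSession.builder().getOrCreate()
spark.sparkContext.setLogLevel("OFF")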
Can you share the rest of the code and where you're running it?
Answer 2:
This should change your log level to OFF, provided you set it before the SparkSession object is created:
import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.SparkSession

Logger.getLogger("org").setLevel(Level.OFF)
val spark = SparkSession.builder().appName("test").master("local[*]").getOrCreate()
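A fuller, self-contained sketch combining both approaches (the appName, master, and the small range job are illustrative placeholders, not from the question):
import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.SparkSession

object QuietSpark {
  def main(args: Array[String]): Unit = {
    // Silence log4j before the SparkSession (and its SparkContext) exists,
    // so that startup messages are suppressed too.
    Logger.getLogger("org").setLevel(Level.OFF)
    Logger.getLogger("akka").setLevel(Level.OFF)

    val spark = SparkSession.builder()
      .appName("quiet-example") // illustrative values
      .master("local[*]")
      .getOrCreate()

    // Also set the level on the SparkContext itself for anything logged after startup.
    spark.sparkContext.setLogLevel("OFF")

    spark.range(5).show()
    spark.stop()
  }
}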
Hope this helps!
Source: https://stackoverflow.com/questions/59848344/how-to-change-log-level-in-spark