How to stop INFO messages displaying on spark console?

Backend · unresolved · 20 answers · 3433 views
广开言路 2020-11-22 13:40

I'd like to stop the various messages that appear on the spark shell.

I tried to edit the log4j.properties file in order to stop these messages.

Her

20 Answers
  •  无人共我
    2020-11-22 14:14

    In addition to all the above posts, here is what solved the issue for me.

    Spark uses slf4j to bind to loggers. If log4j is not the first binding found, you can edit the log4j.properties file all you want; the loggers are never actually used. For example, this could be a possible SLF4J output:

    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/C:/Users/~/.m2/repository/org/slf4j/slf4j-simple/1.6.6/slf4j-simple-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/C:/Users/~/.m2/repository/org/slf4j/slf4j-log4j12/1.7.19/slf4j-log4j12-1.7.19.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]

    So here the SimpleLoggerFactory was used, which ignores the log4j settings entirely.
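    A quick way to check which binding actually won is to ask SLF4J itself for its logger factory. This is a sketch you can paste into spark-shell (or any JVM app on the same classpath); `getILoggerFactory` is part of the public SLF4J API:

    // Print the concrete logger factory SLF4J bound to.
    // org.slf4j.impl.Log4jLoggerFactory  -> log4j binding is active, log4j.properties applies
    // org.slf4j.impl.SimpleLoggerFactory -> slf4j-simple won, log4j.properties is ignored
    println(org.slf4j.LoggerFactory.getILoggerFactory.getClass.getName)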

    Excluding the slf4j-simple package from my project via

    <dependency>
        ...
        <exclusions>
            ...
            <exclusion>
                <artifactId>slf4j-simple</artifactId>
                <groupId>org.slf4j</groupId>
            </exclusion>
        </exclusions>
    </dependency>

    resolved the issue: now the log4j logger binding is used, and any setting in log4j.properties is adhered to. F.Y.I., my log4j.properties file contains (besides the normal configuration):

    log4j.rootLogger=WARN, stdout
    ...
    log4j.category.org.apache.spark = WARN
    log4j.category.org.apache.parquet.hadoop.ParquetRecordReader = FATAL
    log4j.additivity.org.apache.parquet.hadoop.ParquetRecordReader=false
    log4j.logger.org.apache.parquet.hadoop.ParquetRecordReader=OFF
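    As a side note, once the log4j binding is active you can also change the level for the current session at runtime from spark-shell, without touching any properties file. `setLogLevel` is part of the public SparkContext API, and `sc` is the SparkContext the shell provides:

    // Silence INFO chatter for this spark-shell session only.
    // Valid levels: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
    sc.setLogLevel("WARN")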
    

    Hope this helps!
