How to set up logging level for Spark application in IntelliJ IDEA?

Asked 2021-02-20 04:25

I'm working on a Scala project in IntelliJ that was created through SBT. The project has Spark as one of its dependencies. I'm still in the development phase, so everything is…

6 Answers
  • 2021-02-20 04:31

    Put your log4j.properties file under a directory marked as resources (e.g. src/main/resources in an SBT project); Spark will read this log4j configuration from the classpath. A sketch of such a file is shown below.
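
    For reference, a minimal src/main/resources/log4j.properties might look like this; the WARN level and the console appender settings are illustrative defaults (adapted from Spark's own log4j.properties.template), not required values:

        # Log everything at WARN and above to the console
        log4j.rootCategory=WARN, console
        log4j.appender.console=org.apache.log4j.ConsoleAppender
        log4j.appender.console.target=System.err
        log4j.appender.console.layout=org.apache.log4j.PatternLayout
        log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

        # Quiet the noisiest Spark packages further if needed
        log4j.logger.org.apache.spark=WARN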

  • 2021-02-20 04:37

    To turn off logging, or set the log level, programmatically in Spark 2.0+:

        import org.apache.log4j.{Level, Logger}

        // Suppress all log output from Spark's own packages
        Logger.getLogger("org.apache.spark").setLevel(Level.OFF)
    
  • 2021-02-20 04:37

    By default, Spark logs almost everything you might want to see. If you need to change the logging behaviour, you can edit log4j.properties in the conf directory of your Apache Spark installation. For a prebuilt version, that is a path like /home/coco/Applications/spark-1.4.0-bin-hadoop2.6/conf. There is a template file, log4j.properties.template, that you have to copy to log4j.properties and then edit to suit your needs; see the example right below. I hope it helps.
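
    In that file, the most common edit is the root logging level. If I remember the template correctly, it ships with:

        log4j.rootCategory=INFO, console

    and changing INFO to WARN (or ERROR) is usually enough to silence the per-task noise:

        log4j.rootCategory=WARN, console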

  • 2021-02-20 04:43

    Setting the log level on the SparkContext worked for me under Eclipse:

        spark.sparkContext.setLogLevel("WARN")
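
    For context, here is a minimal sketch of where that call sits in a full application; the app name and the local master are illustrative choices for IDE development, not requirements:

        import org.apache.spark.sql.SparkSession

        val spark = SparkSession.builder()
          .appName("MySparkApp")   // illustrative name
          .master("local[*]")      // run locally inside the IDE
          .getOrCreate()

        // From this point on, only WARN and above are logged
        spark.sparkContext.setLogLevel("WARN")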
    
  • 2021-02-20 04:44

    If you are developing locally in an IDE, you can change the log level at run time with:

        import org.apache.log4j.{Level, LogManager}

        // Enable every log level on the root logger
        LogManager.getRootLogger.setLevel(Level.ALL)

    P.S.: Put that line after the SparkContext/SQLContext has been created in your code.

  • 2021-02-20 04:48

    I would love to figure out how to do this with a project-local properties file (an example file would be nice), but I was able to get this done in Spark 2.2 with the following code:

        import org.apache.log4j.{Level, Logger}

        object MySparkApp {

          def main(args: Array[String]): Unit = {
            // Silence Spark's INFO output before any Spark work starts
            Logger.getLogger("org.apache.spark").setLevel(Level.WARN)

            // ... rest of the application ...
          }
        }