Configuring Apache Spark Logging with Scala and logback

Submitted by 情到浓时终转凉″ on 2019-12-01 07:43:58

I do not know whether you use sbt or Maven, but that is where it all starts. I use sbt myself, so I will show you how we solved this problem.

1. Apache Spark uses log4j 1.2.xx

That is true, and it is a real problem if you do not want to use the same logging implementation. But there is help!

First, exclude the following libs from spark dependencies:

  • log4j
  • slf4j-log4j12

For sbt (using sbt-assembly) it looks like this:

lazy val spark16 = Seq("spark-core", "spark-sql", "spark-hive")
  .map("org.apache.spark" %% _ % "1.6.1")
  .map(_.excludeAll(
    ExclusionRule(name = "log4j"),
    ExclusionRule(name = "slf4j-log4j12")
  ))
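If you use Maven instead, the equivalent exclusions go on each Spark artifact in your pom.xml. This is only a sketch; the Scala suffix (`_2.11` here) and the version must match your actual build:

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>1.6.1</version>
  <exclusions>
    <!-- keep log4j and its slf4j binding off the classpath -->
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Repeat the same `<exclusions>` block for spark-sql, spark-hive, and any other Spark modules you depend on.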

2. Redirect log4j logging to slf4j

A detailed description can be found here: https://www.slf4j.org/legacy.html
The module of interest is log4j-over-slf4j:

The log4j-over-slf4j module contains replacements of most widely used log4j classes, namely org.apache.log4j.Category, org.apache.log4j.Logger, org.apache.log4j.Priority, org.apache.log4j.Level, org.apache.log4j.MDC, and org.apache.log4j.BasicConfigurator. These replacement classes redirect all work to their corresponding SLF4J classes.

So all log4j calls are redirected back to slf4j, from where any other logging implementation can pick them up.

Easy: simply add this dependency to your application:

"org.slf4j" % "log4j-over-slf4j" % "1.7.25"

3. Add desired logging implementation

In our case it was (like yours) Logback, so we added it as a dependency:

"ch.qos.logback" % "logback-classic" % "1.2.3"

Add a logback.xml configuration to your classpath, for example in src/main/resources, and enjoy!
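A minimal logback.xml could look like the following. This is just a sketch; the pattern and root level are example choices, not requirements:

```xml
<configuration>
  <!-- log everything to the console -->
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>

  <!-- Spark itself is chatty; raise its level if the output is too noisy -->
  <root level="INFO">
    <appender-ref ref="CONSOLE" />
  </root>
</configuration>
```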

spark-submit

If you need help using Logback while deploying your app with spark-submit please follow this answer: https://stackoverflow.com/a/45480145/1549135

BaBa Somanath

I have used the following imports:

import org.slf4j.Logger
import org.slf4j.LoggerFactory

Sample code is shown below:

object SparkCode {

  val logger = LoggerFactory.getLogger(this.getClass.getName)

  def main(args: Array[String]): Unit = {

    logger.info("Connection Started.")

    // Create the SparkContext and go on...
  }
}

And you are sorted.
