Output Spark application id in the logs with Log4j


Question


I have a custom Log4j configuration file for my Spark application. I would like to output the Spark app id along with other attributes like message and date, so the JSON structure would look like this:

{"name":,"time":,"date":,"level":,"thread":,"message":,"app_id":}

Currently, the structure looks like this:

{"name":,"time":,"date":,"level":,"thread":,"message":}

How can I define such a layout for the Spark driver logs?

My log4j file looks like this:

<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j='http://jakarta.apache.org/log4j/'>

    <appender name="Json" class="org.apache.log4j.ConsoleAppender">
        <layout class="org.apache.hadoop.log.Log4Json">
            <param name="ConversionLayout" value=""/>
        </layout>
    </appender>

    <root>
        <level value="INFO"/>
        <appender-ref ref="Json"/>
    </root>
</log4j:configuration>

Answer 1:


I doubt that org.apache.hadoop.log.Log4Json can be adjusted for this purpose. According to its javadoc and source code, that would be rather cumbersome.

Although it looks like you are using Log4j 1.x, its API is quite flexible and we can easily define our own layout by extending org.apache.log4j.Layout.

We'll need a case class that will be transformed into JSON according to the target structure:

case class LoggedMessage(name: String,
                         appId: String,
                         thread: String,
                         time: Long,
                         level: String,
                         message: String)

The Layout can be extended as follows. To access the value of "app_id", we'll use Log4j's Mapped Diagnostic Context (MDC):

import org.apache.log4j.Layout
import org.apache.log4j.spi.LoggingEvent
import org.json4s.DefaultFormats
import org.json4s.native.Serialization.write

class JsonLoggingLayout extends Layout {
  // required by the API
  override def ignoresThrowable(): Boolean = false
  // required by the API
  override def activateOptions(): Unit = { /* nothing */ }

  override def format(event: LoggingEvent): String = {
    // we are using json4s for JSON serialization
    implicit val formats = DefaultFormats

    // retrieve app_id from Mapped Diagnostic Context
    val appId = event.getMDC("app_id") match {
      case null => "[no_app]" // logged messages outside our app
      case defined: AnyRef => defined.toString
    }
    val message = LoggedMessage("TODO",
                                appId,
                                Thread.currentThread().getName,
                                event.getTimeStamp,
                                event.getLevel.toString,
                                event.getMessage.toString)
    write(message) + "\n"
  }

}
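
As a side note, the name field is left as "TODO" above. If you also want the logger name and the human-readable date from the question's target structure, here is a minimal sketch of how the payload could be built. This is my addition, not part of the original answer; getLoggerName and getThreadName are standard Log4j 1.x LoggingEvent accessors:

import java.text.SimpleDateFormat
import java.util.Date

import org.apache.log4j.spi.LoggingEvent

object LoggedMessageBuilder {
  // Extended payload that also carries the "date" field from the
  // question's target structure.
  case class LoggedMessageWithDate(name: String,
                                   appId: String,
                                   thread: String,
                                   time: Long,
                                   date: String,
                                   level: String,
                                   message: String)

  // Note: SimpleDateFormat is not thread-safe; fine for a sketch,
  // but consider a per-call instance in multi-threaded appenders.
  private val dateFormat = new SimpleDateFormat("yyyy-MM-dd")

  // Builds the payload from the logging event itself rather than the
  // current thread, so the logged thread matches the event's origin.
  def from(event: LoggingEvent, appId: String): LoggedMessageWithDate =
    LoggedMessageWithDate(event.getLoggerName,
                          appId,
                          event.getThreadName,
                          event.getTimeStamp,
                          dateFormat.format(new Date(event.getTimeStamp)),
                          event.getLevel.toString,
                          String.valueOf(event.getMessage))
}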

Finally, when the Spark session is created, we put the app_id value into the MDC:

import org.apache.log4j.{Logger, MDC}

// create Spark session

MDC.put("app_id", session.sparkContext.applicationId)

logger.info("-------- this is info --------")
logger.warn("-------- THIS IS A WARNING --------")
logger.error("-------- !!! ERROR !!! --------")
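
For completeness, a minimal runnable driver sketch could look like this; the app name and the local master are placeholders of mine, not part of the original answer:

import org.apache.log4j.{Logger, MDC}
import org.apache.spark.sql.SparkSession

object JsonLoggingDemo {
  private val logger = Logger.getLogger(getClass)

  def main(args: Array[String]): Unit = {
    // placeholder app name and master for local testing
    val session = SparkSession.builder()
      .appName("json-logging-demo")
      .master("local[*]")
      .getOrCreate()

    // make the application id visible to the custom layout via the MDC
    MDC.put("app_id", session.sparkContext.applicationId)

    logger.info("-------- this is info --------")
    logger.warn("-------- THIS IS A WARNING --------")
    logger.error("-------- !!! ERROR !!! --------")

    session.stop()
  }
}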

This produces the following logs:

{"name":"TODO","appId":"local-1550247707920","thread":"main","time":1550247708149,"level":"INFO","message":"-------- this is info --------"}
{"name":"TODO","appId":"local-1550247707920","thread":"main","time":1550247708150,"level":"WARN","message":"-------- THIS IS A WARNING --------"}
{"name":"TODO","appId":"local-1550247707920","thread":"main","time":1550247708150,"level":"ERROR","message":"-------- !!! ERROR !!! --------"}

And, of course, do not forget to reference the implementation in the log4j XML config:

<appender name="Json" class="org.apache.log4j.ConsoleAppender">
  <layout class="stackoverflow.q54706582.JsonLoggingLayout" />
</appender>
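
One practical note, not from the original answer: the layout class must be on the driver's classpath (typically by packaging it in your application jar), and Log4j needs to pick up the custom XML. With Log4j 1.x this is usually done via the standard -Dlog4j.configuration system property, for example by passing --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/path/to/log4j.xml" to spark-submit in client mode, or by shipping the file alongside the app with --files in cluster mode.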


Source: https://stackoverflow.com/questions/54706582/output-spark-application-id-in-the-logs-with-log4j
