Apache Spark logging within Scala

Backend · Unresolved · 7 answers · 2358 views

野趣味 asked 2020-12-12 19:18

I am looking for a way to log additional data when executing code on Apache Spark nodes, which could help investigate issues that might appear later during execution.

7 Answers
  •  余生分开走
    2020-12-12 19:20

    Here is my solution:

    I am using SLF4J (with the Log4j binding). In the base class of every Spark job I have something like this:

    import org.slf4j.LoggerFactory
    val LOG = LoggerFactory.getLogger(getClass)
    

    Just before the place where I use LOG in distributed (closure) code, I copy the logger reference to a local constant, so that the closure captures the local value instead of the enclosing class:

    val LOG = this.LOG
    

    It worked for me!
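    To see why the local copy matters: Spark serializes the closures it ships to executors, and a closure that reads a field of the enclosing class captures `this`, dragging the whole (often non-serializable) driver class into serialization. The sketch below, with hypothetical class names and a plain `String` standing in for the logger, illustrates the capture difference using plain Java serialization; no Spark dependency is needed.

    ```scala
    import java.io._

    // Hypothetical stand-in for a Spark job class; note it is NOT Serializable.
    class MyJob {
      val LOG: String = "logger" // placeholder for an SLF4J Logger field

      // Reading the field inside the lambda makes the closure capture `this`.
      def badClosure: () => Int = () => LOG.length

      // Copying to a local constant first: the closure captures only the local val.
      def goodClosure: () => Int = {
        val log = this.LOG
        () => log.length
      }
    }

    object CaptureDemo {
      // Returns true if the given object survives Java serialization.
      def serializes(f: AnyRef): Boolean =
        try {
          new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(f)
          true
        } catch { case _: NotSerializableException => false }

      def main(args: Array[String]): Unit = {
        val job = new MyJob
        println(serializes(job.badClosure))  // fails: drags in the non-serializable MyJob
        println(serializes(job.goodClosure)) // succeeds: only the captured String is written
      }
    }
    ```

    Note that with a real SLF4J `Logger` the local copy must itself be serializable (or recreated on the executor, e.g. via a `@transient lazy val`), since most logger implementations are not `Serializable` either.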
