Custom log4j appender in spark executor

死守一世寂寞 2021-01-02 18:23

I'm trying to use a custom log4j appender inside the Spark executors, in order to forward all logs to Apache Kafka.

The problem is that log4j is initialized before the fat jar's classpath is loaded, so the custom appender class cannot be found at that point.

3 Answers
  • 2021-01-02 18:50

    Ended up submitting an extra jar with the logging dependencies and loading it before the user classpath.

    LOG_JAR="${THISDIR}/../lib/logging.jar"
    spark-submit ...... \
      --files "${LOG4J_CONF},${LOG_JAR}" \
      --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=`basename ${LOG4J_CONF}`" \
      --conf "spark.driver.extraClassPath=`basename ${LOG_JAR}`" \
      --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=`basename ${LOG4J_CONF}`" \
      --conf "spark.executor.extraClassPath=`basename ${LOG_JAR}`" \
      ...
    

    https://issues.apache.org/jira/browse/SPARK-10881?filter=-2
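
    The snippet above ships a separate log4j config file (`LOG4J_CONF`) whose contents are not shown. As a rough sketch of what that file might look like, here is a hypothetical `log4j.properties` that routes logs to Kafka; the appender class and its property names (`brokerList`, `topic`, `syncSend`) come from the `kafka-log4j-appender` artifact (Kafka 0.9+), and the broker address and topic name are placeholders:

    ```properties
    # Hypothetical log4j.properties passed as LOG4J_CONF above.
    # Broker address and topic name are placeholders; adjust for your cluster.
    log4j.rootLogger=INFO, KAFKA, console

    # Kafka appender (class name from kafka-log4j-appender 0.9+)
    log4j.appender.KAFKA=org.apache.kafka.log4jappender.KafkaLog4jAppender
    log4j.appender.KAFKA.brokerList=localhost:9092
    log4j.appender.KAFKA.topic=spark-executor-logs
    log4j.appender.KAFKA.syncSend=false
    log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
    log4j.appender.KAFKA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n

    # Keep a console appender so logs still show up in the executor stderr
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
    ```

    With this setup, the `logging.jar` submitted via `--files` would need to contain the appender class (and its Kafka client dependencies) so that log4j can load it when it initializes.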

  • 2021-01-02 19:04

    I was facing the same issue; here is what worked for me. It turns out the `KafkaLog4jAppender` class's package name changed in Kafka 0.9. I added the following dependency to my pom:

    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-log4j-appender</artifactId>
        <version>0.9.0.0</version>
    </dependency>
    

    and changed my log4j.properties from

    log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
    

    to

    log4j.appender.KAFKA=org.apache.kafka.log4jappender.KafkaLog4jAppender
    
  • 2021-01-02 19:05

    kafka.producer.KafkaLog4jAppender lives in Kafka's hadoop-producer artifact, so you can add this dependency to fix it:

    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>hadoop-producer</artifactId>
        <version>0.8.0</version>
    </dependency>
    