How to customize logging levels for Cassandra and Spark

Submitted on 2019-12-12 15:42:32

Question


I'm trying to customize the logging coming from com.datastax.driver (the Cassandra driver) and Apache Spark. These libraries print debug logs to the console, and I would like to raise their level to ERROR. I tried log4j with both .properties and .xml configuration, and slf4j (logback) with .xml configuration, but I could not override the configuration for these libraries. I found many discussions and tried many different options. My configuration looks like this:

log4j.xml

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd" >
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/" debug="false">
    <appender name="console" class="org.apache.log4j.ConsoleAppender">
        <layout class="org.apache.log4j.PatternLayout">
            <param name="ConversionPattern" value="%5p\t[%d] [%t] %c{3} (%F:%L)     \t%m%n" />
        </layout>
    </appender>
    <category name="c.d.driver">
        <level value="ERROR" />
        <appender-ref ref="console" />
    </category>
    <logger name="com.datastax.driver">
        <level value="fatal" />
        <!-- <appender-ref ref="console" /> -->
    </logger>
    <logger name="org.apache.spark">
        <level value="ERROR" />
        <appender-ref ref="console" />
    </logger>
    <logger name="o.a.s.s">
        <priority value="off" />
        <!-- <appender-ref ref="console" /> -->
    </logger>
    <root>
        <level value="INFO" />
        <appender-ref ref="console" />
    </root>
</log4j:configuration>
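For comparison, the same levels can also be expressed in properties form. This is a minimal sketch of an equivalent log4j.properties for log4j 1.x, mirroring the console appender and the two fully-qualified logger names above (note that log4j matches loggers by full package prefix, so abbreviated names like `c.d.driver` would not match `com.datastax.driver`):

```properties
log4j.rootLogger=INFO, console

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%5p\t[%d] [%t] %c{3} (%F:%L)\t%m%n

# Quiet the external libraries
log4j.logger.com.datastax.driver=ERROR
log4j.logger.org.apache.spark=ERROR
```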

Is there any solution for this problem?

UPDATE: I achieved what I wanted, but I'm genuinely confused about why it works. First I tried log4j in my application with the configuration described above, and it didn't work. Then I switched to slf4j with this configuration:

logback.xml

<configuration debug="true">
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <logger name="org.apache.spark" level="error">
    <appender-ref ref="STDOUT" />
  </logger>
  <root level="ERROR">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>

This didn't work either. Since both Spark and the Datastax driver use log4j, I went back to log4j: I removed the slf4j loggers from my code but left the logback configuration and its Maven dependency in place. Now my log4j.xml configuration is initialized explicitly in code via DOMConfigurator, yet logback.xml gets loaded as well, without ever being loaded explicitly. The Spark and Datastax loggers specified in that file are applied this way. What I don't understand is why it didn't work for the external libraries back when I loaded it myself with JoranConfigurator: it only affected my own code, not those libraries. So in the end, the solution that works for me looks like this:

  • log4j configuration (log4j.xml), loaded explicitly by the application, applies to my own code
  • slf4j configuration (logback.xml), which is never loaded explicitly by my application, applies its rules to Apache Spark and the Datastax driver for Cassandra
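One plausible reading of the split described above (an assumption on my part, not confirmed in the post): Spark and the Datastax driver log through the slf4j API, and slf4j routes those calls to whichever binding it finds on the classpath. With logback-classic present, the libraries' output goes to logback, which auto-loads logback.xml at startup, while code that calls the log4j API directly is governed by log4j.xml. If one wanted everything governed by log4j.xml instead, a sketch of the Maven change would be to swap the binding (version numbers here are illustrative):

```xml
<!-- Sketch: route slf4j-based libraries to log4j 1.2 instead of logback.
     Remove or exclude ch.qos.logback:logback-classic from the classpath,
     then add the slf4j-to-log4j binding. Version is illustrative. -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  <version>1.7.12</version>
</dependency>
```

With only one slf4j binding on the classpath, a single log4j.xml would then control both the application and the libraries.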

It would be great if somebody could explain why this happened.

Source: https://stackoverflow.com/questions/28471550/how-to-customize-logging-levels-for-cassandra-and-spark
