How to prevent log bombing using logback?

Submitted by 独自空忆成欢 on 2019-12-12 08:11:49

Question


I'm not sure the term "bombing" is the right one. By bombing, I mean a log message emitted many times with the same content (message and arguments).

For example, a denial-of-service attack might cause a log warning that an unauthenticated user is trying to access some API. Log:

[03-29 11:26:01.000] missing token
[03-29 11:26:01.001] missing token
[03-29 11:26:01.005] missing token
... overall 100000 times

The problem I'm trying to tackle is preventing the log from growing very large. Size is a problem, and other important messages might be missed because of the overwhelming number of repeated messages.

I'd like to be able to prevent such a log bombing and have some aggregated message like:

[03-29 11:26:01.000] missing token
[03-29 11:26:06.000] missing token [silenced. Overall 100000 times]

So, I'm looking for 2 features here:

  1. Silence overly repetitive logs.
  2. Show a summary of the silenced logs.

Any idea if and how this can be done using logback? Perhaps another logging tool supports this? Thanks.


Answer 1:


The DuplicateMessageFilter, which filters out exact duplicate messages, might fit your first requirement. Currently, the filter stops all duplicate messages after a specified threshold (via the allowedRepetitions setting), which might not be desirable. You would have to extend the filter if you preferred the duplicate count to reset on new messages or after a period of time. It does not, however, provide a summary of the silenced logs.

Example logback config:

<configuration>

  <turboFilter class="ch.qos.logback.classic.turbo.DuplicateMessageFilter" allowedRepetitions="2"/>

  <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date [%thread] %-5level %logger - %msg%n</pattern>
    </encoder>
  </appender>

  <root level="INFO">
    <appender-ref ref="console" />
  </root>  
</configuration>
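If the fixed repetition threshold is too coarse, the time-based reset described above can be prototyped outside of logback first. Below is a minimal stdlib sketch of the counting logic only (the class and method names are mine, not a logback API): a message passes through only if it was last seen more than windowMillis ago.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of time-window duplicate suppression: a message is allowed
// through only if it was not seen within the last windowMillis.
public class TimeWindowDeduper {
    private final long windowMillis;
    private final Map<String, Long> lastSeen = new ConcurrentHashMap<>();

    public TimeWindowDeduper(long windowMillis) {
        this.windowMillis = windowMillis;
    }

    // Records the message as seen now, and returns true if it should be
    // logged (first occurrence, or last seen longer than windowMillis ago).
    public boolean shouldLog(String message, long nowMillis) {
        Long previous = lastSeen.put(message, nowMillis);
        return previous == null || nowMillis - previous > windowMillis;
    }
}
```

To plug this into logback you would wrap the same logic in a custom TurboFilter subclass; the map should also be purged periodically so it doesn't grow without bound.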



Answer 2:


I think I would prefer to solve the underlying issue that causes the logs to blow up, rather than silencing the indication of the issue.

You can configure a separate logger for the class that produces the offending messages and roll it independently of the main log.

  <!-- Turn on debug logging for our HikariCP connection pool. -->
  <logger name="com.zaxxer.hikari" level="DEBUG" />

  <!-- Turn on debug logging for all loggers under com.stubbornjava -->
  <logger name="com.stubbornjava" level="DEBUG" />

https://www.stubbornjava.com/posts/logging-in-java-with-slf4j-and-logback
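The snippet above only sets log levels; to actually roll the noisy logger's output independently, you would also route it to its own appender. A hedged sketch (the logger name, file paths, and size limits are illustrative, not from the answer):

```xml
<!-- Route the noisy logger to its own rolling file, detached from the root log. -->
<appender name="noisy" class="ch.qos.logback.core.rolling.RollingFileAppender">
  <file>logs/noisy.log</file>
  <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
    <fileNamePattern>logs/noisy.%d{yyyy-MM-dd}.%i.log</fileNamePattern>
    <maxFileSize>10MB</maxFileSize>
    <maxHistory>3</maxHistory>
  </rollingPolicy>
  <encoder>
    <pattern>%date [%thread] %-5level %logger - %msg%n</pattern>
  </encoder>
</appender>

<!-- additivity="false" keeps these messages out of the root appenders. -->
<logger name="com.example.NoisyClass" level="WARN" additivity="false">
  <appender-ref ref="noisy" />
</logger>
```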




Answer 3:


I created a generic DuplicateActionFilterByInsertTime class with a small implementation for logging.

Example usage:

DuplicateLogFilter logSilencer = new DuplicateLogFilter(10000); // 10-second filter
Logger logger = Logger.getLogger(DuplicateActionFilterByInsertTimeTest.class.getName());

logSilencer.log(new LoggerMessage(logger, Level.INFO, "Hello {0}", new Object[]{"Alik"}));

DuplicateLogFilter:

class DuplicateLogFilter extends DuplicateActionFilterByInsertTime<LoggerMessage> {
    DuplicateLogFilter(int filterMillis) {
        super(filterMillis);
        addListener(new DuplicateActionFilterByInsertTime.Listener<LoggerMessage>() {
            @Override
            public void onFilteringFinished(FilteredItem<LoggerMessage> filteredItem) {
                filteredItem.getItem().getLogger().log(Level.INFO, filteredItem.getItem().getMessage() + ". Filtered. Overall " + filteredItem.getSilenceInfo().getCount() + " messages", filteredItem.getItem().getParams());
            }

            @Override
            public void onFilteringStarted(LoggerMessage loggerMessage) {
                loggerMessage.getLogger().log(Level.INFO, loggerMessage.getMessage() + ". Filtering duplicate logs...", loggerMessage.getParams());
            }
        });

    }
    void log(LoggerMessage loggerMessage) {
        run(loggerMessage);
    }
}
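The LoggerMessage class isn't shown in the answer. Since DuplicateActionFilterByInsertTime keys its ConcurrentHashMap on the item and calls run() on the first occurrence, it presumably implements Runnable and defines equals/hashCode on the message content. A hypothetical sketch:

```java
import java.util.Arrays;
import java.util.Objects;
import java.util.logging.Level;
import java.util.logging.Logger;

// Hypothetical LoggerMessage: the filter keys its map on this object and calls
// run() for the first occurrence, so equals/hashCode must identify duplicates
// (here: same level, message template, and params; the logger is ignored).
public class LoggerMessage implements Runnable {
    private final Logger logger;
    private final Level level;
    private final String message;
    private final Object[] params;

    public LoggerMessage(Logger logger, Level level, String message, Object[] params) {
        this.logger = logger;
        this.level = level;
        this.message = message;
        this.params = params;
    }

    public Logger getLogger() { return logger; }
    public String getMessage() { return message; }
    public Object[] getParams() { return params; }

    @Override
    public void run() {
        logger.log(level, message, params);
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof LoggerMessage)) return false;
        LoggerMessage other = (LoggerMessage) o;
        return level.equals(other.level)
                && message.equals(other.message)
                && Arrays.equals(params, other.params);
    }

    @Override
    public int hashCode() {
        return Objects.hash(level, message, Arrays.hashCode(params));
    }
}
```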

The base class that actually does all the filtering, `DuplicateActionFilterByInsertTime`:

public class DuplicateActionFilterByInsertTime<E extends Runnable> {

    private static final Logger LOGGER = Logger.getLogger(DuplicateActionFilterByInsertTime.class.getName());

    private final long filterMillis;

    private final ConcurrentHashMap<E, FilterInfoImpl> actionMap = new ConcurrentHashMap<>();

    private final ConcurrentLinkedQueue<E> actionQueue = new ConcurrentLinkedQueue<>();

    private final ScheduledExecutorService scheduledExecutorService = Executors.newSingleThreadScheduledExecutor();

    private final AtomicBoolean purgerRegistered = new AtomicBoolean(false);

    private final Set<Listener<E>> listeners = ConcurrentHashMap.newKeySet();

    public DuplicateActionFilterByInsertTime(int filterMillis) {
        this.filterMillis = filterMillis;
    }

    public FilterInfo get(E e) {
        FilterInfoImpl insertionData = actionMap.get(e);
        if (insertionData == null || insertionData.isExpired(filterMillis)) {
            return null;
        }
        return insertionData;
    }

    public boolean run(E e) {
        actionMap.computeIfPresent(e, (e1, insertionData) -> {
            int count = insertionData.incrementAndGet();
            if (count == 2) {
                notifyFilteringStarted(e1);
            }
            return insertionData;
        });
        boolean isNew = actionMap.computeIfAbsent(e, e1 -> {
            FilterInfoImpl insertionData = new FilterInfoImpl();
            actionQueue.add(e1);
            return insertionData;
        }).getCount() == 1;

        tryRegisterPurger();

        if (isNew) {
            e.run();
        }
        return isNew;
    }

    private void tryRegisterPurger() {
        if (actionMap.size() != 0 && purgerRegistered.compareAndSet(false, true)) {
            scheduledExecutorService.schedule(() -> {
                try {
                    for (Iterator<E> iterator = actionQueue.iterator(); iterator.hasNext(); ) {
                        E e = iterator.next();
                        FilterInfoImpl insertionData = actionMap.get(e);
                        if (insertionData == null || insertionData.isExpired(filterMillis)) {
                            iterator.remove();
                        }
                        if (insertionData != null && insertionData.isExpired(filterMillis)) {
                            FilterInfoImpl removed = actionMap.remove(e);
                            FilteredItem<E> filteredItem = new FilteredItem<>(e, removed);
                            notifyFilteringFinished(filteredItem);
                        } else {
                            // All the elements that were left shouldn't be purged.
                            break;
                        }
                    }
                } finally {
                    purgerRegistered.set(false);
                    tryRegisterPurger();
                }
            }, filterMillis, TimeUnit.MILLISECONDS);
        }
    }

    private void notifyFilteringFinished(FilteredItem<E> filteredItem) {
        new Thread(() -> listeners.forEach(l -> {
            try {
                l.onFilteringFinished(filteredItem);
            } catch (Exception e) {
                LOGGER.log(Level.WARNING, "Purge notification failed. Continuing to next one (if exists)", e);
            }
        })).start();
    }

    private void notifyFilteringStarted(final E e) {
        new Thread(() -> listeners.forEach(l -> {
            try {
                l.onFilteringStarted(e);
            } catch (Exception e1) {
                LOGGER.log(Level.WARNING, "Filtering started notification failed. Continuing to next one (if exists)", e1);
            }
        })).start();
    }

    public void addListener(Listener<E> listener) {
        listeners.add(listener);
    }

    public void removeListener(Listener<E> listener) {
        listeners.remove(listener);
    }

    public interface FilterInfo {
        long getInsertTimeMillis();

        int getCount();
    }

    public interface Listener<E> {
        void onFilteringStarted(E e);
        void onFilteringFinished(FilteredItem<E> filteredItem);
    }

    private static class FilterInfoImpl implements FilterInfo {
        private final long insertTimeMillis = System.currentTimeMillis();
        private AtomicInteger count = new AtomicInteger(1);

        int incrementAndGet() {
            return count.incrementAndGet();
        }

        @Override
        public long getInsertTimeMillis() {
            return insertTimeMillis;
        }

        @Override
        public int getCount() {
            return count.get();
        }

        boolean isExpired(long expirationMillis) {
            return insertTimeMillis + expirationMillis < System.currentTimeMillis();
        }
    }

    public static class FilteredItem<E> {
        private final E item;
        private final FilterInfo filterInfo;

        FilteredItem(E item, FilterInfo filterInfo) {
            this.item = item;
            this.filterInfo = filterInfo;
        }

        public E getItem() {
            return item;
        }

        public FilterInfo getFilterInfo() {
            return filterInfo;
        }
    }
}

Source.



Source: https://stackoverflow.com/questions/49551480/how-to-prevent-log-bombing-using-logback
