11

I'm not sure "bombing" is the right term. By bombing, I mean the same log message (message and args) being emitted many times.

For example, a denial-of-service attack might cause a warning that an unauthenticated user is trying to access some API to be logged repeatedly:

[03-29 11:26:01.000] missing token
[03-29 11:26:01.001] missing token
[03-29 11:26:01.005] missing token
... overall 100000 times

The problem I'm trying to tackle is preventing the log from becoming very large. Size is one problem; another is that other important messages might not be seen because of the overwhelming number of repetitions.

I'd like to be able to prevent such log bombing and get some aggregated message like:

[03-29 11:26:01.000] missing token
[03-29 11:26:06.000] missing token [silenced. Overall 100000 times]

So, I'm looking for 2 features here:

  1. Silence overly repetitive logs.
  2. Show a summary of the silenced logs.

Any idea if and how this can be done using logback? Or perhaps another logging tool supports this? Thanks.

AlikElzin-kilaka
  • You can configure logback to roll the logs and zip them, so that you won't run out of disk space. https://examples.javacodegeeks.com/enterprise-java/logback/logback-rollingfileappender-example/ – alex440 Mar 29 '18 at 08:41
  • What about fixing the issue, rather than fixing the log configuration? – Victor Gubin Mar 29 '18 at 08:45
  • @alex440 The downsides to that: (1) a flood of duplicate messages could obscure legitimate messages and cause their deletion (the files containing them being removed during rollover); (2) it still incurs unnecessary disk writes for the duplicate messages. – tony19 Mar 29 '18 at 09:08
  • @VictorGubin I'd agree, assuming the code could be modified and the fix were feasible. Minimizing the duplicate log noise might be the first step to troubleshooting the actual problem and determining the proper fix. – tony19 Mar 29 '18 at 09:23

3 Answers

7

The DuplicateMessageFilter, which filters out exact duplicate messages, might fit your first requirement. Currently, the filter stops all duplicate messages after a specified threshold (via the allowedRepetitions setting), which might not be desirable. You would have to extend the filter if you wanted the duplicate count to reset on new messages or after a time window. It does not, however, provide a summary of the silenced logs.

Example logback config:

<configuration>

  <!-- Allow a message to repeat allowedRepetitions times; deny further duplicates -->
  <turboFilter class="ch.qos.logback.classic.turbo.DuplicateMessageFilter" allowedRepetitions="2"/>

  <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date [%thread] %-5level %logger - %msg%n</pattern>
    </encoder>
  </appender>

  <root level="INFO">
    <appender-ref ref="console" />
  </root>  
</configuration>
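
If you wanted the duplicate count to reset after a time window instead, one option is a custom turbo filter. Below is a minimal sketch, not part of logback (`TimedDuplicateMessageFilter` is a hypothetical name); like `DuplicateMessageFilter`, it keys on the raw message format, before parameter substitution:

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

import org.slf4j.Marker;

import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.turbo.TurboFilter;
import ch.qos.logback.core.spi.FilterReply;

public class TimedDuplicateMessageFilter extends TurboFilter {

    private static class Entry {
        final long firstSeenMillis = System.currentTimeMillis();
        final AtomicInteger count = new AtomicInteger(1);
    }

    // Note: entries are never evicted in this sketch; a bounded or
    // self-cleaning cache would be needed in production.
    private final ConcurrentHashMap<String, Entry> cache = new ConcurrentHashMap<>();

    private int allowedRepetitions = 5;
    private long windowMillis = 10_000;

    @Override
    public FilterReply decide(Marker marker, Logger logger, Level level,
                              String format, Object[] params, Throwable t) {
        if (format == null) {
            return FilterReply.NEUTRAL;
        }
        Entry entry = cache.compute(format, (key, existing) -> {
            if (existing == null
                    || System.currentTimeMillis() - existing.firstSeenMillis > windowMillis) {
                return new Entry(); // window expired: start a fresh count
            }
            existing.count.incrementAndGet();
            return existing;
        });
        // Allow the first occurrence plus allowedRepetitions duplicates per window.
        return entry.count.get() > allowedRepetitions + 1 ? FilterReply.DENY : FilterReply.NEUTRAL;
    }

    // Setters let the properties be configured from logback.xml.
    public void setAllowedRepetitions(int allowedRepetitions) {
        this.allowedRepetitions = allowedRepetitions;
    }

    public void setWindowMillis(long windowMillis) {
        this.windowMillis = windowMillis;
    }
}

It would be registered like any other turbo filter:

<turboFilter class="com.example.TimedDuplicateMessageFilter">
  <allowedRepetitions>2</allowedRepetitions>
  <windowMillis>10000</windowMillis>
</turboFilter>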
tony19
  • I just quickly checked the source of `DuplicateMessageFilter` on [github](https://github.com/qos-ch/logback/blob/master/logback-classic/src/main/java/ch/qos/logback/classic/turbo/DuplicateMessageFilter.java) and could not find any information about a time frame. So in my opinion it would simply drop all messages after the threshold is hit. The OP should provide a custom implementation based on this filter. There is also [a similar filter](http://logback.10977.n7.nabble.com/DuplicateMessageFilter-time-based-duplication-td13059.html) using Guava and time-based eviction as the cache implementation. – Franz Fellner Mar 29 '18 at 09:22
  • That's a good point. Indeed, as implemented, all subsequent duplicates are dropped once the threshold is hit, which might not be desirable in the OP's case. The filter should be extended as you suggest, although I think this is a bug in the filter. – tony19 Mar 29 '18 at 09:32
  • The downside of `DuplicateMessageFilter` is that it filters out all duplicates indiscriminately. I'd like more control over what gets filtered, for example per logger instance or logger name. – AlikElzin-kilaka Mar 31 '18 at 10:06
2

I created a generic `DuplicateActionFilterByInsertTime` class, with a small logging-specific implementation on top of it.

Example usage:

DuplicateLogFilter logSilencer = new DuplicateLogFilter(10000); // 10-second filter window
Logger logger = Logger.getLogger(DuplicateActionFilterByInsertTimeTest.class.getName());

logSilencer.log(new LoggerMessage(logger, Level.INFO, "Hello {0}", new Object[]{"Alik"}));

DuplicateLogFilter:

class DuplicateLogFilter extends DuplicateActionFilterByInsertTime<LoggerMessage> {
    DuplicateLogFilter(int filterMillis) {
        super(filterMillis);
        addListener(new DuplicateActionFilterByInsertTime.Listener<LoggerMessage>() {
            @Override
            public void onFilteringFinished(FilteredItem<LoggerMessage> filteredItem) {
                LoggerMessage item = filteredItem.getItem();
                item.getLogger().log(Level.INFO, item.getMessage() + ". Filtered. Overall "
                        + filteredItem.getFilterInfo().getCount() + " messages", item.getParams());
            }

            @Override
            public void onFilteringStarted(LoggerMessage loggerMessage) {
                loggerMessage.getLogger().log(Level.INFO, loggerMessage.getMessage() + ". Filtering duplicate logs...", loggerMessage.getParams());
            }
        });

    }
    void log(LoggerMessage loggerMessage) {
        run(loggerMessage);
    }
}
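
`LoggerMessage` isn't shown in this answer; here is a minimal sketch consistent with the usage above. It must implement `Runnable` (the base class runs the first occurrence) and implement `equals`/`hashCode` over the log content so that duplicates map to the same key:

import java.util.Arrays;
import java.util.Objects;
import java.util.logging.Level;
import java.util.logging.Logger;

// Hypothetical sketch of the LoggerMessage value object used above.
class LoggerMessage implements Runnable {
    private final Logger logger;
    private final Level level;
    private final String message;
    private final Object[] params;

    LoggerMessage(Logger logger, Level level, String message, Object[] params) {
        this.logger = logger;
        this.level = level;
        this.message = message;
        this.params = params;
    }

    Logger getLogger() { return logger; }
    String getMessage() { return message; }
    Object[] getParams() { return params; }

    @Override
    public void run() {
        // Emit the first occurrence; subsequent duplicates are filtered.
        logger.log(level, message, params);
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof LoggerMessage)) return false;
        LoggerMessage that = (LoggerMessage) o;
        return logger.equals(that.logger) && level.equals(that.level)
                && message.equals(that.message) && Arrays.equals(params, that.params);
    }

    @Override
    public int hashCode() {
        return Objects.hash(logger, level, message) * 31 + Arrays.hashCode(params);
    }
}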

Base class that actually does all the filtering, `DuplicateActionFilterByInsertTime`:

import java.util.Iterator;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.logging.Level;
import java.util.logging.Logger;

public class DuplicateActionFilterByInsertTime<E extends Runnable> {

    private static final Logger LOGGER = Logger.getLogger(DuplicateActionFilterByInsertTime.class.getName());

    private final long filterMillis;

    private final ConcurrentHashMap<E, FilterInfoImpl> actionMap = new ConcurrentHashMap<>();

    private final ConcurrentLinkedQueue<E> actionQueue = new ConcurrentLinkedQueue<>();

    private final ScheduledExecutorService scheduledExecutorService = Executors.newSingleThreadScheduledExecutor();

    private final AtomicBoolean purgerRegistered = new AtomicBoolean(false);

    private final Set<Listener<E>> listeners = ConcurrentHashMap.newKeySet();

    public DuplicateActionFilterByInsertTime(int filterMillis) {
        this.filterMillis = filterMillis;
    }

    public FilterInfo get(E e) {
        FilterInfoImpl insertionData = actionMap.get(e);
        if (insertionData == null || insertionData.isExpired(filterMillis)) {
            return null;
        }
        return insertionData;
    }

    public boolean run(E e) {
        actionMap.computeIfPresent(e, (e1, insertionData) -> {
            int count = insertionData.incrementAndGet();
            if (count == 2) {
                notifyFilteringStarted(e1);
            }
            return insertionData;
        });
        boolean isNew = actionMap.computeIfAbsent(e, e1 -> {
            FilterInfoImpl insertionData = new FilterInfoImpl();
            actionQueue.add(e1);
            return insertionData;
        }).getCount() == 1;

        tryRegisterPurger();

        if (isNew) {
            e.run();
        }
        return isNew;
    }

    private void tryRegisterPurger() {
        if (actionMap.size() != 0 && purgerRegistered.compareAndSet(false, true)) {
            scheduledExecutorService.schedule(() -> {
                try {
                    for (Iterator<E> iterator = actionQueue.iterator(); iterator.hasNext(); ) {
                        E e = iterator.next();
                        FilterInfoImpl insertionData = actionMap.get(e);
                        if (insertionData == null || insertionData.isExpired(filterMillis)) {
                            iterator.remove();
                        }
                        if (insertionData != null && insertionData.isExpired(filterMillis)) {
                            FilterInfoImpl removed = actionMap.remove(e);
                            FilteredItem<E> filteredItem = new FilteredItem<>(e, removed);
                            notifyFilteringFinished(filteredItem);
                        } else {
                            // All the elements that were left shouldn't be purged.
                            break;
                        }
                    }
                } finally {
                    purgerRegistered.set(false);
                    tryRegisterPurger();
                }
            }, filterMillis, TimeUnit.MILLISECONDS);
        }
    }

    private void notifyFilteringFinished(FilteredItem<E> filteredItem) {
        new Thread(() -> listeners.forEach(l -> {
            try {
                l.onFilteringFinished(filteredItem);
            } catch (Exception e) {
                LOGGER.log(Level.WARNING, "Purge notification failed. Continuing to next one (if exists)", e);
            }
        })).start();
    }

    private void notifyFilteringStarted(final E e) {
        new Thread(() -> listeners.forEach(l -> {
            try {
                l.onFilteringStarted(e);
            } catch (Exception e1) {
                LOGGER.log(Level.WARNING, "Filtering started notification failed. Continuing to next one (if exists)", e1);
            }
        })).start();
    }

    public void addListener(Listener<E> listener) {
        listeners.add(listener);
    }

    public void removeListener(Listener<E> listener) {
        listeners.remove(listener);
    }

    public interface FilterInfo {
        long getInsertTimeMillis();

        int getCount();
    }

    public interface Listener<E> {
        void onFilteringStarted(E e);
        void onFilteringFinished(FilteredItem<E> filteredItem);
    }

    private static class FilterInfoImpl implements FilterInfo {
        private final long insertTimeMillis = System.currentTimeMillis();
        private AtomicInteger count = new AtomicInteger(1);

        int incrementAndGet() {
            return count.incrementAndGet();
        }

        @Override
        public long getInsertTimeMillis() {
            return insertTimeMillis;
        }

        @Override
        public int getCount() {
            return count.get();
        }

        boolean isExpired(long expirationMillis) {
            return insertTimeMillis + expirationMillis < System.currentTimeMillis();
        }
    }

    public static class FilteredItem<E> {
        private final E item;
        private final FilterInfo filterInfo;

        FilteredItem(E item, FilterInfo filterInfo) {
            this.item = item;
            this.filterInfo = filterInfo;
        }

        public E getItem() {
            return item;
        }

        public FilterInfo getFilterInfo() {
            return filterInfo;
        }
    }
}

Source.

AlikElzin-kilaka
  • Related: [Is there an expiring map in Java that expires elements after a period of time since *first* insertion?](https://stackoverflow.com/a/49598330/435605) – AlikElzin-kilaka Apr 01 '18 at 13:15
0

I think I would prefer to solve the underlying issue that causes the log to blow up, rather than silencing the indication of that issue.

You can configure a separate logger for the class that produces the offending messages and roll it independently of the main log:

  <!-- Turn on debug logging for our HikariCP connection pool. -->
  <logger name="com.zaxxer.hikari" level="DEBUG" />

  <!-- Turn on debug logging for all loggers under com.stubbornjava -->
  <logger name="com.stubbornjava" level="DEBUG" />

https://www.stubbornjava.com/posts/logging-in-java-with-slf4j-and-logback
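
To keep the noisy output from crowding out the main log entirely, the noisy logger can be routed to its own rolling file. A sketch (the logger name, file names, and limits are examples, not from the linked article):

<!-- Dedicated, independently rolled (and capped) file for the noisy logger -->
<appender name="NOISY" class="ch.qos.logback.core.rolling.RollingFileAppender">
  <file>logs/noisy.log</file>
  <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
    <fileNamePattern>logs/noisy.%d{yyyy-MM-dd}.%i.log.gz</fileNamePattern>
    <maxFileSize>10MB</maxFileSize>
    <maxHistory>3</maxHistory>
  </rollingPolicy>
  <encoder>
    <pattern>%date [%thread] %-5level %logger - %msg%n</pattern>
  </encoder>
</appender>

<!-- additivity="false" keeps the noisy messages out of the root appenders -->
<logger name="com.example.ApiAuthFilter" level="WARN" additivity="false">
  <appender-ref ref="NOISY" />
</logger>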

alex440