
Application Logback configuration -

<appender name="stash"
    class="net.logstash.logback.appender.LogstashAccessTcpSocketAppender">
    <destination>localhost:5001</destination>

    <!-- encoder is required -->
    <encoder>
        <pattern>%d{dd/MM/YY HH:mm:ss.SSS} - %-5level[%-5thread] -  %logger{32} - %msg%n</pattern>
    </encoder>
</appender>

The Logstash input is the TCP plugin and the output is Elasticsearch. Initially the Logstash server is down while the application generates logs continuously. When viewed in Kibana, no new logs appear. After some time, Logstash is started. Now, when the logs are viewed in Kibana, it seems that all the logs generated while Logstash was down have been flushed to ES and can be viewed.

I checked `ss | grep 5001` while the Logstash server was down; port 5001 was in the CLOSE-WAIT state and the socket queues were empty.

What can be the reason for this?

Cœur
Arijeet Saha

1 Answer

The appender net.logstash.logback.appender.LogstashAccessTcpSocketAppender extends [net.logstash.logback.appender.AbstractLogstashTcpSocketAppender](https://github.com/logstash/logstash-logback-encoder/blob/master/src/main/java/net/logstash/logback/appender/AbstractLogstashTcpSocketAppender.java), which has an internal ring buffer that buffers log events. The buffering is what makes the appender non-blocking: without it, the appender would block your application threads while writing events to the TCP socket. While Logstash is down, events accumulate in this buffer; once the connection is re-established, the buffered events are written to the socket, which is why they later show up in Kibana.
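As a minimal sketch of the non-blocking buffering idea, here is a plain `ArrayBlockingQueue` standing in for the appender's ring buffer (the real implementation is based on the LMAX Disruptor; the class and names below are purely illustrative, not the library's API):

```java
import java.util.concurrent.ArrayBlockingQueue;

// Simplified model of a bounded, non-blocking event buffer: events are
// offered without blocking the caller; when the buffer is full, new
// events are simply dropped instead of stalling the application.
public class DropWhenFull {
    public static void main(String[] args) {
        // Tiny capacity for demonstration; the appender's default is much larger.
        ArrayBlockingQueue<String> buffer = new ArrayBlockingQueue<>(4);
        int dropped = 0;
        for (int i = 0; i < 10; i++) {
            // offer() returns false instead of blocking when the buffer is full
            if (!buffer.offer("event-" + i)) {
                dropped++;
            }
        }
        System.out.println("buffered=" + buffer.size() + " dropped=" + dropped);
        // prints: buffered=4 dropped=6
    }
}
```

The key design point is the same as in the appender: the logging call never waits on the network, so a slow or dead TCP peer cannot back-pressure the application.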

The ring buffer holds 8192 events by default. If the buffer fills up before the events can be sent to the socket, the appender starts dropping events. The buffer size, along with many other properties, can be configured on the appender.
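For example, the buffer can be enlarged in the appender configuration via the `ringBufferSize` property (a sketch based on the question's own config; the value here is illustrative, and must be a power of two):

```xml
<appender name="stash"
    class="net.logstash.logback.appender.LogstashAccessTcpSocketAppender">
    <destination>localhost:5001</destination>

    <!-- enlarge the event buffer (default 8192 events) so that more
         events survive a Logstash outage before dropping begins -->
    <ringBufferSize>16384</ringBufferSize>

    <encoder>
        <pattern>%d{dd/MM/YY HH:mm:ss.SSS} - %-5level[%-5thread] -  %logger{32} - %msg%n</pattern>
    </encoder>
</appender>
```

A larger buffer only delays dropping; if Logstash stays down long enough, events beyond the buffer's capacity will still be lost.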

lapaczo
  • 746
  • 7
  • 10
tautonen
  • 183
  • 2
  • 10