Here is the practice we use in some of our Java-based stacks; it has several advantages, using Apache Kafka as the intermediate data pipe and Logstash as the data-ingestion pipeline.
First, remove the default logging provider from your Spring Boot application in pom.xml, namely spring-boot-starter-logging (which pulls in Logback via logback-classic). Then add Log4j2 as the new logging provider, along with the Kafka appender dependency. After adding the dependencies, you need an XML configuration file where you put your Kafka appender settings. By default, Log4j2 looks for this file on the classpath, so place it under src/main/resources and name it "log4j2.xml".
There are many other Log4j2 appenders, such as the Cassandra or Failover appenders, which you can add alongside the Kafka appender in the same configuration file. A working example follows.
<!-- excluding the default Logback provider -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-logging</artifactId>
        </exclusion>
        <exclusion>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<!-- adding Log4j2 and the Kafka appender -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-log4j2</artifactId>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-log4j-appender</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
    </exclusions>
</dependency>
Kafka appender configuration (src/main/resources/log4j2.xml)
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="info" name="kafka-appender" packages="Citydi.ElasticDemo">
    <Appenders>
        <!-- the topic attribute must match the Kafka topic you create below -->
        <Kafka name="kafkaLogAppender" topic="Second-Topic">
            <JSONLayout/>
            <Property name="bootstrap.servers">localhost:9092</Property>
            <MarkerFilter marker="Recorder" onMatch="DENY" onMismatch="ACCEPT"/>
        </Kafka>
        <Console name="stdout" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{HH:mm:ss.SSS} stdout %highlight{%-5p} [%-7t] %F:%L - %m%n"/>
            <MarkerFilter marker="Recorder" onMatch="DENY" onMismatch="ACCEPT"/>
        </Console>
    </Appenders>
    <Loggers>
        <Root level="INFO">
            <AppenderRef ref="kafkaLogAppender"/>
            <AppenderRef ref="stdout"/>
        </Root>
        <!-- keep the Kafka client's own logs quiet to avoid recursive logging through the appender -->
        <Logger name="org.apache.kafka" level="warn"/>
    </Loggers>
</Configuration>
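Before moving on, here is a minimal sketch of how a class would log through this configuration (the class name is hypothetical; any component logging via SLF4J/Log4j2 behaves the same). Events carrying the "Recorder" marker are dropped by both appenders because of the MarkerFilter; everything else at INFO and above goes to the Kafka topic as JSON and to stdout:
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.Marker;
import org.apache.logging.log4j.MarkerManager;

// Hypothetical demo class for illustration only
public class KafkaLoggingDemo {

    private static final Logger LOGGER = LogManager.getLogger(KafkaLoggingDemo.class);
    private static final Marker RECORDER = MarkerManager.getMarker("Recorder");

    public static void main(String[] args) {
        // Shipped to the Kafka topic (as JSON) and to stdout
        LOGGER.info("Application started, this event is shipped to Kafka");

        // Carries the "Recorder" marker, so both MarkerFilters DENY it
        LOGGER.info(RECORDER, "This event is filtered out by both appenders");
    }
}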
Start ZooKeeper
./zookeeper-server-start.sh ../config/zookeeper.properties
Start the Kafka broker
./kafka-server-start.sh ../config/server.properties
Create the topic
./kafka-topics.sh --create --topic test-topic --zookeeper localhost:2181 --replication-factor 1 --partitions 4
Start a console consumer on the created topic to verify that log events arrive
./kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test-topic --from-beginning
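If you tail the topic with the console consumer above, each log event serialized by JSONLayout looks roughly like this (the exact set of fields varies with the Log4j2 version):
{
  "instant" : { "epochSecond" : 1609459200, "nanoOfSecond" : 0 },
  "thread" : "main",
  "level" : "INFO",
  "loggerName" : "KafkaLoggingDemo",
  "message" : "Application started, this event is shipped to Kafka",
  "endOfBatch" : false,
  "threadId" : 1,
  "threadPriority" : 5
}
This is why the Logstash input below uses codec => json: each record is parsed into fields before being indexed.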
Then point the Kafka appender's topic attribute at the topic you created (the topic name is up to you), and create a Logstash pipeline such as the configuration below to ingest your logs into your desired Elasticsearch index.
input {
    kafka {
        group_id => "35834"
        topics => ["yourtopicname"]
        bootstrap_servers => "localhost:9092"
        codec => json
    }
}
filter {
}
output {
    # optional: also write a copy of the events to a local file
    file {
        path => "C:/somedirectory"
    }
    elasticsearch {
        hosts => ["localhost:9200"]
        # document_type is deprecated on Elasticsearch 7+; drop this line on recent versions
        document_type => "_doc"
        index => "yourindexname"
    }
    stdout {
        codec => rubydebug
    }
}
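Assuming you saved the pipeline above as kafka-pipeline.conf (the file name is arbitrary), run Logstash with it:
bin/logstash -f kafka-pipeline.conf
Once it is running, every log event published by the appender is consumed from Kafka, parsed by the json codec, and indexed into Elasticsearch.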