
Let's say we have a Kubernetes cluster (in production) that logs to Logstash.

We want a specific segment of the logs to be sent to our remote Splunk machine (a VM).

The design is to add a Splunk forwarder that will collect the logs and send them onward.

We could just run the Splunk forwarder in a Docker container and expose it as a service inside the k8s cluster.

But I wish to decouple the log collector from the Splunk forwarder and run a much lighter log collector like rsyslog or syslog-ng in a sidecar container that shares a volume with the Splunk forwarder (or any other forwarder that might replace it in the future).
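To make the intent concrete, the pod layout I have in mind is roughly the following (a minimal sketch; the image names and mount paths are placeholders, not a tested setup):

```yaml
# Sketch: the app writes logs to a shared emptyDir volume, a lightweight
# collector sidecar reads them, and a forwarder sidecar ships them out.
# All image names and paths below are placeholders.
apiVersion: v1
kind: Pod
metadata:
  name: app-with-log-sidecars
spec:
  volumes:
    - name: logs
      emptyDir: {}
  containers:
    - name: app
      image: my-app:latest                  # placeholder
      volumeMounts:
        - name: logs
          mountPath: /var/log/app
    - name: log-collector
      image: balabit/syslog-ng:latest       # or an rsyslog image
      volumeMounts:
        - name: logs
          mountPath: /var/log/app
          readOnly: true
    - name: splunk-forwarder
      image: splunk/universalforwarder:latest
      volumeMounts:
        - name: logs
          mountPath: /var/log/app
          readOnly: true
```

The shared emptyDir is what lets the forwarder be swapped later without touching the app container.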

Searching for a solution, I found some projects (listed below) that seem not yet production-ready.

Decoupling the log collector seemed like a trivial design pattern to me, but given the lack of a stable official solution, I'm starting to wonder whether there is a reason for that.

On the other hand, maybe more modern (and less lightweight) solutions like Fluentd took over and left the legacy syslog solutions behind.

Any ideas?


Rsyslog-docker projects:

https://github.com/rsyslog/rsyslog-docker

https://github.com/jumanjihouse/docker-rsyslog

https://github.com/deoren/rsyslog-docker

https://github.com/camptocamp/docker-rsyslog-bin/blob/master/Dockerfile

https://github.com/megastef/rsyslog-logsene

Syslog-ng-docker projects:

https://github.com/mumblepins-docker/syslog-ng-alpine

https://hub.docker.com/r/balabit/syslog-ng/ <--- Looks like the most stable solution.

  • syslog-ng can send logs directly to Splunk HEC using its HTTP driver: https://www.syslog-ng.com/community/b/blog/posts/sending-logs-splunk-http A few tips: https://www.syslog-ng.com/community/b/blog/posts/optimize-your-splunk-infrastructure-using-new-syslog-ng-features – MrAnno Aug 04 '19 at 13:16
  • Thanks @MrAnno, but the main focus of my question is how to run syslog-ng in a container (a production-ready solution). – Rot-man Aug 04 '19 at 13:18
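Following MrAnno's comment, a syslog-ng destination for Splunk HEC via the HTTP driver would look roughly like this (an untested sketch; the URL, token, and source name are placeholders):

```
# Untested sketch of a syslog-ng http() destination pointing at Splunk HEC.
# The HEC URL and token are placeholders; s_local is an assumed source.
destination d_splunk_hec {
  http(
    url("https://splunk.example.com:8088/services/collector/event")
    method("POST")
    headers("Authorization: Splunk <hec-token>")
    body('{"event": "${MESSAGE}"}')
  );
};

log {
  source(s_local);
  destination(d_splunk_hec);
};
```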

2 Answers


A little late, but maybe this will help someone.

You can use a Logstash sidecar to forward logs to Splunk over HTTP (Splunk HEC). The Logstash container can share a volume with another container and follow the log file.

For each new log line, it forwards the event to Splunk. Here is an example logstash.conf file (Logstash's input & output configuration):

input {
  file {
    path => "${LOGS_FILE_PATH}"
    start_position => "beginning"
  }
}

output {
  http {
    http_method => "post"
    url => "http://XX.XX.XX.XX:8088/services/collector/event"
    headers => ["Authorization", "Splunk XXXXX-XXXX-XXXXX-XXXXX"]
    mapping => {
      "event" => "%{[message]}"
    }
  }
}

For more information about configuring Logstash: https://www.elastic.co/guide/en/logstash/current/docker-config.html

Sariel

RtmY, I recently went through a similar situation, and syslog was limited when it came to Docker and Kubernetes integration.

Since you mentioned considering alternatives, you should take a look at Fluent Bit: it is light (450 KB of memory required) and scalable. It does not cover a huge set of plugins, but most of those available are enough for a range of situations.

Some default configuration sets available in the official documentation can help you bootstrap.
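As an illustration (an untested sketch; the path, host, and token are placeholders), a minimal Fluent Bit configuration that tails a log file and ships it to Splunk HEC could look like:

```
# Untested sketch: tail a log file and forward events to Splunk HEC.
# Path, Host, and Splunk_Token below are placeholders.
[INPUT]
    Name   tail
    Path   /var/log/app/*.log

[OUTPUT]
    Name          splunk
    Match         *
    Host          splunk.example.com
    Port          8088
    Splunk_Token  <hec-token>
    TLS           On
```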

rsilva
  • Fluent Bit doesn't correlate API logs and application logs. This means that if you forward logs directly from Fluent Bit, you won't have cluster details in your application logs. – Ali Okan Yüksel Jan 30 '21 at 20:41