2

Log aggregation tools like the ELK stack seem to be the de facto solution in the microservices monitoring space. Microservices write their logs to files, which are then collected and forwarded by collector agents on the host machine.

To be honest, I don't see many benefits in this model. Log files are not confidentially managed, and they may get lost or be manipulated on their way.

Using a dedicated logging microservice API to collect logs would provide all the benefits of well-defined, secure communication and data protection, without the overhead of configuring a log aggregation tool.
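What I have in mind is nothing more than an HTTP endpoint that services POST structured events to, instead of writing to local files for an agent to pick up. A rough sketch of the idea (the /logs path, payload shape, and file sink are only placeholders, not an existing service):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class LogIngestHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/logs":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        try:
            event = json.loads(self.rfile.read(length))
        except json.JSONDecodeError:
            self.send_error(400, "body must be a JSON log event")
            return
        # A real service would authenticate the caller and write to durable
        # storage; appending to a local file is just a stand-in here.
        with open("ingested.log", "a") as sink:
            sink.write(json.dumps(event) + "\n")
        self.send_response(202)
        self.end_headers()

if __name__ == "__main__":
    # TLS (the "secure communication" part) is assumed to be terminated in
    # front of this server, e.g. by a reverse proxy.
    HTTPServer(("0.0.0.0", 8080), LogIngestHandler).serve_forever()
```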

Why should I use a log aggregation tool instead of a dedicated logging service?

Tuomas Toivonen
  • 21,690
  • 47
  • 129
  • 225
  • Wouldn't your custom logging service just be like a custom logstash-plugin that you could deploy as part of the ELK stack? And have you read the existing material on ELK stack security? For example https://www.elastic.co/what-is/elastic-stack-security – Oswin Noetzelmann Jan 01 '23 at 02:53

1 Answer

1

In addition to what @Oswin has commented above, the concerns about losing log files or having events manipulated in transit or at rest apply equally to your logging microservice. Essentially you would be hand-writing the code that Logstash/Filebeat already provide, whereas the whole Elastic Stack can be configured for SSL/TLS fairly easily.
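For example, securing the Filebeat-to-Elasticsearch hop is mostly configuration; the hostnames, certificate paths, and credentials below are placeholders for illustration:

```yaml
# filebeat.yml (illustrative values only)
filebeat.inputs:
  - type: filestream
    paths:
      - /var/log/myservice/*.log

output.elasticsearch:
  hosts: ["https://es.example.internal:9200"]
  username: "filebeat_writer"
  password: "${FILEBEAT_ES_PASSWORD}"
  ssl:
    certificate_authorities: ["/etc/filebeat/ca.crt"]
    certificate: "/etc/filebeat/filebeat.crt"
    key: "/etc/filebeat/filebeat.key"
```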

As far as log file manipulation is concerned, access rights are generally managed through RBAC, so I would suggest reviewing that as well. Rest assured, Filebeat and Logstash only open files in read-only mode, and events are only modified by the processors/pipelines you define before being sent to Elasticsearch for ingestion.

When you talk about missing log files, that can also be caused by container evictions/terminations, in which case even your logging microservice won't be able to access them.

Ayush
  • 326
  • 1
  • 5