I'm working on adding log analytics for a simple application. The application is a Windows Service hosted with NSSM, and its logs go to an ELK stack (Elasticsearch + Kibana). Unfortunately, the app does not use a logging framework; it logs to the console, and NSSM captures stdout/stderr and writes the log files. The problem is that multi-line logs get mangled: NSSM splits the output at carriage returns and stamps each line with its own timestamp, so a single multi-line statement becomes several separate log entries, which makes the logs very noisy and fragmented in Kibana.
This is an example of the logs:
[2/1/2022 10:15:39 PM] Host initialized (4321ms)
[2/1/2022 10:15:40 PM] Host started (4342ms)
[2/1/2022 10:15:40 PM] Job host started
Listening on http://localhost:7071/
Hit CTRL-C to exit...
[2/1/2022 10:15:57 PM] Executing HTTP request: {
[2/1/2022 10:15:57 PM] "requestId": "[OMITTED]",
[2/1/2022 10:15:57 PM] "method": "GET",
[2/1/2022 10:15:57 PM] "uri": "/"
[2/1/2022 10:15:57 PM] }
As the example shows, a single multi-line log statement is split into separate entries, each carrying its own timestamp. Any suggestions on how to make the data less fragmented and format it so that it appears cleaner in Kibana?
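For illustration, here is a rough Python sketch of the kind of re-merging I'd like to end up with. The continuation heuristic (a line with no timestamp prefix, or whose message starts with a quote, a closing brace, or whitespace, belongs to the previous event) is purely my assumption, not anything NSSM or ELK defines:

```python
import re

# NSSM prefixes every captured stdout line with a timestamp such as
# "[2/1/2022 10:15:57 PM] ". This regex splits that prefix off.
PREFIX = re.compile(
    r"^\[(\d{1,2}/\d{1,2}/\d{4} \d{1,2}:\d{2}:\d{2} [AP]M)\] (.*)$"
)

def merge_fragments(lines):
    """Re-join fragmented output into (timestamp, message) events.

    Continuation heuristic (an assumption of mine): a line with no
    timestamp prefix, or whose message starts with '"', '}', or a
    space, is glued onto the previous event.
    """
    events = []
    for line in lines:
        m = PREFIX.match(line)
        ts, msg = m.groups() if m else (None, line)
        continuation = bool(events) and (ts is None or msg[:1] in ('"', '}', ' '))
        if continuation:
            prev_ts, prev_msg = events[-1]
            events[-1] = (prev_ts, prev_msg + "\n" + msg)
        else:
            events.append((ts, msg))
    return events
```

Running this over the five "Executing HTTP request" lines above would collapse them back into one event with a single timestamp, which is roughly what I want to see in Kibana.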
The following solutions come to mind:
- Use a standard logging framework like log4net that lets us specify log severity and keep multi-line output in a single log statement, and configure an appender to write log files directly instead of going through NSSM's stdout capture.
- Use a regex in the Filebeat multiline config to fix the multi-line issue. The problem is that I don't have a consistent pattern to match on.
- Use Logstash for data processing?
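To frame the second option: Filebeat does support multiline settings on a log input, and a sketch might look like the following (the path is hypothetical). But since NSSM stamps every captured line, a "lines without a timestamp are continuations" pattern like this wouldn't rejoin the JSON body in my example, which is exactly why I'm stuck on the pattern:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - 'C:\nssm\logs\*.log'   # hypothetical path to the NSSM-rotated logs
    # Any line NOT starting with "[<date> " is treated as a continuation
    # of the previous line. This catches "Listening on ..." etc., but not
    # the JSON body, because NSSM timestamps those lines individually.
    multiline.pattern: '^\[\d{1,2}/\d{1,2}/\d{4} '
    multiline.negate: true
    multiline.match: after
```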
Any help would be appreciated. Thanks!