I want to centralize logging on my servers using syslog-ng, which writes a JSON-formatted line to a file; that file is in turn picked up by Logstash, which forwards the events to Elasticsearch. This setup works, except for some JSON-specific issues.
In syslog-ng, I format the log as JSON via a destination stanza:
destination d_json { file("/var/log/all_syslog_in_json.log" perm(0666) template("{\"@timestamp\": \"$ISODATE\", \"facility\": \"$FACILITY\", \"priority\": \"$PRIORITY\", \"level\": \"$LEVEL\", \"tag\": \"$TAG\", \"host\": \"$HOST\", \"program\": \"$PROGRAM\", \"message\": \"$MSG\"}\n")); };
This usually works fine, but sometimes the JSON ends up malformed, for instance because of quotes already present in $MSG.
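The failure mode is easy to reproduce outside syslog-ng. A minimal Python sketch (the message text is a made-up example) contrasting naive string templating, as in the destination stanza above, with proper JSON serialization:

```python
import json

# Hypothetical syslog message containing a double quote, e.g. from an
# application that logs user input verbatim.
msg = 'user said "hello"'

# Naive templating, like the template() stanza above: the quote inside
# $MSG is copied into the JSON string unescaped, producing invalid JSON.
naive_line = '{"message": "%s"}' % msg
try:
    json.loads(naive_line)
    broken = False
except json.JSONDecodeError:
    broken = True
print(broken)  # True: the embedded quote breaks the document

# Proper serialization escapes the quote (\" in the output), so the
# line round-trips cleanly.
safe_line = json.dumps({"message": msg})
print(json.loads(safe_line)["message"] == msg)  # True
```

This is exactly what happens to the file Logstash reads: one unescaped quote in $MSG and the whole line fails to parse.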
Is there a better way to format the message? I was looking at the built-in json-parser, but it seems to require key-value pairs as input, whereas I want to explode the available fields into a JSON entry.
EDIT & SOLUTION:
I found the exact solution on Dustin Oprea's blog:
destination d_json { file("/tmp/test.json" template("$(format-json --scope selected_macros --scope nv_pairs)\n")); };
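The key difference is that $(format-json ...) escapes special characters in each macro's value, so every line in the file is a valid JSON document. A quick Python sketch of the check that Logstash's JSON parsing effectively performs on each line it consumes (the field names and values below are illustrative, not actual syslog-ng output):

```python
import json

# Illustrative lines in the style format-json emits: one JSON document
# per line, with quotes inside values properly escaped as \".
lines = [
    '{"PROGRAM": "sshd", "MESSAGE": "Accepted publickey for root"}',
    '{"PROGRAM": "app", "MESSAGE": "user said \\"hello\\""}',
]

# Each line parses independently, even with embedded quotes, which is
# what the hand-written template in the question could not guarantee.
for line in lines:
    event = json.loads(line)  # would raise if the line were malformed
    print(event["PROGRAM"])
```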