
I am using Bunyan and bunyan-lumberjack to send my logs to Logstash and index them in Elasticsearch. The problem I am facing is when I am filtering the logs. I am using a basic Logstash filter:

filter {
  if [type] == "json" {
    json {
      source => "message"
    }
  }
}

That puts the JSON from Bunyan into the _source.message field and indexes it in Elasticsearch. How can I index every field from Bunyan into its own Elasticsearch field so I can search over it or use it in Kibana?

I am attaching what I obtain now and what I want to obtain as an example. Currently:

{
  "_index": "logstash-2015.10.26",
  "_type": "json",
  "_id": "AVCjvDHWHiX5VLMgQZIC",
  "_score": null,
  "_source": {
    "message": "{\"name\":\"myLog\",\"hostname\":\"atnm-4.local\",\"pid\":6210,\"level\":\"error\",\"message\":\"This should work!\",\"@timestamp\":\"2015-10-26T10:40:29.503Z\",\"tags\":[\"bunyan\"],\"source\":\"atnm-4.local/node\"}",
    "@version": "1",
    "@timestamp": "2015-10-26T10:40:31.184Z",
    "type": "json",
    "host": "atnm-4.local",
    "bunyanLevel": "50"
  },

Wanted:

{
  "_index": "logstash-2015.10.26",
  "_type": "json",
  "_id": "AVCjvDHWHiX5VLMgQZIC",
  "_score": null,
  "_source": {
    "message": {
      "name": example,
      "hostname": example,
      "etc": example
  • It seems that by adding the codec as json in lumberjack everything will work well: codec => json. But I don't think I can still index syslogs, right? – alexsc Oct 26 '15 at 10:58
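
For reference, the change described in the comment lives on the input side rather than in a filter. A minimal sketch, assuming bunyan-lumberjack ships to a Logstash lumberjack input (the port and SSL paths are placeholders, not taken from the question):

input {
  lumberjack {
    port            => 5000                        # placeholder port
    ssl_certificate => "/etc/logstash/server.crt"  # placeholder path
    ssl_key         => "/etc/logstash/server.key"  # placeholder path
    codec           => "json"   # parse each bunyan event as JSON on arrival
    type            => "json"
  }
}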

1 Answer


Each input in Logstash can have a different codec and type. In your case, since you want to index both bunyan and syslog, you'll have two inputs with two different types. The syslog input will use the "plain" codec, the bunyan input the "json" codec. You do not need any filter for the bunyan messages: the JSON will be parsed and the fields will appear automagically. You will, however, need a filter to parse the syslog input.
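
A minimal sketch of such a configuration (the ports, SSL paths, grok pattern, and Elasticsearch address are illustrative assumptions, not taken from the answer):

input {
  lumberjack {
    port            => 5000                        # placeholder port for bunyan-lumberjack
    ssl_certificate => "/etc/logstash/server.crt"  # placeholder path
    ssl_key         => "/etc/logstash/server.key"  # placeholder path
    codec           => "json"   # bunyan events are JSON, so they are parsed on input
    type            => "json"
  }
  tcp {
    port  => 5514               # placeholder port for syslog traffic
    codec => "plain"            # leave the raw line in the message field
    type  => "syslog"
  }
}

filter {
  # Only the syslog lines need parsing; the bunyan fields already exist.
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]  # placeholder Elasticsearch address
  }
}

With this setup the bunyan fields (name, hostname, pid, level, and so on) arrive as top-level fields in the event, so they are indexed individually and can be searched or used in Kibana without any further filtering.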