
there is a file that has entries like this:

2016-01-22 10:01:44.043, cache.read, 93.67088

2016-01-22 10:01:44.043, cache.size, 79

<timestamp>, <metric>, <value>

There are thousands of metrics. Can somebody guide me through creating a Logstash filter that parses these entries and pushes them to Elasticsearch?
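For reference, each entry splits on the first two commas into the three fields above. A quick Python sketch of the target fields (the sample line and metric name are illustrative, not Logstash code):

```python
from datetime import datetime

# A sample line in the format described above (metric name is illustrative)
line = "2016-01-22 10:01:44.043, cache.read, 93.67088"

# Split on the first two commas and trim the surrounding spaces
timestamp_str, metric, value = (part.strip() for part in line.split(",", 2))

# The timestamp is ISO8601-like with millisecond precision
timestamp = datetime.strptime(timestamp_str, "%Y-%m-%d %H:%M:%S.%f")

print(timestamp, metric, float(value))
```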

user1471980
  • Have you met the grok debugger or read any information on learning to use grok? http://svops.com/blog/introduction-to-logstash-grok-patterns/ – Alain Collins Jan 29 '16 at 00:00

1 Answer


You can use the grok and kv filters to achieve that. The filter will look something like this:

filter {
  # Capture the leading timestamp; everything after the first comma
  # goes into a temporary "metrics" field.
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:timestamp},%{SPACE}%{GREEDYDATA:metrics}"
    }
  }

  # Parse the captured timestamp into @timestamp, then drop the raw field.
  date {
    match => [ "timestamp", "ISO8601" ]
    remove_field => [ "timestamp" ]
  }

  # Split "metric, value" into a field named after the metric.
  kv {
    source => "metrics"
    value_split => ","
    trim => " "
    remove_field => [ "metrics", "message" ]
  }
}
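To sanity-check the filter, here is a rough Python simulation of what the grok and kv steps do to one sample line (the regex, line, and field names are illustrative, not Logstash internals; the real date filter would additionally set `@timestamp`):

```python
import re

# Illustrative sample line in the format from the question
line = "2016-01-22 10:01:44.043, cache.read, 93.67088"

# grok step: capture the ISO8601-like timestamp; the rest becomes "metrics"
m = re.match(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+),\s*(?P<metrics>.*)",
    line,
)
event = m.groupdict()

# kv step with value_split => ",": split "metrics" into key and value,
# trimming spaces, then remove the temporary field
key, _, value = event.pop("metrics").partition(",")
event[key.strip()] = value.strip()

print(event)
# event == {'timestamp': '2016-01-22 10:01:44.043', 'cache.read': '93.67088'}
```

So each metric name ends up as its own field on the event, which is what you want before indexing into Elasticsearch.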
jijinp