
I am trying to import data from MySQL into Elasticsearch using Logstash. Everything works fine and all the data is imported correctly. However, one of the fields in MySQL, called "metadata", follows the pattern "firstname_lastname_yyyy-MM-dd HH:mm:ss"; for example, one of the values it may take is "Mark_Karlos_2018-02-23 15:19:55". At the moment this field is imported into Elasticsearch as it is. What I want is to have it as three fields in Elasticsearch: "first_name", "last_name", and "time". Can this be done with the Logstash config file? If not, is there any other way to do it?

1 Answer


You can use the grok filter:

grok {
    match => { "metadata" => "%{GREEDYDATA:first_name}_%{GREEDYDATA:last_name}_%{TIMESTAMP_ISO8601:time}" }
}
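
A quick way to verify the pattern is a throwaway pipeline that reads lines from stdin and prints the parsed event. This is only a testing sketch; note that the match is on `message` here, because that is the field where the stdin input puts each line (your real pipeline would keep matching on `metadata`):

input { stdin {} }
filter {
    grok {
        match => { "message" => "%{GREEDYDATA:first_name}_%{GREEDYDATA:last_name}_%{TIMESTAMP_ISO8601:time}" }
    }
}
output { stdout { codec => rubydebug } }

Feeding it `Mark_Karlos_2018-02-23 15:19:55` should print an event with `first_name`, `last_name`, and `time` fields.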

To help you with the grok filter:

• The official documentation

• The list of existing patterns

• A tool to test your patterns

– baudsp
  • Thank you, it works! But what if the datetime has a custom format? Say my field looks like "firstname_lastname_yyyyMMdd_HHmmss", so, using the same example from the question, "Mark_Karlos_20180223_151955". How can I modify the last part to read this "custom" datetime format? – m.alsioufi Mar 02 '18 at 09:17
  • You can use `GREEDYDATA` for the time, but then make the name patterns non-greedy so they don't swallow the extra underscore, like this: `%{DATA:first_name}_%{DATA:last_name}_%{GREEDYDATA:time}` – baudsp Mar 02 '18 at 10:29
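
If the captured `time` string should also become a real date, the date filter can parse the custom format. A minimal sketch, assuming the format of the example value `20180223_151955` corresponds to the Joda-style format string `yyyyMMdd_HHmmss` (my reading of the example, not something stated in the question):

filter {
    grok {
        # DATA is non-greedy, so the name captures stop at the first underscores,
        # leaving the "20180223_151955" part intact for the time field.
        match => { "metadata" => "%{DATA:first_name}_%{DATA:last_name}_%{GREEDYDATA:time}" }
    }
    # Parse the captured string into a proper date, stored back in "time".
    date {
        match => ["time", "yyyyMMdd_HHmmss"]
        target => "time"
    }
}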