
I want to create the ES index based on the dates matched from the logfile. I am using the Logstash CSV filter to process the logs. For instance, the log data appears like below:

2016-02-21 00:02:32.238,123.abc.com,data
2016-02-22 00:04:40.145,345.abc.com,data

Below is the Logstash configuration file. Obviously the index will be created as testlog; however, I want the indices to be created as testlog-2016.02.21 and testlog-2016.02.22, given that YYYY.mm.dd is Logstash's preferred format for index dates. I have done this with grok filters, and I am trying to achieve the same with csv, but this doesn't seem to work.

filter {
  csv {
    columns => [ "timestamp", "host", "data" ]
    separator => ","
    remove_field => ["message"]
  }
}
output {
    elasticsearch {
            hosts => ["localhost:9200"]
            index => "testlog"
    }
}

We are on Logstash 2.1.0, ES 2.1.0, and Kibana 4.3.0.

Any input is appreciated.

Ganga

1 Answer


You need to populate the @timestamp field with a date filter, and you also need to specify your index name as below:

filter {
  csv {
    columns => [ "timestamp", "host", "data" ]
    separator => ","
    remove_field => ["message"]
  }

  date {
    match => [ "timestamp", "ISO8601" ] # parse the timestamp column
    timezone => "UTC"                   # specify the timezone
    target => "@timestamp"              # write the result into @timestamp
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "testlog-%{+YYYY.MM.dd}"
  }
}
Teddy Ma