
Currently

I have completed the above task by using one log file and passing the data with Logstash to one index in Elasticsearch:

yellow open logstash-2016.10.19 5 1 1000807 0 364.8mb 364.8mb

What I actually want to do

If I have the following log files, which are named according to year, month, and date:

MyLog-2016-10-16.log
MyLog-2016-10-17.log
MyLog-2016-10-18.log
MyLog-2016-11-05.log
MyLog-2016-11-02.log
MyLog-2016-11-03.log

I would like to tell Logstash to read them by year, month, and date and create the following indexes:

yellow open MyLog-2016-10-16.log
yellow open MyLog-2016-10-17.log
yellow open MyLog-2016-10-18.log
yellow open MyLog-2016-11-05.log
yellow open MyLog-2016-11-02.log
yellow open MyLog-2016-11-03.log

Could I please have some guidance on how to go about doing this?

Thank you

adz
3 Answers


It is as simple as this:

 output {
   elasticsearch {
      hosts => ["localhost:9200"]
      # %{+YYYY-MM-dd} formats @timestamp using Joda-Time patterns;
      # note the lowercase "dd" -- uppercase "DD" means day-of-year.
      # Elasticsearch index names must be lowercase, hence "mylog".
      index => "mylog-%{+YYYY-MM-dd}.log"
   }
 }
Renaud Michotte

If the lines in the file contain datetime information, you should use the date{} filter to set @timestamp from that value. If you do this, you can use the output format that @Renaud provided, "MyLog-%{+YYYY.MM.dd}".
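As a sketch, assuming each line begins with an ISO-style timestamp such as 2016-10-16 08:15:00 (the grok pattern and date format below are illustrative and must be adapted to your actual log layout), the filter might look like:

filter {
  grok {
    # capture the leading timestamp into a "logdate" field
    match => { "message" => "%{TIMESTAMP_ISO8601:logdate} %{GREEDYDATA:msg}" }
  }
  date {
    # parse logdate and set @timestamp from it, so the
    # elasticsearch output's %{+YYYY.MM.dd} reflects the event's
    # date rather than the time Logstash processed the line
    match => [ "logdate", "yyyy-MM-dd HH:mm:ss" ]
  }
}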

If the lines don't contain the datetime info, you can use the input's path for your index name, e.g. "%{path}". To get just the basename of the path:

mutate {
    gsub => [ "path", ".*/", "" ]
}
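Putting the pieces together, a minimal sketch of a full pipeline (assuming the file input's default path field, and an illustrative log directory) might be:

input {
  file {
    path => "/var/log/MyLog-*.log"
  }
}
filter {
  mutate {
    # strip the directory so "path" holds just "MyLog-2016-10-16.log"
    gsub => [ "path", ".*/", "" ]
    # Elasticsearch index names must be lowercase, so lowercase the field
    lowercase => [ "path" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # one index per source file, e.g. "mylog-2016-10-16.log"
    index => "%{path}"
  }
}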
Alain Collins

Won't this configuration in the output section be sufficient for your purpose?

output {
    elasticsearch {
        # note: embedded/host/port/protocol/cluster are options of the
        # Logstash 1.x elasticsearch output; on Logstash 2.x and later,
        # use hosts => ["localhost:9200"] instead
        embedded => false
        host => "localhost"
        port => 9200
        protocol => "http"
        cluster => "elasticsearch"
        index => "syslog-%{+YYYY.MM.dd}"
    }
}
akshaya pandey