
My log files are named with the following convention:

adminPortal-2021-10-10.0.log
adminPortal-2021-10-27.0.log

I need to publish them to different indices matching the log file date, but currently the logs from all files are published into a single index. This is my output configuration:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "admin-%{+YYYY-MM-dd}"
  }
}
  • To which index does this save the records? The date and time used here is the current time, so if your records ended up in the index `admin-2021-11-02`, or whatever day you ran this on, then you have your answer. If you want the date from the log file name, you should use [this](https://stackoverflow.com/questions/22916200/logstash-how-to-add-file-name-as-a-field) (see the sketch after these comments). – Filip Nov 02 '21 at 12:13
  • Hi sir, these are my old logs that I added manually, but all the logs from both files went into one index (named with today's date). I want each index to match the log file date. Thanks. – Ali farahzadi Nov 02 '21 at 14:32
  • I have no idea what you just said, sorry. – Filip Nov 02 '21 at 17:55
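A minimal sketch of the file-name approach Filip links to, assuming the logs are read with Logstash's file input (where the file name is available in the `path` field; with Filebeat it would be `[log][file][path]` instead) and that `file_date` is just an illustrative field name:

filter {
  # Pull the date out of a file name such as adminPortal-2021-10-10.0.log
  grok {
    match => { "path" => "adminPortal-(?<file_date>%{YEAR}-%{MONTHNUM}-%{MONTHDAY})\.%{INT}\.log" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # The index name now comes from the file name, not from @timestamp
    index => "admin-%{file_date}"
  }
}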

1 Answer


A sprintf reference to a date, like %{+YYYY-MM-dd}, always uses the value of the @timestamp field. If you want it to use the value from the log entry, you will need to parse the timestamp out of the [message] field, possibly using grok, and then parse that with a date filter to overwrite the default value of the @timestamp field (which is Time.now).
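A minimal sketch of such a filter, assuming each log line starts with a timestamp such as `2021-10-10 12:34:56.789` (the exact layout of [message] is an assumption) and that `log_timestamp` is just a temporary field name:

filter {
  # Capture the leading timestamp of the log line into a temporary field
  grok {
    match => { "message" => "^(?<log_timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME})" }
  }
  # Overwrite @timestamp with the parsed value, so the sprintf reference
  # %{+YYYY-MM-dd} in the output resolves to the event's own date
  date {
    match => [ "log_timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"
  }
}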

  • Exactly right, so my problem is with the timestamp parse. What is wrong with this code, sir? `grok { match => { "message" => "(?%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}) %{LOGLEVEL:logLevel} %{JAVACLASS:className} \[(?[A-Za-z0-9_-]+)\] %{GREEDYDATA:API_request_and_response}" } } date { match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ] target => "@timestamp" }` – Ali farahzadi Nov 03 '21 at 08:55
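For readability, the filter from that comment reformatted; the comment lost the names of its custom capture groups when it was posted, so `timestamp` is restored from the date filter's reference to it, and `threadName` is an assumed placeholder:

filter {
  grok {
    # The two (?<...>) names below are reconstructed; the original comment
    # dropped them, and threadName in particular is only a guess
    match => { "message" => "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}) %{LOGLEVEL:logLevel} %{JAVACLASS:className} \[(?<threadName>[A-Za-z0-9_-]+)\] %{GREEDYDATA:API_request_and_response}" }
  }
  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"
  }
}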