Logstash noob here. I am trying to get log lines like the following filtered through Logstash:

2015-03-31 02:53:39 INFO  This is info message 5

The config file that I am using is this:

input {
  file {
    path => "/sample/log4j_log.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => [ "message" , "%{DATESTAMP:logtimestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" ]
  }

  date {
    locale => "en"
    match => [ "logtimestamp" , "yyyy-MM-dd HH:mm:ss" ]
  }
}

output {
  #elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}

The output I get is

         "message" => "2015-03-31 02:53:39 INFO  This is info message 5",
        "@version" => "1",
      "@timestamp" => "0015-03-30T21:00:11.000Z",
            "host" => "abc",
            "path" => "/sample/log4j_log.log",
    "logtimestamp" => "15-03-31 02:53:39",
           "level" => "INFO",
             "msg" => " This is info message 5"

I see that the logtimestamp field is coming out in the format "YY-MM-dd HH:mm:ss". I am not sure why it is getting converted to this format, and I even tried matching that format in the date filter. In those cases I get this output:

{
         "message" => "2015-03-31 02:53:39 INFO  This is info message 5",
        "@version" => "1",
      "@timestamp" => "2015-04-07T17:55:51.231Z",
            "host" => "abc",
            "path" => "/sample/log4j_log.log",
    "logtimestamp" => "15-03-31 02:53:39",
           "level" => "INFO",
             "msg" => " This is info message 5"
}

In all of this, @timestamp does not match the actual log event timestamp, and this causes problems for the Elasticsearch + Kibana visualization.

I have tried including target => "@timestamp" and locale => "en", as suggested by other questions on Stack Overflow, with no success.
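
For reference, this is roughly what the date filter looked like on those attempts (a sketch, not the exact config; the two-digit-year pattern matches the logtimestamp value that grok produced above):

date {
  # one attempt: match the two-digit-year value and force it onto @timestamp
  match  => [ "logtimestamp" , "yy-MM-dd HH:mm:ss" ]
  target => "@timestamp"
  locale => "en"
}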

The only thing I seem to not have tried is the approach in "Logstash date parsing as timestamp using the date filter", which I don't believe is fully applicable to my log event.


1 Answer


Your grok pattern is incorrect.

Please change it to the following; use TIMESTAMP_ISO8601 instead of DATESTAMP:

grok {
    match => [ "message" , "%{TIMESTAMP_ISO8601:logtimestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" ]
}

Here is the output:

{
     "message" => "2015-03-31 02:53:39 INFO  This is info message 5",
    "@version" => "1",
  "@timestamp" => "2015-03-30T18:53:39.000Z",
        "host" => "BEN_LIM",
"logtimestamp" => "2015-03-31 02:53:39",
       "level" => "INFO",
         "msg" => " This is info message 5"
}
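
Keeping the date filter from the question unchanged, the whole filter section would then look roughly like this (a sketch assembled from the question's config plus the corrected pattern, only verified against the sample line above):

filter {
  grok {
    # TIMESTAMP_ISO8601 keeps the four-digit year, so the date pattern below matches
    match => [ "message" , "%{TIMESTAMP_ISO8601:logtimestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" ]
  }

  date {
    locale => "en"
    match => [ "logtimestamp" , "yyyy-MM-dd HH:mm:ss" ]
  }
}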
Ban-Chuan Lim
  • Thanks! A quick question: the timestamp has moved to 18 hrs, which I believe is on account of the timezone difference. So the question is, does Logstash infer the timezone from the system it is running on? – Yogesh_D Apr 08 '15 at 07:26
  • Ignore the above comment; Logstash does infer the timezone of the machine running the Logstash process and uses that to convert the timestamp data to UTC. – Yogesh_D Apr 08 '15 at 16:35
  • Yes, Logstash will convert it to UTC time, I am in +08:00 timezone area. :P – Ban-Chuan Lim Apr 09 '15 at 01:02
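
Since the machine's local timezone is what gets assumed, the date filter's timezone option can be set explicitly when the logs were written in a different zone; a short sketch (the zone name is only an example, not from this thread):

date {
  locale   => "en"
  match    => [ "logtimestamp" , "yyyy-MM-dd HH:mm:ss" ]
  # zone the log file was written in; replace with the appropriate tz name
  timezone => "Asia/Singapore"
}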