
I am trying to trend the data below over the archived timestamp, but I am not sure why my dates and times aren't being parsed. According to the grok debugger, my pattern works just fine.

https://i.stack.imgur.com/GDFAy.png

Sample Input:

[15/06/02@11:05:31.233-0700] P-007158 T-4131301152 2 WS 4GLTRACE       Run htmAssociate "vsess vsess 1349" [htmOffsets - dpa/setup/vsysadv.w @ 9563]

Config file:

input {
  file {
    path => "/Users/philipp/Documents/Performance/ProductionMetrics/4gltrace_logs/4gltrace_log_bstash.txt"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => ["message", "\[%{DATE}@%{TIME}-%{INT:TIMEZONE}] %{NOTSPACE:PID} %{NOTSPACE:T} %{INT:NUM} %{WORD:WS} %{WORD:4GLTRACE} %{GREEDYDATA} \[%{DATA:PROGRAM}]"]
  }
}

output {
  elasticsearch { host => "localhost" protocol => "http" port => "9200" }
  stdout { codec => rubydebug }
}

I am sure it's a silly oversight, but I'm not sure where. Any help is appreciated.

ppragados
  • What output do you get, and how is that different from what you're expecting? (My guess is you're looking for the date{} filter.) – Alain Collins Sep 30 '15 at 15:47
  • Yes, I was hoping to get the date and time of the archived event, but I am actually just getting the day the file was imported into Logstash. To be more precise: I was hoping to trend along the "15/06/02@12:12:56.451-0700" data – ppragados Sep 30 '15 at 17:33

1 Answer


You need to use the date{} filter in Logstash to take a field from your event and replace @timestamp with its value.

If you had a field called my_timestamp in, say, the format dd/MMM/yyyy:HH:mm:ss Z, this would do it:

date {
  match => [ 'my_timestamp', "dd/MMM/yyyy:HH:mm:ss Z" ]
  remove_field => [ 'my_timestamp' ]
}
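
The sample timestamp in the question looks like 15/06/02@11:05:31.233-0700, so here is a sketch adapted to that format. It assumes the grok is changed to capture everything inside the leading brackets into a single field (log_timestamp is a hypothetical name), which can then be fed straight into date{}:

filter {
  grok {
    # capture the whole bracketed timestamp into one field; keep the rest of the pattern as in the question
    match => ["message", "\[(?<log_timestamp>[^\]]+)\] %{NOTSPACE:PID} %{NOTSPACE:T} %{INT:NUM} %{WORD:WS} %{WORD:4GLTRACE} %{GREEDYDATA} \[%{DATA:PROGRAM}]"]
  }
  date {
    # Joda-Time pattern matching e.g. 15/06/02@11:05:31.233-0700
    match => [ "log_timestamp", "yy/MM/dd@HH:mm:ss.SSSZ" ]
    remove_field => [ "log_timestamp" ]
  }
}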
Alain Collins
  • I tried updating my config file with the above but since my timestamp format is: "15/06/02@11:05:31.233-0700" I updated the date filter as: date { match => [ 'my_timestamp', "yy/MM/dd@HH:mm:ss.sss-Z" ] remove_field => [ 'my_timestamp' ] } but logstash did not like it. How do I grab the timestamp with the above format? – ppragados Sep 30 '15 at 18:25
  • You have to parse out your original log line into a new field that you can then pass into date{}. It might take grok{} to split it, and then mutate->add_field to recombine it. – Alain Collins Sep 30 '15 at 18:57
  • Hi, so I added the mutate add_field to concatenate, but I am just getting the literal string back. What am I doing wrong? mutate { add_field => { "timestamp" => "%{DATE} %{TIME}" } } – ppragados Oct 01 '15 at 14:49
  • Mutate wants fields, not patterns. In your grok, you need to put the pattern into a field, e.g. %{DATE:myDate}, then you can refer to the myDate field in the mutate. – Alain Collins Oct 01 '15 at 16:01
  • This was my final update; it now successfully sets my archived date/time as the @timestamp: mutate { convert => [ "date", "string" ] convert => [ "time", "string" ] add_field => { "my_timestamp" => [ "%{date} %{time}" ] } } date { match => [ 'my_timestamp', "yy/MM/dd hh:mm:ss.SSS" ] remove_field => [ 'my_timestamp' ] }. Thanks for your guidance, Alain! I will update my grok as well; that seems much more efficient – ppragados Oct 01 '15 at 17:26
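
For reference, here is that final filter laid out as a block; it is a sketch assuming the updated grok captures %{DATE:date} and %{TIME:time}. Note that this variant drops the -0700 offset, so Logstash interprets the time in its default timezone unless the date filter's timezone option is set, and HH (24-hour clock) is safer than hh:

filter {
  mutate {
    # grok captures are already strings, so the convert is likely redundant, but harmless
    convert => [ "date", "string", "time", "string" ]
    add_field => { "my_timestamp" => "%{date} %{time}" }
  }
  date {
    # parses e.g. "15/06/02 11:05:31.233"
    match => [ "my_timestamp", "yy/MM/dd HH:mm:ss.SSS" ]
    remove_field => [ "my_timestamp" ]
  }
}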