
The timestamps in my logs are in the format below:

2016-04-07 18:11:38.169 (i.e. yyyy-MM-dd HH:mm:ss.SSS)

This log file is not a live one (it is a stored/old file), and I am trying to replace the Logstash @timestamp value with this timestamp, to improve the Kibana visualization.

My Logstash filter is as below:

    grok {
        match => {
            "message" => [ "(?<timestamp>(\d){4}-(\d){2}-(\d){2} (\d){2}:(\d){2}:(\d){2}.(\d){3}) %{SYSLOG5424SD} ERROR u%{BASE16FLOAT}.%{JAVACLASS} - TransId:2b948ed5-12c0-4ae0-9b99-f1ee01191001 - TransactionId ::\"2b948ed5-12c0-4ae0-9b99-f1ee01191001\"- Actual Time taken to process \:\: %{NUMBER:responseTime:int}" ]
        }
    }

    date {
        match => [ "timestamp:date" , "yyyy-MM-dd HH:mm:ss.SSS Z" ]
        timezone => "UTC"
        target => "@timestamp"
    }

But it is not replacing the @timestamp value. The indexed JSON document:

{
  "_index": "logstash-2017.02.09",
  "_type": "logs",
  "_id": "AVoiZq2ITxwgj2avgkZa",
  "_score": null,
  "_source": {
    "path": "D:\\SoftsandTools\\Kibana\\Logs_ActualTimetakentoprocess.log",
    "@timestamp": "2017-02-09T10:23:58.778Z",   <-- Logstash's @timestamp
    "responseTime": 43,
    "@version": "1",
    "host": "4637",
    "message": "2016-04-07 18:07:01.809 [SimpleAsyncTaskExecutor-3] ERROR s.v.wsclient.RestClient - TransId:2b948ed5-12c0-4ae0-9b99-f1ee01191001 - TransactionId ::\"2b948ed5-12c0-4ae0-9b99-f1ee01191001\"- Actual Time taken to process :: 43",
    "timestamp": "2016-04-07 18:07:01.809"   <-- my timestamp
  }
}

Sample log line:

2016-04-07 18:11:38.171 [SimpleAsyncTaskExecutor-1] ERROR s.v.wsclient.RestClient - TransId:2b948ed5-12c0-4ae0-9b99-f1ee01191001 - TransactionId ::"2b948ed5-12c0-4ae0-9b99-f1ee01191001"- Actual Time taken to process :: 521

Could you please help and let me know where I am going wrong here?

baudsp
Vishwa

2 Answers


You basically need a grok match in order to extract the timestamp from your log line:

grok {
    patterns_dir => ["give your path/patterns"]
    match => { "message" => "^%{LOGTIMESTAMP:logtimestamp}%{GREEDYDATA}" }          
}

In your patterns file, make sure you have a pattern that matches the timestamp in your log; it could look something like this:

LOGTIMESTAMP %{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}

Then, once the grok filtering is done, you can use the extracted value like so:

mutate {
    add_field => { "newtimestamp" => "%{logtimestamp}" }
    remove_field => ["logtimestamp"]
}
date {
    match => [ "newtimestamp" , "ISO8601" , "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"   # the field you want to overwrite
    locale => "en"
    timezone => "UTC"
}
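
Putting the two steps together, here is a minimal sketch. It assumes the stock `TIMESTAMP_ISO8601` grok pattern (which ships with Logstash and matches `yyyy-MM-dd HH:mm:ss.SSS` timestamps), so no custom patterns file or mutate step is needed:

    filter {
        grok {
            # TIMESTAMP_ISO8601 matches e.g. "2016-04-07 18:11:38.169"
            match => { "message" => "^%{TIMESTAMP_ISO8601:logtimestamp}%{GREEDYDATA}" }
        }
        date {
            match => [ "logtimestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
            target => "@timestamp"
            timezone => "UTC"
        }
    }

The field name `logtimestamp` is only illustrative; any name works as long as the grok capture and the date filter's match reference the same field.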

Hope this helps!

Kulasangar
  • Thanks for response. I do have my grok, apologies.. updated question again. I will try with mutate and update.. thanks again – Vishwa Feb 09 '17 at 10:43
  • Yes please, let me know. – Kulasangar Feb 09 '17 at 10:46
  • And make sure to adjust the *grok* match according to yours. – Kulasangar Feb 09 '17 at 10:46
  • @Vishwa any luck on this? – Kulasangar Feb 13 '17 at 05:11
  • Apologies for the delay. I tried the one you suggested. I could not get through, but a slight tweak on it worked. Below is the date filter that worked for me: `date { match => [ "timestamp" , "ISO8601" ] target => "@Logtimestamp" locale => "en" timezone => "UTC" }`. I am getting @Logtimestamp as a new date attribute in Kibana Visualization, which I can use for plotting graphs. I did not use mutate, and my grok is `%{TIMESTAMP_ISO8601:timestamp}` – Vishwa Feb 13 '17 at 12:58
  • You can also copy the logged date/time string to `[@metadata][logtimestamp]` so you don't have to remove the field `logtimestamp` by explicitly specifying `remove_field` clause, since everything in `@metadata` field will NOT be output. – Ham Jan 01 '21 at 17:00
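
The `@metadata` approach from the last comment can be sketched like this (field names are illustrative; anything under `[@metadata]` is available during filtering but never written to the output event, so no `remove_field` is needed):

    grok {
        match => { "message" => "^%{TIMESTAMP_ISO8601:[@metadata][logtimestamp]}%{GREEDYDATA}" }
    }
    date {
        match => [ "[@metadata][logtimestamp]", "yyyy-MM-dd HH:mm:ss.SSS" ]
        target => "@timestamp"
    }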

You can use the date filter plugin of Logstash:

date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss.SSS"]
}
code_code