I am trying to index log files into Elasticsearch. All the log entries are being indexed into a field named message, and the @timestamp field shows the time the entry was indexed, not the timestamp from the log entry itself.
I created an ingest pipeline with a grok processor to define the pattern of the log entries. I have tried several patterns but am unable to get this working, particularly because I am new to grok.
Log sample
2019-08-05 00:04:06 info [index.js]: Request: HTTP GET /
2019-08-05 00:04:06 error [error.js]: No authorization token was found
Ingest pipeline with grok & date processors

{
  "description": "Extracting date from log line",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{yyyy-mm-dd HH:mm:ss:logtime} %{LOGLEVEL:loglevel} %{GREEDYDATA:message}"]
      }
    },
    {
      "date": {
        "field": "logtime",
        "target_field": "@timestamp",
        "formats": ["yyyy-mm-dd HH:mm:ss"]
      }
    }
  ]
}
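
In case it is useful, this is how I have been testing the pipeline, using the _simulate endpoint with one of the sample lines above (a minimal sketch; the pipeline body is the same one shown, and the doc is the first sample log line):

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "Extracting date from log line",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["%{yyyy-mm-dd HH:mm:ss:logtime} %{LOGLEVEL:loglevel} %{GREEDYDATA:message}"]
        }
      },
      {
        "date": {
          "field": "logtime",
          "target_field": "@timestamp",
          "formats": ["yyyy-mm-dd HH:mm:ss"]
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "2019-08-05 00:04:06 info [index.js]: Request: HTTP GET /"
      }
    }
  ]
}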
All I want is to extract the timestamp from the log message; everything else can be ignored, wildcarded, or stored in a single field such as message. In other words, indexing the log file should set @timestamp from the timestamp in the log line, and the rest of the message can stay as text in one field; there is no need to parse the remainder.
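
To illustrate, for the first sample line I would expect the indexed document to end up looking roughly like this, with @timestamp taken from the log line itself (field names follow the pipeline above):

{
  "@timestamp": "2019-08-05T00:04:06.000Z",
  "loglevel": "info",
  "logtime": "2019-08-05 00:04:06",
  "message": "[index.js]: Request: HTTP GET /"
}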
Any help would be appreciated.