I'm trying to use a grok filter in logstash version 1.5.0 to parse several fields of data from a log file.

I'm able to parse a simple WORD field with no issues, but when I try to define a custom pattern and add that in as well, the grok parse fails.

I've tried a couple of grok debuggers recommended elsewhere to find the issue:

http://grokconstructor.appspot.com/do/match

and

http://grokdebug.herokuapp.com/

Both say that my regex is fine and return the fields I want, but when I add it to my logstash.conf, grok fails to parse the log line and simply passes the raw data through to Elasticsearch.

My sample line is as follows:

APPERR [2015/06/10 11:28:56.602] C1P1405 S39 (VPTestSlave002_001)| 8000B Connect to CGDialler DB (VPTest - START)| {39/A612-89A0-A598/60B9-1917-B094/9E98F46E} Failed to get DB connection: SQLConnect failed. 08001 (17) [Microsoft][ODBC SQL Server Driver][DBNETLIB]SQL Server does not exist or access denied.

My logstash.conf grok config looks like this:

    grok
    {
        patterns_dir => ["D:\rt\Logstash-1.5.0\bin\patterns"]
        match => {"message" => "%{WORD:LogLevel} \[%{KERNELTIMESTAMP:TimeStamp}\]"}
    }

and the contents of my custom pattern file are:

KERNELTIMESTAMP %{YEAR}/%{MONTHNUM}/%{MONTHDAY} %{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?
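As an extra sanity check outside the grok debuggers, the timestamp portion of the pattern can be approximated in plain Python (using simplified expansions of the stock grok sub-patterns YEAR, MONTHNUM, etc. — an illustration, not the exact regexes grok generates):

```python
import re

# Rough Python equivalent of the KERNELTIMESTAMP pattern above,
# with simplified expansions of the stock grok sub-patterns.
kernel_ts = re.compile(
    r"\d{4}/\d{1,2}/\d{1,2} "                # %{YEAR}/%{MONTHNUM}/%{MONTHDAY}
    r"\d{1,2}:?\d{2}(?::?\d{2}(?:\.\d+)?)?"  # %{HOUR}:?%{MINUTE}(?::?%{SECOND})?
)

line = ("APPERR [2015/06/10 11:28:56.602] C1P1405 S39 "
        "(VPTestSlave002_001)| 8000B Connect to CGDialler DB")
match = kernel_ts.search(line)
print(match.group(0))  # 2015/06/10 11:28:56.602
```

This confirms the timestamp itself is matchable, which suggests the problem lies elsewhere in the config or the surrounding pattern.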

I am expecting this to return the following set of data:

{
  "LogLevel": [
    [
      "APPERR"
    ]
  ],
  "TimeStamp": [
    [
      "2015/06/10 11:28:56.602"
    ]
  ],
  "YEAR": [
    [
      "2015"
    ]
  ],
  "MONTHNUM": [
    [
      "06"
    ]
  ],
  "MONTHDAY": [
    [
      "10"
    ]
  ],
  "HOUR": [
    [
      "11",
      null
    ]
  ],
  "MINUTE": [
    [
      "28",
      null
    ]
  ],
  "SECOND": [
    [
      "56.602"
    ]
  ],
  "ISO8601_TIMEZONE": [
    [
      null
    ]
  ]
}

Can anyone tell me where my issue is?

IJBurgess
  • I just tried out your config and input without an issue; could the problem be related to the path format for your patterns? – Olivier Jun 10 '15 at 14:06
  • Or do you mean you see the whole message in addition to the specific fields? In that case you can use the mutate -> remove_field option. – Olivier Jun 10 '15 at 14:13
  • Also, a small comment on the expected output: YEAR, MONTHNUM, MONTHDAY, HOUR, MINUTE and SECOND are names of patterns, not field names, so they won't show up in the event. – Olivier Jun 10 '15 at 14:15
  • You're right about YEAR, MONTHDAY etc. — I wasn't expecting them as outputs, I just copied the output from the grok debugger. I found my answer anyway: it was a problem with my sample input not being good enough. The spacing after the initial WORD is variable in the real log files, but my regex only accounted for a single space. I solved it by adding a [\s]* in between the first two patterns to catch any amount of whitespace. – IJBurgess Jun 11 '15 at 07:58
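For reference, applying the fix described in the last comment to the original config would give something like this (a sketch — only the `[\s]*` between the first two patterns changes; the path is unchanged from the question):

    grok
    {
        patterns_dir => ["D:\rt\Logstash-1.5.0\bin\patterns"]
        match => {"message" => "%{WORD:LogLevel}[\s]*\[%{KERNELTIMESTAMP:TimeStamp}\]"}
    }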

0 Answers