5

I am trying to write a grok pattern for my log file, which has three different types of logs. I want to put a filter on the type names (TYPE1, TYPE2, TYPE3) and then write three different grok patterns for this one log file. Also, my log file is comma-separated (CSV).

Log file:
TYPE1,word,word,word,num
TYPE2,word,word,word,word
TYPE3,num,word,num,word

Here's what I have tried so far:

filter {
if [message] =~ /TYPE1/ {
grok {
    match => [ "message", "%{WORD:type},%{WORD:a1"},%{WORD:a2"},%{WORD:a3"},%{POSINT:a4"}]
     }
   }
}

This doesn't work. Also, in this config file I have written grok patterns for other files (which are working well), like:

filter {
    if [type] == "sometype1" or [type] == "sometype2" {
        grok {
            match => [ "message", "%{POSINT:moduleid}%{SPACE}%{NUMBER:date}" ]
        }
    }
}

And the log file which is giving me problems has type=sometype3, which I have not mentioned anywhere.

Thanks

user1675386

3 Answers

4

I don't think you need a conditional to do that. If you have static TYPE values ("TYPE1", "TYPE2" or "TYPE3"), why not specify one grok pattern for each TYPE:

filter { 
    grok {
        match => { "message" => [ 
                "TYPE1,%{WORD:a1},%{WORD:a2},%{WORD:a3},%{POSINT:a4}",
                "TYPE2,%{WORD:b1},%{WORD:b2},%{WORD:b3},%{WORD:b4}",
                "TYPE3,%{POSINT:c1},%{WORD:c2},%{POSINT:c3},%{WORD:c4}"  ]
            }
    }
} 

I've tried it and it works for your given formats:

TYPE1,word,word,word,num
TYPE2,word,word,word,word
TYPE3,num,word,num,word

A log file would look like this:

TYPE1,a,b,c,4
TYPE2,a,b,c,d
TYPE3,1,b,3,d
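
For reference (this is what I'd expect from the patterns above, not captured output), the first sample line should end up with roughly these fields:

a1 => "a"
a2 => "b"
a3 => "c"
a4 => "4"

Note that %{POSINT:a4} still stores the value as a string; if you want an integer you can add a type suffix such as %{POSINT:a4:int}.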
hurb
  • is Logstash efficient enough to pick the right Grok pattern in the first go looking at the type, or will it have to go through each pattern one after the other in the order specified and stop when the right pattern is matched with the log entry? – asgs Sep 26 '18 at 20:50
1

Start with successfully parsing one type, for example:

filter {
  if [type] == "sometype1" {
    grok {
      match => [ "message", "%{WORD:type},%{WORD:abc}" ]
    }
  }
}

If that is failing, you either don't have the type field with the appropriate value in your log data, or your grok pattern is not correct.

Verify it using the grok debugger
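
If you prefer to test it locally instead, a minimal throwaway pipeline (just a sketch; the pattern here is only a placeholder) prints the parsed event to the console:

input {
  stdin { }
}
filter {
  grok {
    match => [ "message", "%{WORD:type},%{WORD:abc}" ]
  }
}
output {
  stdout { codec => rubydebug }
}

Paste a sample log line into stdin and check whether the expected fields appear, or whether the event only gets a _grokparsefailure tag.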

If you managed to parse one type, now try to add the other types as well by adding

if [type] == "sometype1" or [type] == "sometype2" or [type] == "sometype3"

An alternative to this can be:

if [type] == "sometype1" {

}
else if [type] == "sometype2" {

}
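
For illustration, a sketch of that structure (the sometype1/sometype2 pattern is taken from the question; the sometype3 branch is where the three per-TYPE patterns would go, with field names chosen here only as examples):

filter {
  if [type] == "sometype1" or [type] == "sometype2" {
    grok {
      match => [ "message", "%{POSINT:moduleid}%{SPACE}%{NUMBER:date}" ]
    }
  } else if [type] == "sometype3" {
    grok {
      match => { "message" => [
          "TYPE1,%{WORD:a1},%{WORD:a2},%{WORD:a3},%{POSINT:a4}",
          "TYPE2,%{WORD:b1},%{WORD:b2},%{WORD:b3},%{WORD:b4}",
          "TYPE3,%{POSINT:c1},%{WORD:c2},%{POSINT:c3},%{WORD:c4}" ] }
    }
  }
}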
Tom Kregenbild
  • I am able to parse multiple log files using the way you mentioned. What I am not able to do is parse one particular log file, for which I have to write different grok patterns based on one keyword (TYPE1/TYPE2/TYPE3) as mentioned above. – user1675386 Jul 28 '15 at 05:03
  • No, you can use the type field directly; there is no requirement for conditionals. Only use the `grok` pattern. – Anilkumar Bathula Jul 28 '15 at 05:33
1

In your example, you're using a regular expression to see if you should run a regular expression. That's too much overhead.

Here are two ideas:

First, use grok to pull off the first word into a variable and put the rest of the info back into message:

"%{WORD:myType},%{GREEDYDATA:message}"

(you'll need to use overwrite in that config).
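
A sketch of that first grok (the field name myType is just an example):

filter {
  grok {
    match => [ "message", "%{WORD:myType},%{GREEDYDATA:message}" ]
    overwrite => [ "message" ]
  }
}

Here overwrite => [ "message" ] makes the captured remainder replace the original message instead of being appended to it.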

Then you can use exact conditionals to determine which subsequent grok pattern to use:

if [myType] == "TYPE1" {
}
...
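
For example, with the TYPE1 format from the question (field names a1..a4 are just placeholders), after the first grok has stripped the type the remaining message is plain CSV:

if [myType] == "TYPE1" {
  grok {
    match => [ "message", "%{WORD:a1},%{WORD:a2},%{WORD:a3},%{POSINT:a4}" ]
  }
}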

Second, it's also possible to list multiple patterns in one grok expression:

match => [ "message", "pattern1", "pattern2", "pattern3" ]

But this is also expensive! (check that syntax against the doc to be sure!).

Alain Collins