
I'm using Logstash to collect logs from my ASA 5505, and I want to extract the source IP, destination IP, source port, and destination port so I can use them in Kibana. What should I write in the filter?

This is a sample log message:

<166>Aug 20 2014 05:51:34: %ASA-6-302014: Teardown TCP connection 8440 for inside:192.168.2.209/51483 to outside:104.16.13.8/80 duration 0:00:53 bytes 13984 TCP FINs
<166>Aug 20 2014 06:50:55: %ASA-6-305012: Teardown dynamic TCP translation from inside:192.168.2.209/33388 to outside:192.168.1.101/33388 duration 0:04:00
<167>Aug 20 2014 06:50:55: %ASA-7-609002: Teardown local-host outside:74.125.206.95 duration 0:04:00
<166>Aug 20 2014 06:50:55: %ASA-6-305012: Teardown dynamic TCP translation from inside:192.168.2.209/33390 to outside:192.168.1.101/33390 duration 0:04:00
<166>Aug 20 2014 06:50:54: %ASA-6-302014: Teardown TCP connection 10119 for inside:192.168.2.209/48466 to outside:173.194.66.84/443 duration 0:05:34 bytes 3160 TCP FINs
<167>Aug 20 2014 06:50:53: %ASA-7-710005: UDP request discarded from 192.168.1.199/3205 to outside:255.255.255.255/3206

And this is the filter being used:

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

Thanks

  • What do your logs look like? Have you tried to do this on your own? – Alcanzar Aug 20 '14 at 14:19
  • Well, I didn't try because I'm a newbie with Logstash, and I didn't find my ASA in the list in /opt/logstash/pattern/firewalls. This is what my logs look like: {"message":"<166>Aug 20 2014 05:51:34: %ASA-6-302014: Teardown TCP connection 8440 for inside:192.168.2.209/51483 to outside:104.16.13.8/80 duration 0:00:53 bytes 13984 TCP FINs\n","@version":"1","@timestamp":"2014-08-20T14:17:58.452Z","host":"192.168.2.1","tags":["_grokparsefailure"],"priority":13,... – Houcem Ben Smida Aug 20 '14 at 14:25
  • I didn't know where I should start. Should I change the pattern or add some fields to my filter config in Logstash? – Houcem Ben Smida Aug 20 '14 at 14:28
  • Put a sample of what you've tried (you've tried something to get a _grokparsefailure) and your log sample in the question so that it's easily viewed. Thx. – Alcanzar Aug 20 '14 at 14:28
  • This is my filter: filter { if [type] == "syslog" { grok { match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" } add_field => [ "received_at", "%{@timestamp}" ] add_field => [ "received_from", "%{host}" ] } syslog_pri { } date { match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ] } } } – Houcem Ben Smida Aug 20 '14 at 14:33

2 Answers


You'll want to add something like this in:

grok {
  match => ["syslog_message",
    "inside:%{HOSTNAME:inside_host}/%{NUMBER:inside_port} to outside:%{HOSTNAME:outside_host}/%{NUMBER:outside_port}",
    "discarded from %{HOSTNAME:inside_host}/%{NUMBER:inside_port} to outside:%{HOSTNAME:outside_host}/%{NUMBER:outside_port}"
  ]
}

before the syslog_pri line.

Basically, you'll need to build a pattern that matches each line type. The two above should match the samples you posted, but if anything still comes up with a _grokparsefailure, you'll need to figure out why. One way to do that is with the Grok Debugger at http://grokdebug.herokuapp.com/ (it's how I came up with these patterns in the first place).
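For illustration, the first grok line above captures roughly what this Python sketch does (a simplified regex approximation; the real %{HOSTNAME} and %{NUMBER} grok definitions are more permissive):

```python
import re

# Rough equivalent of the grok pattern
# "inside:%{HOSTNAME:inside_host}/%{NUMBER:inside_port} to outside:%{HOSTNAME:outside_host}/%{NUMBER:outside_port}"
pattern = re.compile(
    r"inside:(?P<inside_host>[\w.\-]+)/(?P<inside_port>\d+)"
    r" to outside:(?P<outside_host>[\w.\-]+)/(?P<outside_port>\d+)"
)

msg = ("Teardown TCP connection 8440 for "
       "inside:192.168.2.209/51483 to outside:104.16.13.8/80 "
       "duration 0:00:53 bytes 13984 TCP FINs")

m = pattern.search(msg)
print(m.groupdict())
# {'inside_host': '192.168.2.209', 'inside_port': '51483',
#  'outside_host': '104.16.13.8', 'outside_port': '80'}
```

Running a sample line through a sketch like this (or through the Grok Debugger) is a quick way to confirm which fields each pattern will actually produce before wiring it into the pipeline.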

Alcanzar
  • I added the line, and I even removed the previous grok, but I still have the same issue: {"message":"<167>Aug 20 2014 06:32:46: %ASA-7-710005: UDP request discarded from 192.168.1.199/3205 to outside:255.255.255.255/3206\n","@version":"1","@timestamp":"2014-08-20T14:59:11.207Z","host":"192.168.2.1","tags":["_grokparsefailure"],"priority":13,"severity":5,"facility":1,"facility_label":"us... – Houcem Ben Smida Aug 20 '14 at 15:00
  • Thanks, but I didn't know which one I should use (the grok patterns or the firewalls), both under /opt/logstash/patterns. Also, all I want is to have the source IP/port and destination IP/port as fields in Kibana so I can work with them. – Houcem Ben Smida Aug 21 '14 at 09:14

Try this filter:

filter {
  if [type] == "cisco-asa" {
    grok {
      match => ["message", "%{CISCO_TAGGED_SYSLOG} %{GREEDYDATA:cisco_message}"]
    }

    syslog_pri { }

    date {
      match => ["timestamp",
        "MMM dd HH:mm:ss",
        "MMM  d HH:mm:ss",
        "MMM dd yyyy HH:mm:ss",
        "MMM  d yyyy HH:mm:ss"
      ]
      timezone => "America/New_York"
    }

    if "_grokparsefailure" not in [tags] {
      mutate {
        rename => ["cisco_message", "message"]
        remove_field => ["timestamp"]
      }
    }

    grok {
      match => [
        "message", "%{CISCOFW106001}",
        "message", "%{CISCOFW106006_106007_106010}",
        "message", "%{CISCOFW106014}",
        "message", "%{CISCOFW106015}",
        "message", "%{CISCOFW106021}",
        "message", "%{CISCOFW106023}",
        "message", "%{CISCOFW106100}",
        "message", "%{CISCOFW110002}",
        "message", "%{CISCOFW302010}",
        "message", "%{CISCOFW302013_302014_302015_302016}",
        "message", "%{CISCOFW302020_302021}",
        "message", "%{CISCOFW305011}",
        "message", "%{CISCOFW313001_313004_313008}",
        "message", "%{CISCOFW313005}",
        "message", "%{CISCOFW402117}",
        "message", "%{CISCOFW402119}",
        "message", "%{CISCOFW419001}",
        "message", "%{CISCOFW419002}",
        "message", "%{CISCOFW500004}",
        "message", "%{CISCOFW602303_602304}",
        "message", "%{CISCOFW710001_710002_710003_710005_710006}",
        "message", "%{CISCOFW713172}",
        "message", "%{CISCOFW733100}"
      ]
    }
  }
}
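For reference, the initial %{CISCO_TAGGED_SYSLOG} match peels off the syslog priority, the Cisco-style timestamp, and the %ASA-x-nnnnnn tag, leaving the rest in cisco_message for the CISCOFW patterns. Roughly (a simplified Python sketch, not the exact grok definition shipped with Logstash):

```python
import re

# Simplified approximation of what %{CISCO_TAGGED_SYSLOG} captures
# from an ASA line: <pri>, timestamp, ASA tag, then the message body.
tagged = re.compile(
    r"<(?P<syslog_pri>\d+)>"
    r"(?P<timestamp>\w{3} +\d{1,2} \d{4} \d{2}:\d{2}:\d{2}): "
    r"%(?P<ciscotag>ASA-\d-\d+): "
    r"(?P<cisco_message>.*)"
)

line = ("<166>Aug 20 2014 05:51:34: %ASA-6-302014: Teardown TCP connection "
        "8440 for inside:192.168.2.209/51483 to outside:104.16.13.8/80 "
        "duration 0:00:53 bytes 13984 TCP FINs")

m = tagged.match(line)
print(m.group("ciscotag"))       # ASA-6-302014
print(m.group("cisco_message"))  # Teardown TCP connection 8440 for ...
```

The second grok then matches cisco_message against the numbered CISCOFW patterns, which is where the source/destination IP and port fields come from.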

That should help.