
I redirected all the logs (Suricata logs, in this case) to Logstash using rsyslog. I used the following template for rsyslog:

template(name="json-template"
  type="list") {
    constant(value="{")
      constant(value="\"@timestamp\":\"")     property(name="timereported" dateFormat="rfc3339")
      constant(value="\",\"@version\":\"1")
      constant(value="\",\"message\":\"")     property(name="msg" format="json")
      constant(value="\",\"sysloghost\":\"")  property(name="hostname")
      constant(value="\",\"severity\":\"")    property(name="syslogseverity-text")
      constant(value="\",\"facility\":\"")    property(name="syslogfacility-text")
      constant(value="\",\"programname\":\"") property(name="programname")
      constant(value="\",\"procid\":\"")      property(name="procid")
    constant(value="\"}\n")
}
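For completeness, the tutorial pairs a template like this with a forwarding action. A minimal sketch, assuming Logstash listens for UDP on 10514 (the target hostname and file path here are placeholders):

```
# /etc/rsyslog.d/70-forward.conf (hypothetical path)
# Render every message with json-template and forward it to Logstash over UDP
*.* action(type="omfwd"
           target="logstash.example.com"
           port="10514"
           protocol="udp"
           template="json-template")
```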

For every incoming message, rsyslog interpolates the log properties into a JSON-formatted message and forwards it to Logstash, which is listening on port 10514. Reference link: https://devconnected.com/monitoring-linux-logs-with-kibana-and-rsyslog/

(I have also configured Logstash as described in the reference link above.)

I am getting all the columns in Kibana Discover (as defined in the rsyslog json-template), but I also need bytes, session, and source columns in Kibana, which I am not getting. I have attached a snapshot of the columns I am getting in Kibana.

Available fields (or say columns) in Kibana are:

 @timestamp
t @version
t _type
t facility
t host
t message
t procid
t programname
t sysloghost
t _id
t _index
# _score
t severity

Please let me know how to add bytes, session, and source to the available fields in Kibana. I require these parameters for further drill-down in Kibana.

EDIT: I have added what my "/var/log/suricata/eve.json" looks like (which I need to visualize in Kibana).

For bytes, I will use (bytes_toserver + bytes_toclient), which is available inside flow. Session I need to calculate. For source, I will use src_ip.

{"timestamp":"2020-05 04T14:16:55.000200+0530","flow_id":133378948976827,"event_type":"flow","src_ip":"0000:0000:0000:0000:0000:0000:0000:0000","dest_ip":"ff02:0000:0000:0000:0000:0001:ffe0:13f4","proto":"IPv6-ICMP","icmp_type":135,"icmp_code":0,"flow":{"pkts_toserver":1,"pkts_toclient":0,"bytes_toserver":87,"bytes_toclient":0,"start":"2020-05-04T14:16:23.184507+0530","end":"2020-05-04T14:16:23.184507+0530","age":0,"state":"new","reason":"timeout","alerted":false}}
  • Most of the interesting data is probably in the `message` field. That tutorial you linked to doesn't go into detail about how to parse the information you want, but if you search for `logstash grok tutorial` or similar, you should find a bunch of walkthroughs. – tomr Jun 07 '21 at 13:51
  • @tomr I have gone through those tutorials but still didn't find the answer I am looking for. I think I need to add a filter in Logstash or modify the rsyslog template to add those parameters, but I'm not able to find the exact thing (or anything similar to it) that I am looking for. – Abhishek Kumar Jun 08 '21 at 11:12

1 Answer


Direct answer

Read the grok docs in detail.

Then head over to the grok debugger with some sample logs to figure out expressions. (There's also a grok debugger built into Kibana's dev tools nowadays.)

This list of grok patterns might come in handy, too.
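As an illustration only (the `src` and `bytes` labels here are names chosen for this example, not fields your logs necessarily contain), a grok filter pulling an IP and a byte count out of a plain-text message might look like:

```
filter {
  grok {
    # %{IP} and %{NUMBER} are stock grok patterns;
    # the target field names after the colon are arbitrary
    match => { "message" => "src=%{IP:src_ip} bytes=%{NUMBER:bytes:int}" }
  }
}
```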

A better way

Use Suricata's JSON log instead of the syslog format, and use Filebeat instead of rsyslog. Filebeat has a Suricata module out of the box.
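A rough sketch of the Filebeat steps (paths and commands assume a stock package install; adjust to your layout):

```
# Enable the bundled Suricata module
filebeat modules enable suricata

# Then point the module at the EVE log, e.g. in modules.d/suricata.yml:
# - module: suricata
#   eve:
#     enabled: true
#     var.paths: ["/var/log/suricata/eve.json"]

# Load the index templates and dashboards
filebeat setup
```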

Sidebar: Parsing JSON logs

In Logstash's filter config section:

filter {

  json {
    source => "message"
    # you probably don't need the "message" field if it parses OK
    #remove_field => "message"
  }

}
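Once the JSON is parsed, the fields the question asks for can be derived in the same filter section. A sketch only, using the field names from the eve.json sample in the question (session is omitted since the question says it still needs to be defined):

```
filter {
  # copy src_ip into a field named "source"
  mutate {
    copy => { "src_ip" => "source" }
  }

  # bytes = flow.bytes_toserver + flow.bytes_toclient
  ruby {
    code => '
      flow = event.get("flow")
      if flow
        event.set("bytes", flow["bytes_toserver"].to_i + flow["bytes_toclient"].to_i)
      end
    '
  }
}
```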

[Edit: added JSON parsing]

tomr
  • To use Filebeat, do I have to disable rsyslog, or can I use Filebeat without disabling rsyslog? – Abhishek Kumar Jun 08 '21 at 12:22
  • You don't have to disable rsyslog. You probably want to ensure it isn't watching the Suricata JSON log, or at a minimum not forwarding it to logstash/elasticsearch. – tomr Jun 08 '21 at 12:34
  • The 2nd approach (the Filebeat one) worked, but instead of appearing as available fields, the values I need (I mean the source and bytes parts that I mentioned in the EDIT) only show up inside the message field. – Abhishek Kumar Jun 08 '21 at 13:41
  • I think I understand - I've added an example of how to parse the JSON `message` field using Logstash – tomr Jun 08 '21 at 23:21
  • I was sending my output from Filebeat directly to Elasticsearch. Could you tell me what the Logstash configuration is in your case (not just the filter part)? Any source link would be helpful. – Abhishek Kumar Jun 09 '21 at 09:08
  • There's a great example in the official docs [here](https://www.elastic.co/guide/en/logstash/current/plugins-inputs-beats.html) – tomr Jun 09 '21 at 09:23
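For reference, a minimal end-to-end pipeline along the lines of that doc page might look like the following. This is a sketch, not a drop-in config: the Elasticsearch host and index name are placeholders.

```
input {
  beats {
    port => 5044
  }
}

filter {
  # parse the JSON payload carried in the message field
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "suricata-%{+YYYY.MM.dd}"
  }
}
```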