
I'm trying to learn ELK. I have set up Filebeat on one host, which forwards the logs to Logstash on another server, which in turn forwards them to Elasticsearch.

The logs being forwarded by Filebeat are /var/log/messages, /var/log/sa/*, /var/log/*.log and /var/log/sample/access.log.

When I view the messages in Kibana, the source is the filename the logs came from, but the whole message ends up in a single field, "message". I want each piece of log data in its own field. For example, the access logs contain many values such as source IP, response code, time taken, byte size, etc., and each should get its own field name so that it becomes easy to build graphs from them in Timelion.

Is there a way in Kibana, like in Splunk, to run a regular expression against any field value, create a variable from the extracted data, and then use that variable to generate a graph?

Thanks in advance for replying.


Edit: I tried the pattern below for the sar load average log.

filter {
  if [source] == "sarLoadLog.log" {
    grok {
      match => { "message" => "%{GREEDYDATA:time_12} %{NUMBER:runqsz} %{NUMBER:plistsz} %{NUMBER:ldavg1} %{NUMBER:ldavg5} %{NUMBER:ldavg15} %{NUMBER:blocked}" }
    }
  }
}

but it's not working. I tried it in the Grok Debugger and it works there.

Below is the data for this:

05:36:01 PM         3       300      0.00      0.02      0.05         0
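
(As the first comment below points out, the runs of spaces between the sar columns are what break the single-space pattern; a variant that tolerates them, roughly like the following, matches the line above:)

filter {
  if [source] == "sarLoadLog.log" {
    grok {
      # \s+ between the captures absorbs the variable-width runs of spaces sar prints
      match => { "message" => "%{GREEDYDATA:time_12}\s+%{NUMBER:runqsz}\s+%{NUMBER:plistsz}\s+%{NUMBER:ldavg1}\s+%{NUMBER:ldavg5}\s+%{NUMBER:ldavg15}\s+%{NUMBER:blocked}" }
    }
  }
}
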
Learner
  • Below grok pattern worked; earlier, the spaces were causing the grok pattern to fail: grok { match => { "message" => "%{GREEDYDATA:load_time}\s* %{NUMBER:kbmemfree}\s* %{NUMBER:kbmemused}\s* %{NUMBER:memused}\s* %{NUMBER:kbbuffers}\s* %{NUMBER:kbcached}\s* %{NUMBER:kbcommit}\s* %{NUMBER:commit}\s* %{NUMBER:kbactive}\s* %{NUMBER:kbinact}\s* %{NUMBER:kbdirty}\s*" } } – Learner Oct 21 '18 at 17:43
  • Maybe you can also try the csv filter plugin and select " " as the separator; the only thing is that you need to rename the fields afterwards (see the sketch after these comments). – HauLuk Oct 22 '18 at 06:36
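
A rough sketch of that csv-based alternative (the column names here are only placeholders, and the runs of spaces have to be collapsed to a single separator first, e.g. with mutate/gsub):

filter {
  if [source] == "sarLoadLog.log" {
    # collapse the multi-space runs so csv sees exactly one separator per column
    mutate { gsub => [ "message", "\s+", " " ] }
    csv {
      separator => " "
      # placeholder column names; rename them to whatever fits your dashboards
      columns => [ "time", "ampm", "runqsz", "plistsz", "ldavg1", "ldavg5", "ldavg15", "blocked" ]
    }
  }
}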

1 Answer

You can achieve your intended outcome with Logstash, since you already have it configured in your setup.

You need to configure filter plugins in Logstash to parse the logs into individual fields so that you can aggregate and visualize on those fields. See the link below to get started:

https://www.elastic.co/guide/en/logstash/current/filter-plugins.html

Specifically, look at the grok filter plugin:

https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html
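
For example, for the access log you mentioned, assuming it is in the standard combined (Apache/NGINX-style) format, a grok filter roughly like the one below would split each line into fields such as clientip, response and bytes that you can then aggregate on in Timelion (the conditional on [source] is just an illustration; adjust it to your file paths):

filter {
  if [source] =~ /access\.log/ {
    grok {
      # COMBINEDAPACHELOG ships with the grok plugin and yields fields like
      # clientip, verb, request, response, bytes, referrer and agent
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    date {
      # parse the request timestamp into @timestamp so graphs use event time
      match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
  }
}

Once Logstash starts emitting these fields, you may also need to refresh the index pattern's field list in Kibana (Management > Index Patterns) before the new fields show up as searchable and aggregatable.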

ben5556
  • Thanks for replying – Learner Oct 19 '18 at 17:10
  • Thanks for replying. I configured the pattern below in my filter.conf in Logstash and it started populating the values in the fields, but the fields are not searchable yet: if "sarMemUtil" in [source] { grok { match => { "message" => "%{GREEDYDATA:load_time}\s* %{NUMBER:kbmemfree}\s* %{NUMBER:kbmemused}\s* %{NUMBER:memused}\s* %{NUMBER:kbbuffers}\s* %{NUMBER:kbcached}\s* %{NUMBER:kbcommit}\s* %{NUMBER:commit}\s* %{NUMBER:kbactive}\s* %{NUMBER:kbinact}\s* %{NUMBER:kbdirty}\s*" } } } – Learner Oct 21 '18 at 17:42
  • Are your fields in the mapping set to index: false? – ben5556 Oct 22 '18 at 07:22