
I have an ELK stack running in my environment (CentOS 7). The whole pipeline appears to run without errors, but the logs shipped by Filebeat are not being parsed by Logstash.

# logstash input

input {
  beats {
    # listen for events from Filebeat; "host" is a single bind-address string, not an array
    host => "LogstashIP"
    port => 5044
  }
}

# httpd filter

filter {
  grok {
    # parse Apache access-log lines into structured fields (clientip, verb, response, ...)
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
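
Note: the filter above only runs grok, so geoip fields can never appear in Kibana with this pipeline alone. A minimal sketch of the extra filter that would be needed, assuming the COMBINEDAPACHELOG pattern has already extracted the clientip field:

filter {
  geoip {
    source => "clientip"   # IP field produced by COMBINEDAPACHELOG; adds a geoip object to each event
  }
}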
# logstash output

output {
  elasticsearch {
    hosts => ["elasticSearchIP:9200"]
    manage_template => false
    # one index per beat name, beat version and day
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
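
To check whether grok is matching at all, a temporary stdout output can run alongside elasticsearch; a debugging sketch:

output {
  stdout {
    codec => rubydebug   # prints every event, with all of its fields, to the Logstash console
  }
}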
# filebeat configuration (filebeat.yml)

filebeat:
  prospectors:
    - input_type: log
      paths:
        - /var/log/httpd/application.log*
      document_type: httpd  # Filebeat 5.x option; sets the event's "type" field

  registry_file: /var/lib/filebeat/registry

output:
  logstash:
    hosts: ["LogstashIP:5044"]
    bulk_max_size: 1024

shipper:

logging:
  files:
    rotateeverybytes: 10485760 # = 10 MB
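
Before restarting Filebeat, the file can be syntax-checked; a sketch, assuming Filebeat 5.x (newer versions use "filebeat test config" instead):

filebeat -c /etc/filebeat/filebeat.yml -configtest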

In the Kibana Discover view I can't see fields like geoip, response status, or request method... only the full message line, so I also can't build dashboards from that data.

Can anyone help me?

euduzz
  • Check the tags of the messages in Kibana. If there's a _grokparsefailure tag, it means the logs couldn't be parsed by your grok pattern. You might also want to refresh your field list (see https://stackoverflow.com/q/30471859/6113627) – baudsp Feb 26 '19 at 16:51
  • Hi, the "_grokparsefailure" tag does not appear on the logs. I also refreshed the field list, but still no luck – euduzz Feb 27 '19 at 12:59
  • Hi. I am facing a very similar issue with the COMBINEDAPACHELOG filter on CentOS right now. I am running 3 Logstash 6.4.1 instances for load balancing, and only one of them is parsing the messages. – antrost Mar 06 '19 at 11:24

0 Answers