
I'm building a log analysis environment for analyzing Linux logs such as /var/log/auth.log, /var/log/cron, /var/log/syslog, etc. The goal is to be able to upload such a log file and analyze it properly with Kibana/Elasticsearch. To do so, I created a .conf file, seen below, which includes the proper patterns to parse auth.log and the information needed in the input and output sections. Unfortunately, when connecting to Kibana I cannot see any data in the "Discover" panel and cannot find the related "index pattern". I tested the grok patterns and they work well.

input {
  file {
    type => "linux-auth"
    path => [ "/home/ubuntu/logs/auth.log"]
  }
filter {
  if [type] == "linux-auth" {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:time} %{WORD:method}\[%{POSINT:auth_pid}\]\: %{DATA:message} for %{DATA:user} from %{IPORHOST:IP_address} port %{POSINT:port}" }
      }
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:time} %{WORD:method}\[%{POSINT:auth_pid}\]\:%{DATA:message} for %{GREEDYDATA:username}" }
      }
  }
}
output {
    elasticsearch {
        hosts => "elasticsearch:9200"
    }
}

Example of auth.log:

2018-12-02T14:01:00Z sshd[0000001]: Accepted keyboard-interactive/pam for root from 185.118.167.241 port 64965 ssh2
2018-12-02T14:02:00Z sshd[0000002]: Failed keyboard-interactive/pam for invalid user ubuntu from 36.104.140.175 port 57512 ssh2
2018-12-02T14:03:00Z sshd[0000003]: pam_unix(sshd:session): session closed for user root
Humbur
1 Answer


Here are a few recommendations I would like to give:

  1. You can run Logstash in debug mode as shown below to check what the exact error is.
bin/logstash --debug -f file_path.conf
  2. Add a stdout plugin in the output section, which will print the incoming events, so you can be sure Logstash is reading the file correctly (see the sketch after this list).
  3. Most importantly, since you mention that you want to read system logs and visualize the data, I would recommend using Filebeat with the system module. Filebeat is built especially for such use cases, like reading from a file.
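
For point 2, here is a minimal sketch of an output section with a stdout plugin added next to Elasticsearch (the rubydebug codec is a common choice for readable output; the host is the one from your config):

output {
  # Print every event to the console so you can confirm Logstash
  # is reading and parsing the file before it reaches Elasticsearch.
  stdout { codec => rubydebug }

  elasticsearch {
    hosts => "elasticsearch:9200"
  }
}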

It is a simple setup: in Filebeat, under the system module, you just need to specify which system log file you want to read. Mention the Elasticsearch endpoint and run Filebeat.
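
As a rough sketch (the path and endpoint are taken from your question; exact options can vary between Filebeat versions), enabling the system module and pointing its auth fileset at your file could look like this:

# Enable the system module (creates modules.d/system.yml)
filebeat modules enable system

# modules.d/system.yml
- module: system
  auth:
    enabled: true
    var.paths: ["/home/ubuntu/logs/auth.log"]
  syslog:
    enabled: true

# filebeat.yml - Elasticsearch endpoint
output.elasticsearch:
  hosts: ["elasticsearch:9200"]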

It will start reading and pushing the data to Elasticsearch.

Also, you don't need to build a custom dashboard in Kibana (as you would have to in the Logstash case). Filebeat comes with pre-configured dashboards for system logs.
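
For example (a sketch, assuming Kibana is reachable at the endpoint configured under setup.kibana in filebeat.yml), the bundled dashboards and index pattern can be loaded with:

# Load the pre-built Kibana dashboards shipped with Filebeat
filebeat setup --dashboards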

You can check more details in the official Filebeat documentation.

Ashish Tiwari
  • Thank you Ashish. I installed Filebeat with the system module, but the logs I see in Kibana are the logs from the host server itself (where I installed the ELK stack); I want to see in Kibana the logs which I import into the server. To do so, I created a directory under /home/ubuntu/logs and saved an "auth.log" file there. I also changed the input section of filebeat.yml as seen below. Any idea why I still don't see the imported logs I saved under /home/ubuntu/logs, and only see the system logs of my server (which I don't care about)? – Humbur May 15 '22 at 13:04
  • The changed input section of filebeat.yml: – Humbur May 15 '22 at 13:04

        # ============================== Filebeat inputs ===============================
        filebeat.inputs:
        # Each - is an input. Most options can be set at the input level, so
        # you can use different inputs for various configurations.
        # Below are the input specific configurations.
        # filestream is an input for collecting log messages from files.
        - type: filestream
          # Change to true to enable this input configuration.
          enabled: true
          # Paths that should be crawled and fetched. Glob based paths.
          paths:
            - /home/ubuntu/logs/*.log