
I'm new to ELK. I installed Logstash, Elasticsearch, and Kibana on Ubuntu 14.04. When I test ELK with an existing log file on my machine, Logstash doesn't load the log into Elasticsearch and shows nothing. This is my Logstash config file (/etc/logstash/conf.d/logstash.conf):

input {
  file {
    path => "/home/chayma/logs/catalina.2016-02-02.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{COMMONAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
  }
  stdout {
    codec => rubydebug
  }
}

My elasticsearch.yml contains:

cluster.name: my-application
node.name: node-1
node.master: true
node.data: true
index.number_of_shards: 1
index.number_of_replicas: 0
network.host: localhost
http.port: 9200

Please help.

Chayma Sakouhi

4 Answers


I presume Logstash and Elasticsearch are installed on the same machine. Is Logstash running?

   sudo service logstash status

Try checking the Logstash log file to see whether it's a connection issue or a syntax error (the config looks OK, so probably the former):

   tail -f /var/log/logstash/logstash.log
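
You can also have Logstash validate the config file itself before starting the service. As a sketch, assuming a Logstash 2.x package installed under /opt/logstash (newer versions use --config.test_and_exit instead of --configtest):

   /opt/logstash/bin/logstash --configtest -f /etc/logstash/conf.d/logstash.conf

If the file parses cleanly it should report the configuration as OK; otherwise it points at the broken section.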
Roy Rubin

Does your COMMONAPACHELOG pattern match the log format that you are trying to parse with grok?

By default, on Ubuntu 14.04, the core grok patterns are located at:

/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.5/patterns/grok-patterns

You can verify the pattern here:

https://grokdebug.herokuapp.com/

The grok filter in our case applies the following pattern:

COMMONAPACHELOG %{IPORHOST:clientip} %{HTTPDUSER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})" %{NUMBER:response} (?:%{NUMBER:bytes}|-)

Please provide the log entries.
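
Note that catalina.2016-02-02.log is a Tomcat log, and Tomcat's default output is not in Apache common log format, so %{COMMONAPACHELOG} will most likely fail to match and tag events with _grokparsefailure. As a hedged sketch: the stock patterns shipped with logstash-patterns-core also include Tomcat/Java patterns (in the java patterns file), so a filter along these lines may be closer to what the file contains:

filter {
  grok {
    # CATALINALOG combines a Tomcat-style timestamp, a Java class name
    # and the remainder of the line as the log message
    match => { "message" => "%{CATALINALOG}" }
  }
}

Paste one real line from the file into the Grok Debugger above to confirm which pattern actually fits.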

Himanshu Chauhan

Change your elasticsearch output by adding an index name to it and try:

output {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
    index => "testindex-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => rubydebug
  }
}
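
After restarting Logstash, a quick way to confirm documents are arriving is the standard cat-indices API (host and port as in the config above):

   curl 'http://127.0.0.1:9200/_cat/indices?v'

If testindex-YYYY.MM.dd appears with a growing docs.count, the output stage works and any remaining problem is in the input or filter.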

You're missing input {}. input {} and output {} are necessary in a Logstash pipeline.

input {
  file {
    path => "/home/chayma/logs/catalina.2016-02-02.log"
    start_position => "beginning"
  }
}

Or you can check in a simple way whether text can be forwarded to Elasticsearch: just test using stdin and stdout in the terminal. Make sure the local Elasticsearch service is running.

input {
    stdin {
        type => "true"
    }
}

filter {
}

output {
    elasticsearch {
        hosts => [ "localhost:9200" ]
    }
    stdout {
        codec => rubydebug
    }
}
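
To run the test, save the snippet above to a file (the path /tmp/stdin-test.conf here is just an example) and start Logstash against it, adjusting the binary path to your install:

   /opt/logstash/bin/logstash -f /tmp/stdin-test.conf

Type a line and press Enter: the rubydebug codec should print the event back, and the same event should be indexed into the default logstash-%{+YYYY.MM.dd} index in Elasticsearch.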
funbrain9