
What is the best and easiest way to count Logstash events per second in my environment without using Graphite or StatsD?

Thanks!

deez

2 Answers


Using something like Marvel, you can view this kind of data, although it isn't specific to Logstash; it covers all indexes in general. The underlying data is available via the _stats/indexing URL, but you'll need to do some work to make it usable: poll it periodically, calculate the delta between two polls, and divide by the interval between them to get a per-second rate.

For example:

curl -s http://localhost:9200/_stats/indexing

Returns data like this:

"_all" : {
  "primaries" : {
    "indexing" : {
      "index_total" : 98241849,
      "index_time_in_millis" : 23590766,
      "index_current" : 1,
      "delete_total" : 8,
      "delete_time_in_millis" : 4,
      "delete_current" : 0
    }
  },
  "total" : {
    "indexing" : {
      "index_total" : 195892197,
      "index_time_in_millis" : 46639803,
      "index_current" : 2707,
      "delete_total" : 16,
      "delete_time_in_millis" : 14,
      "delete_current" : 0
    }
  }
 ...
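As a minimal sketch of that polling approach, the script below reads index_total from two polls and divides the difference by the interval. The host, port, and 10-second interval are assumptions, not values from the answer:

```python
import json
import time
import urllib.request

def index_rate(prev_total, curr_total, interval_secs):
    """Events indexed per second between two polls of _stats/indexing."""
    return (curr_total - prev_total) / interval_secs

def poll_total(url="http://localhost:9200/_stats/indexing"):
    """Fetch the cumulative index_total across all primaries."""
    with urllib.request.urlopen(url) as resp:
        stats = json.load(resp)
    return stats["_all"]["primaries"]["indexing"]["index_total"]

if __name__ == "__main__":
    interval = 10  # seconds between polls (arbitrary choice)
    prev = poll_total()
    time.sleep(interval)
    curr = poll_total()
    print("events/sec: %.1f" % index_rate(prev, curr, interval))
```

Because index_total is cumulative, only the delta matters; a node restart resets the counter, so a negative delta should be discarded in a long-running poller.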
Alcanzar

The Logstash documentation recommends using the filter plugin called "metrics" to generate this information. Personally, I use the "file" output plugin to keep the metric results separate from the rest of the pipeline's output.

Below is the example provided by the documentation:

input {
  generator {
    type => "generated"
  }
}

filter {
  if [type] == "generated" {
    metrics {
      meter => "events"
      add_tag => "metric"
    }
  }
}

output {
  # only emit events with the 'metric' tag
  if "metric" in [tags] {
    stdout {
      codec => line {
        format => "rate: %{[events][rate_1m]}"
      }
    }
  }
}
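If you want the file output mentioned above instead of stdout, the output block could be swapped for something like this (the path here is an assumption, pick whatever suits your setup):

```
output {
  # write only the 'metric' events to a dedicated file
  if "metric" in [tags] {
    file {
      path => "/var/log/logstash/metrics.log"
      codec => line {
        format => "rate: %{[events][rate_1m]}"
      }
    }
  }
}
```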
Nathan Tuggy