What is the best and easiest way to count Logstash events per second in my environment, without using Graphite or StatsD?
Thanks!
Using something like Marvel you can view this kind of data, although it isn't specific to Logstash: it covers indexing across all indices. The underlying data is available via the _stats/indexing
URL, but you'll need to do some work to make it usable: poll it periodically, then compute the delta of the counters between polls and divide by the polling interval to get a per-second rate.
For example:
curl -s 'http://localhost:9200/_stats/indexing?pretty'
Returns data like this:
{
  "_all" : {
    "primaries" : {
      "indexing" : {
        "index_total" : 98241849,
        "index_time_in_millis" : 23590766,
        "index_current" : 1,
        "delete_total" : 8,
        "delete_time_in_millis" : 4,
        "delete_current" : 0
      }
    },
    "total" : {
      "indexing" : {
        "index_total" : 195892197,
        "index_time_in_millis" : 46639803,
        "index_current" : 2707,
        "delete_total" : 16,
        "delete_time_in_millis" : 14,
        "delete_current" : 0
      }
    }
  },
  ...
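The poll-and-divide step described above can be sketched in a few lines of Python. This assumes Elasticsearch is reachable at localhost:9200 and treats each document indexed on the primary shards as one Logstash event; adjust the URL and interval for your setup.

```python
import json
import time
from urllib.request import urlopen

# Assumed endpoint; change host/port for your cluster.
STATS_URL = "http://localhost:9200/_stats/indexing"


def fetch_index_total(url=STATS_URL):
    """Read index_total for primary shards from _stats/indexing."""
    with urlopen(url) as resp:
        stats = json.load(resp)
    # "primaries" counts each document once; "total" also includes replicas.
    return stats["_all"]["primaries"]["indexing"]["index_total"]


def events_per_second(prev_total, curr_total, interval_secs):
    """Per-second rate: counter delta divided by the polling interval."""
    return (curr_total - prev_total) / interval_secs


def poll(interval_secs=10):
    """Print an events/sec figure once per polling interval, forever."""
    prev = fetch_index_total()
    while True:
        time.sleep(interval_secs)
        curr = fetch_index_total()
        print(f"{events_per_second(prev, curr, interval_secs):.1f} events/sec")
        prev = curr
```

Note that index_total is a monotonically increasing counter (reset on node restart), which is why the delta, not the raw value, is what you want.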
The Logstash documentation recommends using the "metrics" filter plugin to generate this information. Personally, I use the "file" output plugin to keep the metric results separate from the rest of the pipeline output.
Below is the example provided by the documentation:
input {
  generator {
    type => "generated"
  }
}

filter {
  if [type] == "generated" {
    metrics {
      meter => "events"
      add_tag => "metric"
    }
  }
}

output {
  # only emit events with the 'metric' tag
  if "metric" in [tags] {
    stdout {
      codec => line {
        format => "rate: %{[events][rate_1m]}"
      }
    }
  }
}
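If you want the rates in a separate file, as I mentioned, the output section could instead use the "file" output plugin; the path below is just an example, pick one your Logstash user can write to:

output {
  if "metric" in [tags] {
    file {
      # example path, adjust for your environment
      path => "/var/log/logstash/event-rates.log"
      codec => line {
        format => "rate: %{[events][rate_1m]}"
      }
    }
  }
}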