I have a Rocky Linux 8.7 VM running Elasticsearch, Kibana and Logstash (ELK stack, version 7.17). What I am trying to do is have Logstash receive SNMP OID values and NetFlow data from my MikroTik router, send them to Elasticsearch and from there to Kibana for visualization.
I created a Logstash config file (mikrotik-router.conf) where the input, filter and output sections look like this:
input {
  snmp {
    hosts => [{host => "udp:192.168.56.3/161" version => "3"}]
    get => ["1.3.6.1.2.1.25.3.3.1.2.1", "1.3.6.1.2.1.25.2.3.1.5.65536", "1.3.6.1.2.1.25.2.3.1.6.65536", "1.3.6.1.2.1.1.3.0", "1.3.6.1.2.1.31.1.1.1.7.1", "1.3.6.1.2.1.31.1.1.1.11.1", "1.3.6.1.2.1.1.1.0"]
    security_name => "snmp-v3"
    auth_protocol => "md5"
    auth_pass => "********"
    priv_protocol => "des"
    priv_pass => "********"
    security_level => "authPriv"
    type => "snmp"
  }
  udp {
    port => 9995
    codec => netflow { versions => [5, 9] }
    type => "netflow"
  }
}
filter {
  mutate {
    convert => { "[netflow][ipv4_src_addr]" => "string" }
  }
  geoip {
    source => "[netflow][ipv4_src_addr]"
  }
}
output {
  if [type] == "snmp" {
    elasticsearch {
      hosts => ["192.168.56.102:9200"]
      index => "snmp-metrics"
      user => "******"
      password => "*********"
    }
  }
  if [type] == "netflow" {
    elasticsearch {
      hosts => ["192.168.56.102:9200"]
      index => "logstash-netflow-analytics-%{+YYYY.MM.dd}"
      user => "******"
      password => "******"
    }
  }
}
I replaced the real values with asterisks for security reasons.
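To sanity-check the SNMPv3 side, my understanding is that the snmp input above corresponds to roughly the following manual query from the Logstash host (a sketch assuming the net-snmp tools are installed; credentials and values are placeholders):

# hypothetical manual SNMPv3 get against the router, mirroring the input settings above
snmpget -v3 -l authPriv -u snmp-v3 -a MD5 -A '<auth_pass>' -x DES -X '<priv_pass>' 192.168.56.3 1.3.6.1.2.1.1.1.0

If this also times out, I suppose the problem is on the router/firewall or credentials side rather than in the Logstash config itself.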
The problem is that after restarting Logstash I get no values in Kibana, and in the Logstash logs I see the following issues, which keep repeating for long stretches:
[2023-06-06T15:50:58,871][INFO ][logstash.inputs.snmp ][main] using plugin provided MIB path /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-snmp-1.3.1/lib/mibs/ietf
[2023-06-06T15:51:15,089][INFO ][logstash.inputs.beats ][main] Starting input listener {:address=>"0.0.0.0:5044"}
[2023-06-06T15:51:15,212][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2023-06-06T15:51:15,714][INFO ][logstash.inputs.udp ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] Starting UDP listener {:address=>"0.0.0.0:9995"}
[2023-06-06T15:51:16,847][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2023-06-06T15:51:16,911][INFO ][logstash.inputs.udp ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] UDP listener started {:address=>"0.0.0.0:9995", :receive_buffer_bytes=>"106496", :queue_size=>"2000"}
[2023-06-06T15:51:17,201][INFO ][org.logstash.beats.Server][main][19705565b25928a694ac78dbcdd737ef8de9316922e464fa500ecdc386254465] Starting server on port: 5044
[2023-06-06T15:51:35,831][WARN ][logstash.codecs.netflow ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] Can't (yet) decode flowset id 256 from source id 0, because no template to decode it with has been received. This message will usually go away after 1 minute.
[2023-06-06T15:51:47,068][WARN ][logstash.codecs.netflow ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] Can't (yet) decode flowset id 256 from source id 0, because no template to decode it with has been received. This message will usually go away after 1 minute.
[2023-06-06T15:51:51,986][WARN ][logstash.codecs.netflow ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] Can't (yet) decode flowset id 256 from source id 0, because no template to decode it with has been received. This message will usually go away after 1 minute.
[2023-06-06T15:51:54,983][WARN ][logstash.codecs.netflow ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] Can't (yet) decode flowset id 256 from source id 0, because no template to decode it with has been received. This message will usually go away after 1 minute.
[2023-06-06T15:52:01,986][WARN ][logstash.codecs.netflow ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] Can't (yet) decode flowset id 256 from source id 0, because no template to decode it with has been received. This message will usually go away after 1 minute.
[2023-06-06T15:52:09,977][WARN ][logstash.codecs.netflow ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] Can't (yet) decode flowset id 256 from source id 0, because no template to decode it with has been received. This message will usually go away after 1 minute.
[2023-06-06T15:52:16,948][WARN ][logstash.codecs.netflow ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] Can't (yet) decode flowset id 256 from source id 0, because no template to decode it with has been received. This message will usually go away after 1 minute.
[2023-06-06T15:52:31,939][WARN ][logstash.codecs.netflow ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] Can't (yet) decode flowset id 256 from source id 0, because no template to decode it with has been received. This message will usually go away after 1 minute.
[2023-06-06T15:52:33,968][WARN ][logstash.codecs.netflow ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] Can't (yet) decode flowset id 256 from source id 0, because no template to decode it with has been received. This message will usually go away after 1 minute. ...
[2023-06-06T15:54:29,373][WARN ][logstash.codecs.netflow ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] Can't (yet) decode flowset id 256 from source id 0, because no template to decode it with has been received. This message will usually go away after 1 minute.
[2023-06-06T15:54:32,872][WARN ][logstash.codecs.netflow ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] Can't (yet) decode flowset id 256 from source id 0, because no template to decode it with has been received. This message will usually go away after 1 minute.
[2023-06-06T15:54:48,979][ERROR][logstash.inputs.snmp ][main][cf1dc62a82d56d4bd2da4d5be4822a32664fc8881c5dffb6ca15bb041208f0cf] error invoking get operation, ignoring {:host=>"192.168.56.3", :oids=>["1.3.6.1.2.1.25.3.3.1.2.1", "1.3.6.1.2.1.25.2.3.1.5.65536", "1.3.6.1.2.1.25.2.3.1.6.65536", "1.3.6.1.2.1.1.3.0", "1.3.6.1.2.1.31.1.1.1.7.1", "1.3.6.1.2.1.31.1.1.1.11.1", "1.3.6.1.2.1.1.1.0"], :exception=>#<LogStash::SnmpClientError: timeout sending snmp get request to target 192.168.56.3/161>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-snmp-1.3.1/lib/logstash/inputs/snmp/base_client.rb:39:in `get'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-snmp-1.3.1/lib/logstash/inputs/snmp.rb:210:in `block in poll_clients'", "org/jruby/RubyArray.java:1821:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-snmp-1.3.1/lib/logstash/inputs/snmp.rb:202:in `poll_clients'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-snmp-1.3.1/lib/logstash/inputs/snmp.rb:197:in `block in run'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-snmp-1.3.1/lib/logstash/inputs/snmp.rb:380:in `every'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-snmp-1.3.1/lib/logstash/inputs/snmp.rb:196:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:410:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:401:in `block in start_input'"]}
[2023-06-06T15:55:47,056][WARN ][logstash.codecs.netflow ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] Can't (yet) decode flowset id 256 from source id 0, because no template to decode it with has been received. This message will usually go away after 1 minute.
[2023-06-06T15:55:51,999][WARN ][logstash.codecs.netflow ][main][fd39fe2546533fd8031e886bd0154d70cc3877b49c322ae67bfeacb5f24228c4] Can't (yet) decode flowset id 256 from source id 0, because no template to decode it with has been received. This message will usually go away after 1 minute.
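For completeness, the RouterOS traffic-flow export would normally be configured along these lines (a sketch from memory; parameter names may differ slightly between RouterOS versions). As I understand it, the "no template to decode it with" warning means the codec is still waiting for a NetFlow v9 template packet from the router, so a shorter template refresh interval should make the warning go away sooner:

# hypothetical RouterOS traffic-flow export settings, values are examples only
/ip traffic-flow set enabled=yes interfaces=all
/ip traffic-flow target add dst-address=192.168.56.102 port=9995 version=9 v9-template-refresh=20 v9-template-timeout=1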
The plugin in use is the Netflow codec plugin v4.2.2, with Logstash 7.17 and Elasticsearch 7.17. I have gone through a lot of comments but couldn't find a solution.
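In case it helps narrow things down, I assume the quickest way to see whether any documents reach Elasticsearch at all is something like the following (index names taken from the output section above, credentials replaced with placeholders):

curl -u '<user>:<password>' 'http://192.168.56.102:9200/snmp-metrics/_count?pretty'
curl -u '<user>:<password>' 'http://192.168.56.102:9200/logstash-netflow-analytics-*/_count?pretty'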
Could anyone help me resolve this? It would be much appreciated.
Thanks!