
(This is for Zabbix 2.2.2.)

I have a discovery rule that can return 100+ items that I'd like to monitor. Each item's data can be gathered by a UserParameter invoking a separate executable.

Implementing these as passive checks eats the zabbix agent alive, spawning an external process for each item.

The data for all the items could easily be collected all at once and sent to the Zabbix server in bulk (via zabbix_sender). I don't know how to implement this properly. The documentation doesn't answer the (potentially stupid) questions I have.

How can I have the Zabbix agent make a single external invocation of my custom data-gathering script and send back all the data in bulk for the 100+ discovered items as well as not invoking a client-side action for every item? What I want to avoid is having the Zabbix agent attempt to collect each item individually.

The discovery rule has to create the items. As I understand it, each item will result in the zabbix agent attempting to do something to collect its data. Is there a way I can prevent this or associate a group of items with a single client-side active check?

Part of what I don't understand is how active checks are supposed to be implemented. If one active check can send back a batch of items' data, how should the items be defined for that host (so that the agent doesn't attempt to gather each item's data individually), and what item should be defined to invoke the active check (or whatever item type is appropriate) that does the actual data gathering but doesn't store any data itself? This is the real question the documentation doesn't answer for me: how does Zabbix intend for me to implement active checks, or use zabbix_sender to send bulk updates for items that the agent would otherwise try to gather on its own?

mojo
  • I think you either have to have 100 separate checks, or one "entity" that checks 100 things at once and is considered up/down for the whole bunch. Don't quote me on that, though. – Hyppy Jan 22 '15 at 20:52
  • What kind of system are you gathering data from? I do something similar with scripts on the client side being run as scheduled tasks to gather and send multiple values. If that is an option for you I can post a detailed answer. – Grant Jan 22 '15 at 22:00
  • @Grant It's on a Windows machine. I was hoping to avoid scheduled tasks because they won't allow for remote control from zabbix if I want to, say, change how frequently the data is gathered. – mojo Jan 23 '15 at 04:51
  • You are probably looking for - https://support.zabbix.com/browse/ZBXNEXT-3006 – ritzk Jan 29 '18 at 14:32

1 Answer


A basic Zabbix concept: one item (check) = one value. The Zabbix server doesn't provide parser functionality for splitting multiple values out of a single check, so an item must always return exactly one value.

If you want to return more than one value, you have to use a workaround; see zabbix UserParameter return 2 or more values

Zabbix sender example:

Allow remote command execution in the Zabbix agent (`EnableRemoteCommands=1` in `zabbix_agentd.conf`) and create an item (active or passive) that runs your script:

system.run["myscript.sh > output_for_zabbix_sender.txt; zabbix_sender -s <host_name> -z <zabbix_server> -i output_for_zabbix_sender.txt",nowait]

This executes the command in nowait mode, so the agent returns immediately without waiting for the script to finish. myscript.sh must produce output in this format:

hostname key1 value1
hostname key2 value2
...

This output is then sent to the Zabbix server by zabbix_sender. The item type for the discovered items must be Zabbix trapper in this case, since the values arrive via zabbix_sender. Alternatively, you don't need the system.run item at all: you can run the script and zabbix_sender as a cron job.
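A minimal sketch of what myscript.sh could look like, assuming filesystem usage as the example metric; the host name "myhost" and the item key `custom.fs.used[...]` are hypothetical names, not from the answer:

```shell
#!/bin/sh
# Hypothetical myscript.sh: gather values for many discovered items in one
# pass and emit them in zabbix_sender's input format:
#   <hostname> <key> <value>
HOSTNAME_IN_ZABBIX="myhost"   # must match the host name configured in Zabbix

gather() {
    for mount in / /usr /var; do
        # Placeholder collection step; replace with your real data gathering.
        used=$(df -P "$mount" | awk 'NR==2 { sub("%", "", $5); print $5 }')
        echo "$HOSTNAME_IN_ZABBIX custom.fs.used[$mount] $used"
    done
}

gather
```

Redirect the output to a file and feed it to zabbix_sender with `-i`, as in the system.run example above.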

If you need a robust monitoring solution, don't forget to handle errors, minimize IOPS, and so on.

You can also use zabbix_sender for discovery itself - just follow the documentation for the required low-level discovery format, e.g.:

hostname discovery_key {"data":[{"{#ID}": "/"},{"{#ID}":"/usr"},{"{#ID}":"/var"}]}
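The discovery line above can be built in the same script. A sketch, assuming a trapper-type discovery rule; the key `custom.fs.discovery` and host name "myhost" are assumed names:

```shell
#!/bin/sh
# Hypothetical sketch: build the low-level discovery JSON in the
# "<hostname> <discovery_key> <json>" format that zabbix_sender expects.
HOSTNAME_IN_ZABBIX="myhost"

build_discovery() {
    json='{"data":['
    sep=''
    for mount in / /usr /var; do
        # One {"{#ID}": "<value>"} entry per discovered object.
        json="${json}${sep}{\"{#ID}\": \"${mount}\"}"
        sep=','
    done
    json="${json}]}"
    echo "$HOSTNAME_IN_ZABBIX custom.fs.discovery $json"
}

build_discovery
# Send it to the server:
#   build_discovery | zabbix_sender -z <zabbix_server> -i -
```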
Jan Garaj
  • Another method (the one I chose to go with) is very similar to this. I created a single boolean item that invoked a UserParameter script that did all the data gathering (for the multitude of items) and submitted the data via zabbix_sender and returned success/failure to Zabbix. It's a little wasted space (storing the success/failure of the aggregation/send process), but it allows me to control the frequency of the queries from the Zabbix web app instead of modifying a scheduled task on each Windows machine. – mojo Jan 26 '15 at 15:46
  • I guess the base answer to my question (how is this supposed be used) is: use item type `trapper` for discovered items and run a cron job/scheduled task on the server to gather/send the data. – mojo Jan 26 '15 at 15:48