
I'd like to monitor the number of concurrent PostgreSQL connections on my Zabbix server, so I've created a cron job that writes the COUNT of rows in pg_stat_activity to a file, which Zabbix reads once a minute.
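For reference, the job is roughly this crontab entry (the user, database, and output path here are just placeholders, not my real setup):

```
* * * * * psql -U zabbix -d mydb -Atc "SELECT count(*) FROM pg_stat_activity;" > /tmp/pg_connections.txt
```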

My problem is that I might have a scenario where I get a COUNT of, say, 10, then have a quick peak of 50 connections, and drop back to 10 before the next COUNT runs. In this case the peak would go unnoticed.

I've wondered about keeping some counter that is incremented/decremented on each connection/disconnection, but couldn't figure out how to do it. I could also run the COUNT at a higher frequency and keep a per-minute average, but that does not solve the problem either.

Any thoughts on the matter?

Thanks, Gabriel

1 Answer


Use log files. Here is a quick tutorial for Linux.

1)

Find out where the postgresql.conf file is located:

postgres=# show config_file;
┌──────────────────────────────────────────┐
│               config_file                │
├──────────────────────────────────────────┤
│ /etc/postgresql/9.5/main/postgresql.conf │
└──────────────────────────────────────────┘

2)

Find and edit the following parameters in it (keep a copy of the original somewhere first):

log_connections = on
log_disconnections = on
log_destination = 'csvlog'
logging_collector = on

3)

Restart PostgreSQL:

sudo service postgresql restart

(A reload is not enough here: logging_collector can only be changed at server start, so a full restart is required.)

4)

Find out where the logs are stored:

postgres=# show data_directory; show log_directory;
┌──────────────────────────────┐
│        data_directory        │
├──────────────────────────────┤
│ /var/lib/postgresql/9.5/main │
└──────────────────────────────┘

┌───────────────┐
│ log_directory │
├───────────────┤
│ pg_log        │
└───────────────┘

5)

Almost done. In the files /var/lib/postgresql/9.5/main/pg_log/*.csv you will find records of connections and disconnections. It is up to you how to process this information.
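For example, since the log has one line per connection and one per disconnection, you can replay it with a running counter to recover the peak concurrency between Zabbix polls — which is exactly what the per-minute COUNT misses. A minimal sketch (the sample lines below are simplified stand-ins for real csvlog records; only the message text matters for the pattern match):

```shell
# Replay connection/disconnection events in log order:
# +1 on "connection authorized", -1 on "disconnection",
# and remember the highest value seen.
peak=$(awk '
  /connection authorized/ { cur++; if (cur > max) max = cur }
  /disconnection/         { cur-- }
  END { print max }
' <<'EOF'
2016-10-12 10:00:01 UTC,...,"connection authorized: user=app database=app"
2016-10-12 10:00:02 UTC,...,"connection authorized: user=app database=app"
2016-10-12 10:00:03 UTC,...,"connection authorized: user=app database=app"
2016-10-12 10:00:04 UTC,...,"disconnection: session time: 0:00:02.000 user=app"
2016-10-12 10:00:05 UTC,...,"connection authorized: user=app database=app"
EOF
)
echo "$peak"
```

In real use you would feed it the current *.csv file instead of the here-document, and add the count of sessions already open when the log window starts.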

Abelisto