
I need a way to measure how many times different files have been accessed, and when. My setup consists of several low-capacity servers spread around the world, each running Varnish behind Nginx (for SSL support). No content is actually stored on the servers; it is fetched from a central storage point and cached in Varnish for up to an hour.

What I need is to measure how many times a day any given file has been accessed. I figured this should be fairly easy to do with access logs, but I need some way to aggregate the log data into a common database.

I have no idea how best to solve this. I can script an aggregator in PHP or Python if necessary, and have thought of pointing it at a FIFO file or pipe from Nginx, but I can't figure out how to make that work remotely.
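For the local half of that idea, a per-file daily count can be built from combined-format access-log lines with a few lines of Python. This is only a sketch under the assumption that the logs use Nginx's default "combined" format; the log path and format may differ on your servers.

```python
import re
from collections import Counter

# Matches the date portion of the timestamp and the request path in a
# combined-format access log line, e.g.:
#   1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /img/a.png HTTP/1.1" 200 2326
LINE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\]\s+"[A-Z]+\s+(\S+)')

def daily_counts(lines):
    """Return a Counter mapping (date, path) -> number of accesses."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.search(line)
        if m:
            counts[(m.group(1), m.group(2))] += 1
    return counts
```

Feeding this a log file (or a FIFO being written by Nginx) yields per-day, per-path hit counts that can then be shipped to a central database; the remote-collection part is what the answer below addresses.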

1 Answer


You can send the log files to a central log server with syslog-ng.

On the Varnish machines do something like this:

 source s_varnish {
   file("/var/log/varnish.log" flags(no-parse) program_override("varnish"));
 };
 destination d_logserver {
   tcp("123.456.789.012");
 };
 log {
   source(s_varnish);
   destination(d_logserver);
 };

Where 123.456.789.012 is a placeholder for your central log server (substitute its real IP address or hostname).

And on your log server, you can write all the incoming logs to a single file with something like this:

 source s_all {
   tcp(ip(0.0.0.0) port(514));
 };
 destination d_all {
   file("/var/log/all.log" create_dirs(yes));
 };
 log {
   source(s_all);
   destination(d_all);
 };