
I have about 1,000 sites on multiple Windows clusters. The IIS log files (plain text) for each site are written to a set location on the server. The solution I am looking for should be able to do the following:

1) Push the log files into the cloud.

2) Read those log files, aggregate data such as visits, views, and hits by URL, and store the results in the cloud for ease of reporting.

3) Make the aggregated data accessible to a third-party reporting solution.

First, I am trying to figure out what my options are and what kind of setup I need.

sam

1 Answer


I have a similar use case to yours. I use NXLog -> Logstash -> Elasticsearch -> Kibana (the ELK stack). I have found this solution easy to use and to scale. My use case is heavily concerned with scale (>40,000 logs/second) and high availability (no downtime), and this has been a great solution. I'll address your points in the order you raised them:

  1. We use NXLog to ship the log files (a minimal config sketch follows this list).

  2. NXLog ships these logs to Logstash, which forwards them on to Elasticsearch (also sketched below).

  3. Which third-party reporting solution? Could you give more information, and maybe I can help with that too?
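
If it helps, here is a minimal sketch of the NXLog (Community Edition) configuration for the Windows web servers. The IIS log path, the Logstash host name, and the port are assumptions; adjust them for your environment.

    ## nxlog.conf -- minimal sketch for shipping IIS W3C logs to Logstash
    define ROOT C:\Program Files (x86)\nxlog
    Moduledir %ROOT%\modules
    CacheDir  %ROOT%\data
    Pidfile   %ROOT%\data\nxlog.pid
    SpoolDir  %ROOT%\data
    LogFile   %ROOT%\data\nxlog.log

    <Input iis_logs>
        Module  im_file
        # one W3SVC* folder per site under the default IIS log location
        File    'C:\inetpub\logs\LogFiles\W3SVC*\u_ex*.log'
        SavePos TRUE
    </Input>

    <Output logstash>
        Module  om_tcp
        Host    logstash.example.com    # placeholder host
        Port    5140
    </Output>

    <Route iis_to_logstash>
        Path    iis_logs => logstash
    </Route>

And a matching Logstash pipeline sketch that parses the W3C lines and indexes them into daily Elasticsearch indices. The grok pattern assumes a default W3C field set; check the #Fields: header line in your logs and adjust the pattern to match exactly. The Elasticsearch host is again a placeholder.

    ## logstash.conf -- minimal sketch
    input {
      tcp {
        port => 5140
      }
    }

    filter {
      # IIS writes #-prefixed W3C header lines at the top of each file
      if [message] =~ /^#/ {
        drop {}
      }
      # field list must mirror the #Fields: line in your IIS logs
      grok {
        match => {
          "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:server_ip} %{WORD:method} %{URIPATH:uri_path} %{NOTSPACE:uri_query} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:client_ip} %{NOTSPACE:user_agent} %{NUMBER:status} %{NUMBER:substatus} %{NUMBER:win32_status} %{NUMBER:time_taken}"
        }
      }
    }

    output {
      elasticsearch {
        hosts => ["elasticsearch.example.com:9200"]   # placeholder host
        index => "iis-logs-%{+YYYY.MM.dd}"
      }
    }

With daily indices like iis-logs-YYYY.MM.dd, Kibana or a reporting tool can then query visits, views, and hits by URL directly.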


EDIT: Chartio integrates with Elasticsearch. You asked, "Where in the process can I aggregate the data and store it so it is easily consumable by a reporting application?" Elasticsearch does the aggregation and the storing: you index the data into Elasticsearch using the stack detailed above, and then integrate a reporting application such as Chartio on top of it.
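
To make that concrete, here is a sketch of a terms aggregation that counts hits by URL across the daily indices from the sketch above. The index pattern and field name come from that sketch, and the .raw sub-field assumes the default Logstash index template, which adds a not_analyzed .raw variant of each string field:

    curl -XPOST 'http://elasticsearch.example.com:9200/iis-logs-*/_search?size=0' \
         -H 'Content-Type: application/json' -d '
    {
      "aggs": {
        "hits_by_url": {
          "terms": { "field": "uri_path.raw", "size": 10 }
        }
      }
    }'

A reporting tool that integrates with Elasticsearch, such as Chartio, can issue queries like this on demand, so nothing has to be pre-aggregated outside Elasticsearch.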

fylie
  • In a similar use case (but with fewer logs, max 1,000/s), I found that putting a Redis cache in front of Logstash was useful (see the sketch below). – baudsp Sep 23 '16 at 07:45
  • We will most likely use Chartio/Looker for reporting. But first, I have more questions: where in the process can I aggregate the data and store it so it is easily consumable by the reporting application? – sam Sep 23 '16 at 15:01
  • I am starting to implement this and have a few more questions... Loggly plans seem pretty expensive. Is there another option? I am confused between NXLog and Loggly - what is the purpose of each one? Why isn't ELK alone enough for what I want to do? How does the Amazon Elasticsearch Service fit into what I want to do? – sam Sep 26 '16 at 16:35
  • You can use the open-source ELK stack - that should be perfect for what you are doing. I linked Elastic's documentation for installation. NXLog is a Windows agent that ships logs to ELK. Loggly is a paid, cloud-based log management provider that uses the ELK stack and adds more features to it. – fylie Sep 26 '16 at 18:14
  • I am trying to figure out the system requirements for the web servers (the shipper/Logstash forwarder) and for the Logstash server. I am looking for a solution that requires very few changes (not too many dependencies) on the web servers, which run Windows. Do I need to install Java on all the web servers to work with Logstash? – sam Sep 26 '16 at 21:36
  • You only need to install NXLog on the Windows servers; Logstash, Elasticsearch, and Kibana run on a separate server (or servers). – fylie Sep 26 '16 at 23:13
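
For reference, a minimal sketch of the Redis buffer baudsp mentions. One common arrangement is a lightweight Logstash shipper that pushes events onto a Redis list, with the indexing Logstash draining the same list; the host name and list key below are placeholders.

    # shipper side: buffer incoming events in a Redis list
    output {
      redis {
        host      => "redis.example.com"   # placeholder host
        data_type => "list"
        key       => "iis-logs"
      }
    }

    # indexer side: drain the same list before filtering and indexing
    input {
      redis {
        host      => "redis.example.com"   # placeholder host
        data_type => "list"
        key       => "iis-logs"
      }
    }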