
We have a website that produces roughly 100GB of IIS web log files every month. We would like to get statistics from these logs, for instance the most visited URLs, the authenticated users with the most requests, etc. We would also like some basic drill-down functionality, for instance to see which URLs the most frequent user accesses.

We've tested a few different tools for this but haven't found one that doesn't run into performance issues. For example, we tried SmarterStats and gave it 500GB of logs, but it ran at 100% CPU usage for days and then the service crashed. I know about Microsoft Log Parser, but what I'm looking for is a tool that generates easy-to-understand graphs and allows me to drill down.

Is there a tool that can accomplish this and handle terabytes of log files?

nitramk

1 Answer


I've used IIS Log Parser; after that, the logs can be sent over to MSSQL with SSIS, and from the database all kinds of graphs can be created.
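
As a rough sketch of that pipeline: Log Parser 2.2 can also write straight into SQL Server with its -o:SQL output format, which can stand in for the SSIS step in a simple setup. The paths, server, database, and table names below (D:\Logs\W3SVC1, SQLBOX01, WebStats, IisRequests) are assumptions, not anything from this particular setup:

```powershell
# Load IIS W3C logs into SQL Server with Log Parser 2.2.
# All paths and names below are hypothetical placeholders.
$logParser = 'C:\Program Files (x86)\Log Parser 2.2\LogParser.exe'

$query = @"
SELECT TO_TIMESTAMP(date, time) AS LogTime,
       cs-uri-stem AS Url,
       cs-username AS UserName,
       sc-status   AS Status
INTO   IisRequests
FROM   'D:\Logs\W3SVC1\*.log'
"@

& $logParser $query -i:IISW3C -o:SQL -server:SQLBOX01 `
    -database:WebStats -driver:"SQL Server" -createTable:ON

# With the rows in SQL Server, "most visited URLs" is a plain GROUP BY
# (Invoke-Sqlcmd ships with the SqlServer PowerShell module):
Invoke-Sqlcmd -ServerInstance 'SQLBOX01' -Database 'WebStats' -Query @"
SELECT TOP 20 Url, COUNT(*) AS Hits
FROM IisRequests
GROUP BY Url
ORDER BY Hits DESC;
"@
```

Log Parser streams its input rather than loading it all into memory, but at the volumes described in the question you'd probably still want to load the logs in daily batches rather than in one run.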

Initially, to sort the logs and prepare them for Log Parser, I used PowerShell to order them, and 7-Zip to transport them between servers.
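
A minimal sketch of that preparation step, assuming the logs sit under D:\Logs\W3SVC1 and 7-Zip is in its default install location (both assumptions):

```powershell
# Order the raw IIS logs oldest-first so they are processed chronologically.
# $logDir, the archive path, and the 7-Zip path are placeholders; adjust
# for your environment.
$logDir   = 'D:\Logs\W3SVC1'
$sevenZip = 'C:\Program Files\7-Zip\7z.exe'

$logs = Get-ChildItem -Path $logDir -Filter '*.log' |
    Sort-Object LastWriteTime

# Pack the ordered logs into a single archive for transfer between servers.
& $sevenZip a 'D:\Transfer\iis-logs.7z' $logs.FullName
```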

Alex H
  • How many terabytes of data have you tried analysing this way? – nitramk Sep 20 '13 at 11:31
  • In raw log file format, the initial size is a few GB daily, for about 2 years now. When they reach the database you can shrink the data, optimize it, and so on... – Alex H Sep 20 '13 at 11:38
  • By the way, I think you should state how much disk space the logs occupy and how much is processed daily; it might help others understand how many logs you have to parse. – Alex H Sep 20 '13 at 11:48