We have processes that produce logs using log4j, and some of the log entries are supposed to be loaded into a database for analysis and reporting (right now everything goes to files). The problem is that some of those processes don't have access to the database. So the idea is that every process produces a file, which is then sent to/read by another process that has access to the DB.

The preferred format for this file is a standard log4j text format, so that the same file can be used both by the process that loads it into the DB and by real people for reading it. So the question is: is there an existing log file parser (ideally a Java library)? We don't want to invest time writing a parser.

Another solution would be to generate two files, one for reading by humans, and the other with, for instance, serialized log4j logging events that could be easily deserialized, but for now my management is not buying this...
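The two-file idea can be sketched with plain Java serialization. Note that `LogRecord` below is a made-up stand-in for log4j's `LoggingEvent` (which is itself `Serializable`), and the round trip goes through a byte array rather than the actual second file:

```java
import java.io.*;

// Stand-in for log4j's LoggingEvent: any Serializable class would do.
class LogRecord implements Serializable {
    private static final long serialVersionUID = 1L;
    final long timestamp;
    final String level;
    final String message;

    LogRecord(long timestamp, String level, String message) {
        this.timestamp = timestamp;
        this.level = level;
        this.message = message;
    }
}

public class LogRoundTrip {
    // Serialize one record; in the real setup this would append to the second file.
    static byte[] write(LogRecord r) {
        try (ByteArrayOutputStream bytes = new ByteArrayOutputStream();
             ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(r);
            out.flush();
            return bytes.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // The DB-side process deserializes instead of parsing text.
    static LogRecord read(byte[] data) {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(data))) {
            return (LogRecord) in.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }
}
```

The DB-side process gets the fields back typed and intact, with no text parsing at all; the cost is that the second file is no longer human-readable.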

There may also be other solutions that I'm not seeing, so any suggestion is welcome.

Xavier
  • Here is a link to a similar question: http://stackoverflow.com/questions/2327073/parse-a-log4j-log-file – IllegalArgumentException Feb 08 '12 at 19:34
  • I saw that one, but the answers refer to either another appender (which I want to avoid), UI tools (Sawmill, Chainsaw) or manual parsing, and I want to avoid all of these. – Xavier Feb 08 '12 at 20:18
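For reference, the "manual parsing" option rejected above could be sketched in a few lines with `java.util.regex`. The PatternLayout assumed here (`%d{ISO8601} %-5p %c - %m%n`) is an assumption; the regex would have to be adapted to whatever pattern the appenders actually use:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/** Parses one log line produced by the log4j pattern "%d{ISO8601} %-5p %c - %m%n". */
public class Log4jLineParser {
    // ISO8601 timestamp, padded level, logger name, then " - " and the message.
    private static final Pattern LINE = Pattern.compile(
        "^(\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2},\\d{3})\\s+(\\w+)\\s+(\\S+) - (.*)$");

    /** Returns {timestamp, level, logger, message}, or null for a continuation line. */
    public static String[] parse(String line) {
        Matcher m = LINE.matcher(line);
        if (!m.matches()) {
            return null; // stack-trace line etc.: append it to the previous message
        }
        return new String[] { m.group(1), m.group(2), m.group(3), m.group(4) };
    }
}
```

The null case is the part that makes hand-rolled parsers annoying in practice: multi-line messages and stack traces force the caller to keep state across lines.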

3 Answers


You can try http://code.google.com/p/jlibs/source/browse/trunk/greplog/src/main/java/jlibs/util/logging/LogParser.java

The grepLog module in jlibs provides grep functionality for log files. You have to define what your log record looks like in an XML file. You can find the schema for this at http://code.google.com/p/jlibs/source/browse/trunk/greplog/resources/schemas/header.xsd

Santhosh Kumar Tekuri

There are a lot of "log management" products; a simple Google search should give a lot of hits. I've heard good things about Splunk and Sawmill, but I haven't tried them out in a real production environment.

http://splunk-base.splunk.com/answers/767/how-do-i-setup-splunk-to-index-log4j-with-orgapachelog4jrollingfileappender

Peter Svensson
  • That's not really what I'm looking for. From what I understand, splunk or sawmill are loading the data in their own structure/DB, and then you use their reporting/analyzing tools. Here, I want to load the data in our own custom log tables, to use internal, pre-existing tools that are using these tables. – Xavier Feb 08 '12 at 20:15
  • If you have a proprietary format/db to adhere to I think you're stuck with writing your own parser/db implementation.. – Peter Svensson Feb 08 '12 at 20:21

You can write a custom appender that generates SQL statements you can run later, or some other easily parseable format.
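A minimal sketch of the SQL-generating part of that idea: in a real log4j 1.x appender this method would be called from `append(LoggingEvent)` with fields taken from the event. The table and column names (`app_log`, `ts`, `level`, `logger`, `message`) are made up, and the escaping here is just single-quote doubling; replaying through prepared statements would be safer:

```java
/** Formats one log record as an INSERT statement for later replay on the DB side. */
public class SqlLogFormatter {
    public static String toInsert(String timestamp, String level,
                                  String logger, String message) {
        return String.format(
            "INSERT INTO app_log (ts, level, logger, message) VALUES ('%s', '%s', '%s', '%s');",
            escape(timestamp), escape(level), escape(logger), escape(message));
    }

    // Minimal escaping: double any single quotes inside the value.
    private static String escape(String s) {
        return s.replace("'", "''");
    }
}
```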

Erik Ekman
  • I want to have the same file containing both the data to be parsed and the data to be human-readable. SQL statements are somewhat human-readable, but not really user-friendly when it comes to reading logs. – Xavier Feb 08 '12 at 21:47