
I have been running a large-ish site for years, with a typical nginx/Apache setup where all the "pages" are mod_perl. Up until recently, I was running on FreeBSD. After a hardware replacement, and for other reasons, I migrated to Ubuntu (12.04.2 LTS), which I use on many other servers, so no big deal. However, I now have a problem with my logs: more and more "actions" are no longer being logged through Log4Perl. This was never a problem on my previous setup, but now I seem to "lose" between 2% and 15% of my log entries. I have checked and verified this by logging the same data to a database at the same time.

Does anyone have a clue why this would happen? Is there something I should know about large log files on Ubuntu? (It's not that large, really: 390 MB at the moment.) I get nothing in my error logs anywhere, and since the database logging happens AFTER the $log->info("ENTRY HERE"), the script obviously doesn't crash. But I am missing a lot of those ENTRY HEREs :) The log in question is "hit" about once per second on average, which I wouldn't expect to be a big problem. Could there be too many processes trying to write to the log in parallel, causing locking issues that prevent data from being appended to the file? Are there any typical Ubuntu settings that might need adjusting for something like this?
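
On the parallel-write idea: would switching the file appender to unbuffered syswrite help? Here is a minimal sketch of what I mean (the logger name and log path are placeholders, not my real config):

    use strict;
    use warnings;
    use Log::Log4perl;

    my $conf = q(
        log4perl.logger.Site.Hits          = INFO, LOGFILE

        log4perl.appender.LOGFILE          = Log::Log4perl::Appender::File
        log4perl.appender.LOGFILE.filename = /var/log/site/hits.log
        log4perl.appender.LOGFILE.mode     = append
        # syswrite = 1 emits each entry as a single unbuffered write(2),
        # so concurrent mod_perl children don't interleave or lose lines
        # to PerlIO buffering.
        log4perl.appender.LOGFILE.syswrite = 1
        log4perl.appender.LOGFILE.layout   = PatternLayout
        # %P logs the PID, which would show which child wrote each line.
        log4perl.appender.LOGFILE.layout.ConversionPattern = %d %P %p %m%n
    );

    Log::Log4perl::init(\$conf);
    my $log = Log::Log4perl->get_logger('Site.Hits');
    $log->info("ENTRY HERE");

If plain appends still went missing, I understand Log::Log4perl::Appender::Synchronized can wrap the file appender behind a semaphore, at the cost of serializing every write.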

Any help would be greatly appreciated.

  • Where's the code? http://sscce.org/ http://www.chiark.greenend.org.uk/~sgtatham/bugs.html#showmehow – daxim Sep 25 '13 at 08:15
  • It would be helpful also to see the log4perl config. – jane arc Nov 04 '13 at 18:20
  • Also, how much are you logging? I'm assuming you're logging to syslog. By default, Ubuntu's syslog daemon starts dropping log entries from apps that log "too much" (and makes a note of that in the log file). – Martijn Jan 18 '14 at 07:40
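
Following up on the syslog comment above: I believe my appender writes straight to a file rather than through syslog, but in case it does route through syslog, Ubuntu 12.04's rsyslog rate-limits chatty processes via imuxsock by default (200 messages per 5-second window) and notes "imuxsock begins to drop messages ... due to rate-limiting" in /var/log/syslog when it drops entries. A sketch of how to relax that, assuming the stock /etc/rsyslog.conf:

    # /etc/rsyslog.conf on Ubuntu 12.04 (rsyslog 5.8.x),
    # placed after the $ModLoad imuxsock line.
    # An interval of 0 disables the per-process rate limiting.
    $SystemLogRateLimitInterval 0

    # then: sudo service rsyslog restart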

0 Answers