
I am looking to create a cron job that opens a directory, loops through all the logs I have created, and deletes all lines but keeps the last 500, for example.

I was thinking of something along the lines of

tail -n 500 filename > filename

Would this work?

I'm also not sure how to loop through a directory in bash.

Thanks in advance.

icelizard

2 Answers


Have you heard of logrotate? I don't think erasing lines from logs by hand is a good idea.
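For reference, a minimal logrotate sketch; the `/etc/logrotate.d/myapp` path and log pattern are assumptions, and note that logrotate works by age or size rather than keeping an exact line count:

```
# /etc/logrotate.d/myapp  -- path and glob are placeholders
/var/log/myapp/*.log {
    daily           # rotate once per day
    rotate 7        # keep 7 rotated copies
    compress        # gzip old rotations
    missingok       # no error if a log is absent
    notifempty      # skip rotation of empty logs
    copytruncate    # truncate in place so daemons keep their file descriptor
}
```

logrotate runs from the system's daily cron, so no extra cron job is needed.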

gogiel
for file in *; do tail -n 500 "$file" > "$file.2" && mv "$file.2" "$file"; done

Writing to the file you're reading from is not a good idea: the redirection truncates the file before tail ever reads it. My solution isn't ideal either, as you may lose log entries written between the tail and the mv. But anyway,

for file in *;

is the way to loop over the files in a directory.
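Put together with quoting and a guard against non-files, the whole thing might look like this sketch (the `trim_logs` name is just for illustration, not a standard tool):

```shell
#!/bin/sh
# Keep only the last N lines of every regular file in a directory.
# trim_logs is an illustrative name; call it from cron with your log dir.
trim_logs() {
    dir=$1
    keep=${2:-500}
    for file in "$dir"/*; do
        [ -f "$file" ] || continue   # skip directories and unmatched globs
        tail -n "$keep" "$file" > "$file.tmp" && mv "$file.tmp" "$file"
    done
}

# Example cron usage:
#   trim_logs /var/log/myapp 500
```

The `&&` keeps the original file intact if tail fails, and quoting `"$file"` handles names with spaces.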

But why don't you use logrotate?

radius
    `... filename > filename2 && mv ...` instead of a semicolon will prevent the original file from being lost if the `tail` fails for some reason. – Dennis Williamson May 21 '10 at 12:08
  • Actually, filename would get truncated before the tail even ran. – MikeyB Feb 09 '11 at 01:41
  • `mv filename2 filename` is definitely NOT what you want if a process has `filename` open, as is frequently the case with daemon logs. If you do that the file gets overwritten but the daemon's file descriptor doesn't change (it still points to the old file), so it effectively ceases to log. `cat filename2 > filename` truncates the content with filename2's, which is probably more like what you want to do. – Eduardo Ivanec Apr 27 '11 at 12:58
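The truncate-in-place variant from the last comment can be sketched as follows (the `trim_in_place` name is illustrative; `cat` overwrites the contents without replacing the inode, so a daemon's open file descriptor stays valid):

```shell
#!/bin/sh
# Trim a log to its last 500 lines without replacing the inode,
# so a daemon that holds the file open keeps logging to it.
trim_in_place() {
    file=$1
    tail -n 500 "$file" > "$file.tmp" &&
        cat "$file.tmp" > "$file" &&   # same inode, new contents
        rm -f "$file.tmp"
}
```

There is still a small window between the tail and the cat in which new log lines can be lost, the same caveat as with the mv approach.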