
I have a Linux process generating very big log files. Those files could grow indefinitely if I did nothing. Is there any way to limit a file's size and make it act like a kind of FIFO buffer, keeping only a certain amount of data?

I also tried logrotate, but it doesn't rotate as soon as the file reaches a given size. Log files can grow very fast, and I don't want to wait for a daily logrotate run.

Thanks for your help.

user194998

2 Answers


You can run logrotate with a config file written specifically for the log file in question and put it in a cron job that runs more often, e.g. every hour or every 15 minutes. A sketch is shown below.

See man logrotate.
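A minimal sketch, assuming the log lives at /var/log/myapp.log; the config, state-file, and cron paths are placeholders you would adapt to your system:

    # /etc/logrotate-myapp.conf (example path)
    /var/log/myapp.log {
        size 100M       # rotate once the file exceeds 100 MB
        rotate 5        # keep at most 5 rotated files
        compress
        missingok
        notifempty
    }

    # /etc/cron.d/logrotate-myapp (example) -- check every 15 minutes
    */15 * * * * root /usr/sbin/logrotate -s /var/lib/logrotate/myapp.state /etc/logrotate-myapp.conf

Using a separate state file (-s) keeps these frequent runs from interfering with the distribution's daily logrotate cron job.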

Sven

Try:

   /var/log/filexxx {
       rotate 5
       weekly
       size 100k
   }

This always keeps 5 rotated files and rotates either weekly or when the log reaches 100k in size (whichever happens first).
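If you want to check such a config before relying on it, you can run logrotate against it by hand (the path below is an example):

    # dry run: print what logrotate would do, without rotating anything
    logrotate -d /etc/logrotate.d/filexxx

    # force a rotation immediately, regardless of the size/weekly conditions
    logrotate -f /etc/logrotate.d/filexxx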

Tomas