
I am trying to write a script that deletes an old file from a directory when the directory exceeds a certain limit. Here is my script:

#!/bin/bash
#incremental backup of upload folder only
LIMIT=2
TIME=`date +%b-%d-%y%s`
FILENAME=backup-$TIME.tgz
SRCDIR=/home/Man/blabla
DESDIR=/home/Man/newdir
EXCFILE=/home/Man/blabla/up
if [ $LIMIT -gt 2 ]; then
    cd $DESDIR
    ls -lt | grep .tgz | tail -n 1 | xargs -r rm
    tar -cvzf $DESDIR/$FILENAME $SRCDIR --exclude=$EXCFILE
else
    tar -cvzf $DESDIR/$FILENAME $SRCDIR --exclude=$EXCFILE
fi

But it is not working: it creates the backup but does not delete the old file after the directory exceeds the limit.

Mansoor

2 Answers


Your question is unclear (we can only guess what "limit" means), and it lacks motivation.

At the system call level (see syscalls(2)), the relevant system calls are setrlimit(2) with RLIMIT_FSIZE, then the signal(7)-related ones (e.g. sigaction(2)) with SIGXFSZ.

So you might use the ulimit builtin and the trap one. However, a program executed by your shell script might change the file size limit itself, and could catch the SIGXFSZ signal.

Beware: the size of a directory (as given by stat(2), hence stat(1) and ls(1)) is not the accumulated size of the files inside it (some files could be hard-linked into several directories; files are inodes, see inode(7)). It is just the size of the directory entries.

Alternatively, you might compute the accumulated size of the files (using du(1), find(1), or gawk(1)) and take some action according to it, perhaps deleting some big files. But consider, or explicitly decide to ignore, other processes writing files or directories while your script runs.
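A minimal sketch of that du-based approach, deleting the oldest .tgz backups until the directory's accumulated file size drops under a threshold (DESDIR and LIMIT_KB are illustrative values, not from the question's requirements):

```shell
#!/bin/bash
# Sketch: trim oldest .tgz files until du reports the directory under
# LIMIT_KB. Assumes GNU find (-printf); values are illustrative.
DESDIR=${DESDIR:-/home/Man/newdir}
LIMIT_KB=${LIMIT_KB:-1048576}          # 1 GiB, in KiB as reported by du -sk
used=$(du -sk "$DESDIR" | cut -f1)
while [ "$used" -gt "$LIMIT_KB" ]; do
    # oldest first: find prints "mtime path", sort ascending, take line 1
    oldest=$(find "$DESDIR" -maxdepth 1 -name '*.tgz' -printf '%T@ %p\n' |
             sort -n | head -n 1 | cut -d' ' -f2-)
    [ -n "$oldest" ] || break          # nothing left to delete
    rm -- "$oldest"
    used=$(du -sk "$DESDIR" | cut -f1)
done
```

Note that du measures allocated blocks, so the loop also accounts for filesystem overhead, and it is still racy if another process writes into the directory at the same time.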

Perhaps you could be interested in disk quotas (see quotactl(2), quota(1), quotacheck(8)).

You could do incremental backups. You might use utilities other than tar(1) for that purpose (e.g. dar, afio, ...). Notice that mv(1), cp(1), and tar have a --backup option. Perhaps you might consider logrotate(8) to deal with growing log files.
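With GNU tar, a true incremental backup looks like the sketch below: tar records file metadata in a snapshot file (backup.snar is an illustrative name), and the next run with the same snapshot archives only what changed. Paths mirror the question's but are assumptions:

```shell
#!/bin/bash
# Sketch: GNU tar incremental backup via --listed-incremental.
# The first run is a full (level 0) backup; later runs with the same
# snapshot file only archive files that changed since then.
SRCDIR=${SRCDIR:-/home/Man/blabla}
DESDIR=${DESDIR:-/home/Man/newdir}
SNAR="$DESDIR/backup.snar"             # tar's metadata snapshot file
TIME=$(date +%s)                       # epoch seconds keep names unique
tar -czf "$DESDIR/backup-$TIME.tgz" --listed-incremental="$SNAR" "$SRCDIR"
```

Deleting the snapshot file (or copying it aside) lets you restart from a fresh level-0 backup; restoring requires extracting the full archive first, then each incremental one in order.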

Basile Starynkevitch

I didn't use logrotate because that was not the requirement. Anyway, I have solved it myself: I added a count which counts the .tgz files in the directory, and whenever $COUNT exceeds $LIMIT the oldest file gets removed.

LIMIT=3
COUNT=$(ls "$DESDIR" | grep '\.tgz$' | wc -l)
if [ "$COUNT" -gt "$LIMIT" ]; then
    cd "$DESDIR"
    ls -t | grep '\.tgz$' | tail -n "+$(($LIMIT + 1))" | xargs -r rm && tar -cvzf "$DESDIR/$FILENAME" "$SRCDIR" --exclude="$EXCFILE" --exclude="$EXCSNRFILE"
else
    echo "script is not working"
fi
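A caveat on the approach above: parsing ls output breaks on file names with spaces or newlines. A NUL-safe variant of the same rotation, sketched with GNU find and coreutils (the -z/-printf/-0 flags are GNU extensions; DESDIR and LIMIT default to the values used here):

```shell
#!/bin/bash
# Sketch: keep only the LIMIT newest .tgz backups without parsing ls.
# Assumes GNU findutils and coreutils; defaults are illustrative.
DESDIR=${DESDIR:-/home/Man/newdir}
LIMIT=${LIMIT:-3}
find "$DESDIR" -maxdepth 1 -name '*.tgz' -printf '%T@ %p\0' |
    sort -z -n -r |                    # newest first
    tail -z -n "+$((LIMIT + 1))" |     # everything past the newest LIMIT
    cut -z -d' ' -f2- |                # strip the mtime prefix
    xargs -0 -r rm --
```

Because every stage is NUL-terminated, the pipeline handles arbitrary file names, and xargs -r does nothing when there are at most LIMIT backups.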
Mansoor