
I have a shell script running on AIX that backs up each SVN repository every night. The process is as follows:

Download each repository and save it as a dump file:

svnrdump on repo_A

svnrdump on repo_B

svnrdump on repo_X ## X is the subsequent number of repository

After the download completes, the dumps are compressed:

bzip2 --compress repo_A_yyyymmdd.dump

bzip2 --compress repo_B_yyyymmdd.dump

bzip2 --compress repo_X_yyyymmdd.dump ## X is the subsequent number of backup copy

Then I do housekeeping on the backup copies by removing any copy older than 5 days (I only need to keep the 5 latest backup copies):

rm `ls -t repo_A_????????.dump.bz2 | tail -n +6`

rm `ls -t repo_B_????????.dump.bz2 | tail -n +6`

rm `ls -t repo_X_????????.dump.bz2 | tail -n +6` ## X is the subsequent number of backup copy
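For reference, the three steps above might collapse into a single loop like this (a dry-run sketch: the external commands are echoed, since the exact svnrdump options and repository URLs are site-specific):

```shell
#!/bin/sh
# Dry-run sketch of the whole nightly job; drop the "echo"s to run for real.
repos="repo_A repo_B repo_X"
today=$(date +%Y%m%d)
for repo in $repos; do
    echo svnrdump on "$repo"                        # step 1: dump
    echo bzip2 --compress "${repo}_${today}.dump"   # step 2: compress
    # step 3: keep only the 5 newest copies
    old=$(ls -t "${repo}"_????????.dump.bz2 2>/dev/null | tail -n +6)
    if [ -n "$old" ]; then
        echo rm $old
    fi
done
```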

My question

Is there a way to optimize this code by putting those repo name into an array and then execute every command for each repo in a loop?

huahsin68

2 Answers


Try something like this:

First form, array of suffixes:

#!/bin/bash

array=(A B C D E F)
for i in "${array[@]}"
do
    echo svnrdump on repo_$i
    echo bzip2 --compress "repo_$i.dump"
done
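A variation on the first form, if the backups should carry the yyyymmdd stamp used in the question's file names (still a dry-run sketch with echoed commands):

```shell
#!/bin/bash
# Same array loop, plus the yyyymmdd stamp from the question's naming scheme.
array=(A B C D E F)
stamp=$(date +%Y%m%d)
for i in "${array[@]}"
do
    echo svnrdump on "repo_$i"
    echo bzip2 --compress "repo_${i}_${stamp}.dump"
done
```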

Second form, if the repo names are more diverse:

#!/bin/bash

while read REPO; do
    echo svnrdump on  "${REPO}"
    echo bzip2 --compress "${REPO}".dump
done << REPOLIST
repo0
repo1
repo2
repo3
REPOLIST

You will have to add all repos to the REPOLIST.
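If the list keeps growing, the same loop can read from an external file instead of a here-document (repolist.txt is a hypothetical name; one repo per line):

```shell
#!/bin/bash
# Sketch: drive the same loop from a file with one repo name per line.
printf '%s\n' repo0 repo1 repo2 > repolist.txt   # demo data; maintain by hand
while read -r REPO; do
    echo svnrdump on "${REPO}"
    echo bzip2 --compress "${REPO}.dump"
done < repolist.txt
```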

Sven

Or alternatively:

repodir=/some/path
backupdir=/some/otherpath
maxagedays=6
for repo in repo_A repo_B repo_X; do
   svnadmin dump "$repodir/$repo" | bzip2 -9 > "$backupdir/$repo-$(date +%F).dump.bz2"
done
find "$backupdir" -type f -iname '*dump.bz2' -mtime +"$maxagedays" -exec rm -f {} \;
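Note that find -mtime prunes by age rather than by copy count, so a few skipped nights could leave fewer than five copies on disk. The behaviour can be checked with dummy files (demo_dir and the file names below are made up for the demo):

```shell
#!/bin/sh
# Demo of age-based pruning: one stale file, one fresh file.
demo_dir=$(mktemp -d)
touch -t 200001010000 "$demo_dir/repo_A-old.dump.bz2"   # far in the past
touch "$demo_dir/repo_A-new.dump.bz2"                   # just now
find "$demo_dir" -type f -iname '*dump.bz2' -mtime +6 -exec rm -f {} \;
ls "$demo_dir"    # only repo_A-new.dump.bz2 remains
```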
Reiner Rottmann