
I have very limited experience with shell scripting and would like to put in place a script that backs up my blog (WordPress) on a weekly basis.

Here is what I have so far:

#!/bin/bash

# Determine current date
setenv CURDATE date +%Y%m%d

# Backup DB & email it to me
mysqldump dbname -u user -ppassword | gzip | uuencode ${CURDATE}dbname.sql.gz | mail -s "backup for dbname ${CURDATE}" my@email.com

cd /home/myhome

# Zip blog
tar cf - blog.mysite.me | gzip - > ~/backups/${CURDATE}blog.mysite.me.tar.gz

And this is where I am a bit stuck... I was thinking about emailing myself the blog directory, but what happens when it grows bigger than 10 MB or so? How would I script splitting the archive and emailing me the chunks?

Another suggestion I have, which I feel is better, is to FTP the backups to another VPS that I own. For the sake of space, though, I would like to keep only the last 10 backups. How would I implement the part of the script that:

  1. Uploads the backup
  2. Gets the list of files
  3. Gets the count of files in the current directory (e.g. /home/myhome/backups/blog)
  4. If the count is greater than 10, deletes the oldest

Any help/advice or pointers with solving this problem would be much appreciated :)

Malachi

2 Answers


Rather than zipping it and FTPing it, use rsync over the network. If you make a new directory for each date and use the --link-dest option to link it back to the previous directory, rsync will only store the changed files, and files that don't change will be hard links to the previous copy.

I posted some code at "What backup solution you use for linux servers".
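For illustration, here is a minimal sketch of that approach; the remote host otherhost and the /backups/blog paths are placeholders, not taken from the linked post:

#!/bin/bash

# One snapshot directory per day; unchanged files become hard links into
# yesterday's snapshot, so they cost no extra space.
TODAY=$(date +%Y%m%d)
YESTERDAY=$(date +%Y%m%d --date="1 day ago")

# --link-dest is resolved on the receiving side. On the very first run the
# previous directory won't exist and rsync simply copies everything in full.
rsync -a --link-dest=/backups/blog/$YESTERDAY \
    /home/myhome/blog.mysite.me/ \
    otherhost:/backups/blog/$TODAY/

Deleting an old snapshot directory then only frees the files that no newer snapshot still hard-links to.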

Paul Tomblin
  • This is what I'm using for backups. It's really nice: I have over 330GB of data backed up with a week's worth of daily snapshots (i.e. full, not incremental, backups), and yet the filesystem hosting those 7 days of backups uses about the same amount of space as a single copy! Expanding it to more days, or adding weekly and monthly snapshots, would be the same story. – Kromey Apr 19 '11 at 21:45

First, to set a variable from the output of a command, use

CURDATE=$(date +%Y%m%d) 

instead of

setenv CURDATE date +%Y%m%d

If you want to keep only 10 days of backups, you can do that by deleting the backup that was made 10 days ago. Use date's relative-date options to find it and save it to another variable:

PAST_DATE=$(date +"%m-%d-%Y" --date="10 days ago")

Second, you can use lftp to execute FTP commands in a single line, like this:

lftp -u user,pass server -e "mrm *${PAST_DATE}.sql.gz; exit"

If you put the date in the file names and each day remove the files that are 10 days old, you can do exactly what you are thinking. You can use lftp to put the files too; it works like a regular client, just on a single line.
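For example, a sketch of the upload step, reusing the file names from the question (user, pass and server are placeholders):

# Upload today's archive to the other VPS in one non-interactive session
lftp -u user,pass server -e "put /home/myhome/backups/${CURDATE}blog.mysite.me.tar.gz; exit"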

coredump
  • If you have shell access of some kind to the remote host (likely?) you can also use find to delete files older than 10 days: `find /backup/dir -type f -mtime +10 -exec rm -f '{}' \;`. This is probably simpler and more robust than figuring out the filename of the right file to delete. – Eduardo Ivanec Apr 19 '11 at 21:27