
Yesterday my dedicated webserver crashed due to a hardware failure. Leaseweb has replaced it with another server of the same specs, but since it's a different machine it needs to be re-installed. That means I have to make backups before the re-install, as the old server can't simply be cloned.

The server is now booted up with GRML (Linux live system) and I have access to it through SSH and SFTP.

What would be the best way to back everything up? I don't want to miss anything. I need MySQL databases, files and so on.

Any tips?

4 Answers


You could do a local dump of your databases to tarballs and rsync them to another server, like this:

#!/bin/bash
SNAPSHOT_DATE=$(date '+%d%m%y_%Hh%M')
LOCAL_TARBALLS=/tmp

# MySQL credentials -- adapt these to your setup
MUSER="root"
MHOST="localhost"
MPASS="yourpassword"

# COMPRESSION MODE
GZIP="$(which gzip)"
[[ -z $GZIP ]] && aptitude -y install gzip  # for Debian-likes
GZ_COMPRESSION_LEVEL="-9"  # -1=low compression but fast, -9=high compression but slow

# ##############
# MYSQL dump
# ##############
TARBALL="mysql"
mkdir -p "$LOCAL_TARBALLS/$TARBALL"

# guess binary names
MYSQL="$(which mysql)"
MYSQLDUMP="$(which mysqldump)"

# play it safe
if [[ -z $MYSQL || -z $MYSQLDUMP ]]; then
    echo "mysql commands not found"
    exit 1
fi

# record the mysql version
$MYSQL -V > "$LOCAL_TARBALLS/$TARBALL/mysql_version"

DBS="$($MYSQL -u $MUSER -h $MHOST -p$MPASS -Bse 'show databases')"
if [[ -n $DBS ]]; then
    # dump dbs and generate one tarball per database
    for db in $DBS
    do
        FILE="$LOCAL_TARBALLS/$TARBALL/mysqldump_${db}-$SNAPSHOT_DATE.sql"
        $MYSQLDUMP -u $MUSER -h $MHOST -p$MPASS "$db" > "$FILE" && $GZIP $GZ_COMPRESSION_LEVEL "$FILE"
    done
    # Or generate ONE dump for all databases:
    # FILE="$LOCAL_TARBALLS/$TARBALL/mysql_dump.$SNAPSHOT_DATE.sql"
    # $MYSQLDUMP -u $MUSER -h $MHOST -p$MPASS --all-databases > "$FILE" && $GZIP -9 "$FILE"
else
    echo "MySQL: no databases were backed up (no or wrong credentials were entered)"
fi

You will have to adapt some of the values in the script; it's given as a hint.

The same applies to your data (/home and so on): rsync your backup tarballs to another server (rsync runs over SSH). These rsync jobs can even be managed by cron, so that the backups are done regularly; see the sketch below.
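For example, a minimal sketch (the host names, paths, and script name are placeholders to adapt):

# push the local dump tarballs to a remote backup host over SSH
rsync -avz -e ssh /tmp/mysql/ backupuser@backuphost.example.com:/backups/mysql/

# same idea for your data directories
rsync -avz -e ssh /home/ backupuser@backuphost.example.com:/backups/home/

# cron entry (crontab -e) to run a backup script nightly at 03:00
0 3 * * * /usr/local/bin/backup.sh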

Hope it helps.

hornetbzz

You can back up the temporary mount point of your server using tar over SSH:

ssh youraccount@yourremoteserver "tar jcfp - /livemntpoint" > mybackup.tar.bz2

The p (preserve) flag lets you keep the same userid/groupid, in case you want to keep a similar setup of users and permissions between the two systems.
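Restoring on the new system might then look like this (assuming the archive was created as above; the target path is a placeholder):

# run as root so ownership (userid/groupid) is restored as well
tar jxpf mybackup.tar.bz2 -C /newmountpoint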


rsync is the best way to back up your files. If something crashes, just start rsync again and it will continue from the last backed-up file instead of starting over from the beginning.

But before that, you have to back up your database :)
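As a rough sketch (credentials and the backup host are placeholders, and this assumes MySQL is actually running so mysqldump can connect):

# dump all databases to a single file first
mysqldump -u root -p --all-databases > /root/alldbs.sql

# then sync the dump together with your files; --partial keeps
# partially transferred files so an interrupted run can resume
rsync -avz --partial -e ssh /root/alldbs.sql /home /etc backupuser@backuphost.example.com:/backups/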

Blagomir

There are many ways to back up your web files and your database. Personally, I would recommend imaging your server instance at a good state, assuming your hosting provider offers that feature. In addition, have a look at duplicity, which lets you back up and encrypt your files. For storage targets, local file storage, scp/ssh, ftp, rsync, HSI, WebDAV, Tahoe-LAFS, and Amazon S3 are supported, among others.
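As a minimal sketch of what that might look like (the backup host and paths are placeholders; duplicity encrypts with GPG by default and will prompt for a passphrase):

# incremental, encrypted backup of /home to a remote host over SFTP
duplicity /home sftp://backupuser@backuphost.example.com//backups/home

# restore the latest backup to a local directory
duplicity restore sftp://backupuser@backuphost.example.com//backups/home /tmp/restored-home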

I recently implemented this for a client using Jenkins (master and slave) to manage the job, with notifications on failure. I prefer this method, as it allows me to manage the backups as jobs rather than using cron on each individual server. If you ever need help with an implementation like that, let me know.