
I have a VPS with the Kloxo control panel, and I want a database backup of one of my websites every 2 hours. How can I do this? Kloxo doesn't support hourly backups; it only supports daily, weekly and monthly.

Configuration: CentOS + Apache + PHP + MysqlAdmin

Fcoder

[Administration panels are off topic](http://serverfault.com/help/on-topic). [Even the presence of an administration panel on a system](http://meta.serverfault.com/q/6538/118258) puts it off topic here, because they [take over the systems in strange and non-standard ways, making it difficult or even impossible for actual system administrators to manage the servers normally](http://meta.serverfault.com/a/3924/118258), and they tend to indicate low-quality questions from *users* with insufficient knowledge for this site. – HopelessN00b Apr 09 '15 at 18:53

3 Answers


Write a small script like:

#!/bin/bash
#
# Do the backup
# Replace <dbusername> and <dbpassword> with your MySQL credentials.
#
CURTIME=$(date "+%Y%m%d-%H%M")

mysqldump --user=<dbusername> --password=<dbpassword> --all-databases | lzma -c -9 -z > /backup/db-${CURTIME}.dump.lzma

and put it in the crontab. (It would be easier to run it hourly: then you could just drop the script into /etc/cron.hourly.)
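
For the two-hour schedule asked about in the question, a minimal crontab entry could look like this. The path /usr/local/bin/db-backup.sh is only an assumed location for the script above; adjust it to wherever you actually save (and make executable) the file:

# Run the dump at minute 0 of every second hour
0 */2 * * * /usr/local/bin/db-backup.sh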

Andreas Florath

Here is the bash script I have running on my webserver. It has served me well for some time now.

It includes a part at the end to remove old backups. You can specify the number of files you would like to keep in the variables section. You have to uncomment that part of the script to get it to run.

#!/bin/sh

#################################################################
#  Define your variables here:
#################################################################

FILESTOKEEP=7
BACKUP_DIR=/home/user/backups
BMYSQL_USER=mysql_user
BMYSQL_PWD=mypassword

DATE=$(date +"%m-%d-%Y")_$(date +"%T")

BMYSQL_HOST=localhost
BMYSQL_DBNAME=--all-databases
BMYSQL_DBFILENAME=MYSQL_BACKUP_$DATE


#################################################################
#  Make sure output directory exists.
#################################################################

        if [ ! -d "$BACKUP_DIR" ]; then
            mkdir -p "$BACKUP_DIR"
        fi


#################################################################
#  Create backup
#################################################################

        mysqldump --host=$BMYSQL_HOST --user=$BMYSQL_USER --password=$BMYSQL_PWD $BMYSQL_DBNAME | gzip > "$BACKUP_DIR/$BMYSQL_DBFILENAME.gz"

#################################################################
#  Remove old backups 
#  - this will list files according to date (DESC)
#  - skip the first few files (FILESTOKEEP)
#  - remove all files past that
#  NOTE: Make sure not to save the backups into a directory
#  that contains any files other than these backups.
#
#  Uncomment when you are confident in rest of setup
#################################################################

#      cd "$BACKUP_DIR"
#      ls -t1 | tail -n +$(($FILESTOKEEP+1)) | xargs -r rm

I like saving my scripts in a directory under my user home. This may not be standard practice, but it sure makes them easy to find and edit later on (read: a couple of years later). I would save this script as:

/home/user/scripts/hourly_backup_mysql.sh
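
Assuming that location, remember to make the script executable before cron tries to run it:

chmod +x /home/user/scripts/hourly_backup_mysql.sh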

To run it from cron every two hours, open a terminal and type (as root):

crontab -e

In there, enter a new line like this:

0 0-23/2 * * * /home/user/scripts/hourly_backup_mysql.sh

(If you are unfamiliar with vim: press 'i' to enter Insert mode, move to the end of the file with the arrow keys and press Enter for a new line, then paste or type the entry. When you are done, press ESC to leave Insert mode and type ':wq' to write the file and quit.)
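
On most cron implementations (including the one shipped with CentOS) the step syntax is an equivalent, slightly shorter way to write the same schedule:

0 */2 * * * /home/user/scripts/hourly_backup_mysql.sh

You can confirm the entry was saved by listing your crontab:

crontab -l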

This will now run every 2 hours and, if you uncommented the remove section, will also clean up old backups.
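
If you would rather prune by age than by file count, a small variation on that commented-out block (assuming the same BACKUP_DIR and that nothing else is stored in it) could be:

# Delete backup files older than 7 days
find "$BACKUP_DIR" -name 'MYSQL_BACKUP_*.gz' -type f -mtime +7 -delete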

Cheers!

G Trawo

Per-table database dump, gzipped and rsynced somewhere else:

#!/usr/bin/ruby
require 'mysql'

# List of [host, destination path] pairs to rsync the dumps to
rsyncTargets = [
        ["files1" , "/path/to/backup/dir/"],
        ["files2", "/path/to/backup/dir/"],
        ["files3", "/path/to/backup/dir/"]
          ]

tempDir = "/mnt/extra-space"
dumpUser = "root"

# Clear out dumps left over from the previous run
`rm -r #{tempDir}/*`

# Connect to the local MySQL server as the dump user (no password configured here)
con = Mysql.real_connect('localhost', dumpUser, '', '')

# Dump every table of every database into its own gzipped file
con.query('show databases').each do |db|
  print " Dumping - " + db[0] + "\n"
  Dir.mkdir("#{tempDir}/#{db[0]}")
  con.query("use #{db[0]}")
  con.query('show tables').each do |table|
    `mysqldump -u #{dumpUser} #{db[0]} #{table[0]} | gzip > #{tempDir}/#{db[0]}/#{table[0]}.sql.gz`
  end
end
con.close()

# Push the dumps to each target, removing remote files that no longer exist locally
rsyncTargets.each do |server|
  `rsync -av --delete /mnt/extra-space/ rsync://#{server[0]}:#{server[1]}`
end

Taken from: http://techhelplist.com/index.php/tech-tutorials/42-databases/80-ruby-script-for-mysql-dump-gzip-cron-jobs
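
Because every table ends up in its own gzipped file, restoring a single table is straightforward. A sketch, using hypothetical database and table names:

# Restore one table into the mydb database on the local server
gunzip < /path/to/backup/dir/mydb/mytable.sql.gz | mysql -u root mydb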

user145837