
Hi guys, is there any backup software that can take periodic backups of online website folders and store them offline on a local system? I need something robust, and it would be nice if there's something free that can do the job :)

Something I could just install and let run on a periodic basis. My local system is running Windows XP.

Ali

7 Answers


Wget, Unison, Rsync or you could even try something with version control software.

I use them all, depending on context. Wget is nice for mirroring a web site via FTP, Unison is nice for spreading modifications to the same data across different places (and lets you refine the hierarchy between them), rsync synchronises well between two places, and Subversion gives you a total backup solution that lets you go back to the beginning and trace every modification of every file line by line (not recommended for non-text files, although for some it won't make much difference - I use it on a completely chaotic company intranet).
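As a very rough sketch (hostname, credentials and paths below are only placeholders, and running rsync on XP means something like Cygwin or cwRsync):

    REM Mirror the site over FTP with wget
    wget --mirror --user=backupuser --password=secret ftp://www.example.com/public_html/ -P C:\backups\site

    REM Or, if the server allows SSH access, pull only the changes with rsync
    rsync -avz --delete user@www.example.com:/var/www/site/ /cygdrive/c/backups/site/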

No "easy" way though, at least if you want something thrust-worthy and efficient.

Berzemus
  • I think the Unison link is http://en.wikipedia.org/wiki/Unison_(file_synchronizer) – OtherDevOpsGene Jun 08 '09 at 12:06
  • I don't think it's possible to link to that particular page as the link is truncated slightly - sort it out serverfault! http://en.wikipedia.org/wiki/Unison_(file_synchronizer) – Andy Jun 08 '09 at 13:50

You can try the Windows version of Bacula.

Alakdae

Similar to Unison but superior IMHO: how about using Dropbox (Linux, Windows and Mac clients available) and symlinking your web directory (excluding logs) into your Dropbox folder?

Alternatively, instead of running a "hot copy" backup, you could periodically compress all the relevant web folders into an archive placed in the Dropbox directory, which would then be backed up online for you automatically (and downloaded to any other machines you have "dropbox"-ed). Then put a batch file in your XP startup folder that copies or moves all backup archives into another directory on your machine (n.b. if you move a file out of the Dropbox folder on one machine it is removed from the online "folder", and the other clients then synchronise and remove the file too - there is still undelete functionality online if required).
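As a rough sketch of that startup batch file (every path here is only an example - adjust the Dropbox location and archive folder to suit):

    REM move-backups.bat - run from the XP Startup folder
    REM Move any archives Dropbox has pulled down into a separate local archive folder
    move "C:\Documents and Settings\%USERNAME%\My Documents\My Dropbox\site-backups\*.zip" "D:\website-archives\"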

If you're not familiar with Dropbox, have a look at its feature list or the introductory screencast.

Andy

I would recommend using wget.exe, run as a scheduled task, to download the complete website to a local folder.
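A rough sketch, with the URL, folder and schedule purely as examples (note that crawling over HTTP only picks up content that is actually linked from your pages):

    REM site-backup.bat - grab the public site into a local folder
    wget --mirror --no-parent -P C:\backups\site http://www.example.com/

    REM Register it as a nightly task on XP (schtasks wants HH:MM:SS here)
    schtasks /create /tn "SiteBackup" /tr "C:\scripts\site-backup.bat" /sc daily /st 02:00:00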


For example: you can output your database backups to a dedicated folder, then write a simple script to FTP the contents to your local server and move the transferred files to an archive folder on the remote server.
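Something along these lines, using the ftp.exe client that ships with XP, could do the download side (host, credentials, folders and file pattern are all placeholders; ftp.exe can't recurse, so this assumes the backups sit in a single remote folder):

    REM fetch-db-backups.bat - rough sketch
    echo open www.example.com> ftpcmds.txt
    echo backupuser>> ftpcmds.txt
    echo secret>> ftpcmds.txt
    echo binary>> ftpcmds.txt
    echo lcd C:\backups\db>> ftpcmds.txt
    echo cd /backups/outgoing>> ftpcmds.txt
    echo mget *.sql.gz>> ftpcmds.txt
    echo quit>> ftpcmds.txt
    ftp -i -s:ftpcmds.txt

Moving the transferred files into an archive folder on the remote server would need individual rename commands in the script, or a small cron job on the server itself.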

Maxwell

If you also need "invisible" content backed up (files that aren't linked from any page), you could use rsync to synchronise your local backup with the online content.
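For example (host and paths are placeholders; on XP this again means an rsync port such as cwRsync or Cygwin's rsync):

    REM Sync the whole web root, logs excluded, to a local folder
    rsync -avz --delete --exclude "logs/" user@www.example.com:/var/www/site/ /cygdrive/c/backups/site/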


I think you're approaching it from the wrong side. I can see needing to bring down a copy of a database, but why the files? You should already have a copy of your published content locally in your development environment, preferably in a source control system.

If your application creates new files, or you just have to work on the live version, then there are a number of FTP programs that will do scheduled runs using a script. FTP Voyager, for example, will allow you to create a script to download your root FTP folder to a local folder, then schedule it daily. Granted, it's not free, but it is inexpensive and will get the job done.

Justin Scott