I need a script that helps me back up my network's files on the server, using a mixed strategy of incremental and differential backups. 'tar' can do something like that, but the problem is that extracting the roughly 30 GB archive is time-consuming and inefficient. I think there should be a method that writes the backed-up files and their index to two separate files, so that in case of disaster I can easily look up the required files in the index file and then extract only those.
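For example, with GNU tar I can get part of the way there, roughly like this (the paths and file names below are just placeholders), but restoring a single file still means reading through the whole archive:

    # Full backup; -v sends the file listing to stdout, which becomes the index.
    # --listed-incremental keeps a snapshot file that drives tar's incremental logic.
    tar -czvf /backups/full-$(date +%F).tar.gz \
        --listed-incremental=/backups/data.snar \
        /srv/data > /backups/full-$(date +%F).index

    # Later runs with the same snapshot file archive only changed files:
    tar -czvf /backups/incr-$(date +%F).tar.gz \
        --listed-incremental=/backups/data.snar \
        /srv/data > /backups/incr-$(date +%F).index

    # To restore, grep the index files, then extract just the matching path:
    grep 'reports/2009' /backups/*.index
    tar -xzvf /backups/full-2009-06-28.tar.gz srv/data/reports/2009/summary.ods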
Thank you all, but I need the backup software to produce the index file while it is backing up my files. Creating the index during restoration of a roughly 30 GB archive means extracting that whole 30 GB backup first and only then producing the index, which is time-consuming. Am I clear? Please help ;) – Amir Ashoori Jun 29 '09 at 07:20
6 Answers
Have you taken a look at rdiff-backup?
http://rdiff-backup.nongnu.org
rdiff-backup backs up one directory to another, possibly over a network. The target directory ends up as a copy of the source directory, but extra reverse diffs (i.e. differential backups) are stored in a special subdirectory of that target directory, so you can still recover files lost some time ago.
The idea is to combine the best features of a mirror and an incremental backup.
rdiff-backup also preserves subdirectories, hard links, device files, permissions, uid/gid ownership, modification times, extended attributes, ACLs, and resource forks.
Also, rdiff-backup can operate in a bandwidth-efficient manner over a pipe, like rsync. Thus you can use rdiff-backup and ssh to securely back up a hard drive to a remote location, and only the differences will be transmitted. Finally, rdiff-backup is easy to use, and its settings have sensible defaults.
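A minimal sketch of how that typically looks in practice (the host name and paths below are placeholders, not part of the original description):

    # Push /srv/data to a remote machine over ssh; only the differences are sent.
    rdiff-backup /srv/data backupserver::/backups/data

    # List the increments (point-in-time versions) that are available:
    rdiff-backup --list-increments backupserver::/backups/data

    # Restore a single file as it was 10 days ago, without unpacking anything else:
    rdiff-backup -r 10D backupserver::/backups/data/reports/summary.ods ./summary.ods

    # Prune history older than 30 days:
    rdiff-backup --remove-older-than 30D backupserver::/backups/data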

I use rdiff-backup for my backups at home and at work. I keep 30 days of history, and it's really great. When you delete a file by mistake, you can simply mount your backup (keep it unmounted when not needed...) and grab the file! At work I back up 20 machines with it to an NFS share; it works great, and I back up about 200 GB with it. – wazoox Jun 28 '09 at 17:43
Dirvish does what you want in terms of incremental backups, and it indexes each backup image, with a utility to search those indexes.
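For illustration, a minimal dirvish setup is roughly two small config files plus a nightly run; the bank/vault names and paths below are placeholders, and dirvish-locate is the search utility mentioned above:

    # /etc/dirvish/master.conf (excerpt, illustrative)
    bank:
        /backups
    Runall:
        data    22:00
    expire-default: +30 days

    # /backups/data/dirvish/default.conf (per-vault config, illustrative)
    client: fileserver
    tree: /srv/data
    index: gzip          # write a compressed file index for every image
    log: gzip
    image-default: %Y%m%d

    # Create the nightly images and expire old ones:
    dirvish-runall
    dirvish-expire

    # Search the stored indexes for a file across all images in the vault:
    dirvish-locate data reports/summary.ods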

Most "backup systems" (as opposed to individual backup tools like tar or zip) support indexing, including intelligent restore using that index.
As you haven't given details of your environment but have mentioned tar, I suggest you look at something like Amanda (http://www.amanda.org/). It can also handle backing up Windows systems if you need that capability.
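For what it's worth, the indexing piece in Amanda is essentially one line in the dumptype definition, and amrecover then browses that index without reading the archives; the excerpt below is illustrative, not a complete configuration:

    # /etc/amanda/DailySet1/amanda.conf (excerpt, illustrative)
    define dumptype comp-tar-index {
        program "GNUTAR"
        compress client fast
        index yes          # build a searchable file index during the backup
    }

    # /etc/amanda/DailySet1/disklist - which hosts/paths use that dumptype
    fileserver  /srv/data  comp-tar-index

    # Interactive restore: browse the index, mark files, extract only those.
    amrecover DailySet1
    #   amrecover> sethost fileserver
    #   amrecover> setdisk /srv/data
    #   amrecover> cd reports
    #   amrecover> add summary.ods
    #   amrecover> extract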

I will chime in and strongly suggest 'backup2l' as a simple solution for creating backups. It handles full and incremental backups and lets you define your own "ruler" retention policy. Its basic CLI supports locating and restoring individual files, even though the archives themselves are standard formats such as 'tar' (optionally combined with gzip/bzip2).
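A rough idea of what the configuration and day-to-day use look like (the values below are illustrative; the packaged /etc/backup2l.conf is a fully commented template):

    # /etc/backup2l.conf (excerpt, illustrative values)
    VOLNAME="fileserver"
    SRCLIST=(/etc /srv/data)
    SKIPCOND=(-name "*.tmp")
    BACKUP_DIR="/backups/backup2l"
    MAX_LEVEL=3              # the "ruler" policy: up to 3 differential levels
    MAX_PER_LEVEL=8
    CREATE_DRIVER=DRIVER_TAR_GZ

    # Run a backup; a file list is stored alongside each archive,
    # so restores don't need to scan the archives first.
    backup2l -b

    # Restore files matching a pattern, using those stored lists:
    backup2l -r 'srv/data/reports/summary.ods'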
If you want to back up a whole bunch of boxes and have a "CSR-proof!" web interface for it all, then I'd also recommend reading up on BackupPC, which is an excellent tool.

BackupPC runs on a Linux server and backs up files from Windows, Mac, and Linux machines. It can keep multiple full and incremental backups, and it gives you a browsable web interface to all versions of all files.
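For a flavour of the setup, per-host settings live in a small Perl config file; the values below are illustrative only:

    # /etc/BackupPC/pc/fileserver.pl (excerpt, illustrative)
    $Conf{XferMethod}     = 'rsync';      # smb/tar/ftp transfers also exist
    $Conf{RsyncShareName} = ['/srv/data'];
    $Conf{FullPeriod}     = 6.97;         # roughly one full backup per week
    $Conf{IncrPeriod}     = 0.97;         # incrementals on the other days
    $Conf{FullKeepCnt}    = 4;
    $Conf{IncrKeepCnt}    = 14;

Browsing and restoring individual files is then done from the web interface, which shows every backed-up version of every file.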
