
How can I incrementally backup a maildir to a cifs share without creating a full image each time?

I have dovecot running with some large mail accounts. The only backup available is on a Windows server.

I would prefer a solution somewhere between using rsync on the whole directory structure and creating an archive of each mail account (or even the whole dovecot mail directory).

I have considered creating a script to traverse the directory structure and create an archive of each folder if there have been any updates according to timestamps. If necessary, I could probably base64-encode all the directory names. I don't need to preserve any user permissions, as the whole mail directory is owned by the vmail user. Are there better solutions? Do I also have to consider any maildir locking?
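Roughly, the script I have in mind would look something like this. It's only a sketch: the function name is made up, the paths are placeholders, and it doesn't handle directory names containing newlines.

```shell
# backup_changed_folders SRC DEST — archive each top-level folder under SRC
# that contains files newer than DEST/.last-backup (hypothetical helper;
# adapt depth/paths to the real maildir layout)
backup_changed_folders() {
    src=$1; dest=$2
    stamp="$dest/.last-backup"
    find "$src" -mindepth 1 -maxdepth 1 -type d | while read -r dir; do
        # archive on first run, or when anything in the folder is newer
        # than the previous run's stamp file
        if [ ! -f "$stamp" ] || [ -n "$(find "$dir" -newer "$stamp" -print -quit)" ]; then
            # base64-encode the folder name so it is safe as a flat file name
            # (strip wrapping newlines, map '/' and '+' to filename-safe chars)
            name=$(printf '%s' "${dir#"$src"/}" | base64 | tr -d '\n' | tr '/+' '_-')
            tar -czf "$dest/$name.tar.gz" -C "$src" "${dir#"$src"/}"
        fi
    done
    touch "$stamp"
}
```

Note the stamp is only touched after the run, so mail arriving mid-run would be picked up on the next pass.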

Rob

2 Answers


Not sure you will find a better solution than mounting the Windows share and running rsync. Is the issue that all files are being re-transferred every time?

With the correct rsync options that should only re-transfer files that have changed, and can delete any files in the backup that are no longer in the source.

Maildir doesn't require file locking for reads, so rsync doesn't have to worry about locking the source files.

Corvar
  • I have considered rsync, and I may default back to it for this application. Normally it just transfers the changed data, but for some reason does a full backup when DST changes. – Rob Jun 29 '15 at 14:21
  • I have heard that NTFS doesn't handle large quantities of small files well though, so I was trying to avoid putting extra load on the already busy NAS. Also, because it is email, I was thinking of tar.xz and running gpg on each directory before backup: this is not critical if there is already a pre-existing ideal solution, but would be a bonus feature. – Rob Jun 29 '15 at 14:27
  • I would say that the change of DST would be "fix"able using the --modify-window. As for large quantities of small files, it seems like most filesystems start suffering when there are a very large number of small files in a single directory. If you wanted to do per-directory tarballs, you could use the incremental feature of tar. But I haven't used that much. – Corvar Jun 29 '15 at 14:40
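The incremental feature of GNU tar mentioned above works via a snapshot file that records directory state between runs. A minimal sketch (function name and labels are illustrative, not from the thread):

```shell
# incremental_tar SRC DEST LABEL — one GNU tar incremental run; the shared
# snapshot file DEST/state.snar makes each run relative to the previous one
incremental_tar() {
    src=$1; dest=$2; label=$3
    # first run (no snapshot yet) produces a full archive; later runs
    # store only files that are new or changed since the snapshot update
    tar -czf "$dest/$label.tar.gz" \
        --listed-incremental="$dest/state.snar" \
        -C "$src" .
}
```

Usage would be `incremental_tar /var/vmail /mnt/backup full` once, then `incremental_tar /var/vmail /mnt/backup incr-$(date +%F)` on each subsequent night.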

Using rdiff-backup would save you a fair bit of scripting, and it would be hard to match its efficiency with a homegrown solution. You would get an incremental archive of changes to the file system you back up, on which you could run cleanup jobs to discard increments older than a given age.

If you can run an rdiff-backup server process directly on the Windows host, that would save the most disk space, but you would not get a backup of the backup.

Alternatively, you could run rdiff-backup to a local backup directory on your dovecot server and mirror its content using rsync from the dovecot side or robocopy from Windows. All of this assumes you can connect reliably, for instance over CIFS/Samba.

ErikE
  • I have looked at rdiff-backup in the past, but I normally just use rsync to keep the current backup up-to-date: it is good to have a reminder of it though as I do have other applications where this would be useful. However, in this instance I suppose a little clarification is in order: since the backup drive is on a NAS and is already replicated and incrementally backed up I don't need an incremental backup. I also don't want to impose any additional load on the already busy nightly imaging/backup. – Rob Jun 29 '15 at 14:14