So I have a GNU/Linux server running Apache with mod_dav. Users can upload their backups into their respective user folders.

I would like to write a scanner that scans those directories, passes each one along to a backup program (e.g. hashbackup or the like), and removes the local copies afterwards.
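
For illustration, here is a rough sketch of the kind of scanner I have in mind. The upload root and the backup command are just placeholders (not hashbackup's actual CLI), and it does not yet deal with the concurrency problem below:

```python
#!/usr/bin/env python3
import shutil
import subprocess
from pathlib import Path

# Assumptions: one directory per user under a common upload root, and a
# backup program that can be invoked per directory. Both values below are
# placeholders.
UPLOAD_ROOT = Path("/srv/dav/uploads")
BACKUP_CMD = ["/usr/local/bin/backup-tool"]

def process_user_dir(user_dir: Path) -> None:
    """Hand one user's directory to the backup program; remove the local
    copies only if the backup command reported success."""
    result = subprocess.run(BACKUP_CMD + [str(user_dir)])
    if result.returncode != 0:
        return
    for entry in user_dir.iterdir():
        if entry.is_dir():
            shutil.rmtree(entry)
        else:
            entry.unlink()

def main() -> None:
    for user_dir in sorted(UPLOAD_ROOT.iterdir()):
        if user_dir.is_dir():
            process_user_dir(user_dir)

if __name__ == "__main__":
    main()
```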

One problem here is concurrency. How would I best write this so that it doesn't back up directories that are still (partially) uploading? E.g. /dir could be almost complete except for that one 12 GB file that is still uploading.

I could just check whether the last-modified time is more than 30 minutes old, but large files could be skipped if they take a very long time to upload.
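
To make that concrete, a minimal sketch of the last-modified check I'm describing (again with the 30-minute window as a placeholder value):

```python
import time
from pathlib import Path

# A directory counts as "settled" only when nothing inside it has been
# modified within the idle window.
IDLE_SECONDS = 30 * 60

def is_settled(directory: Path, idle_seconds: int = IDLE_SECONDS) -> bool:
    """True if no file under `directory` was modified in the last
    `idle_seconds` seconds."""
    now = time.time()
    for path in directory.rglob("*"):
        if path.is_file() and now - path.stat().st_mtime < idle_seconds:
            return False
    return True
```

The scanner sketch above would then only call process_user_dir() on directories for which is_settled() returns True, but that still leaves the problem with uploads that run longer than the window.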

Any ideas or suggestions?

Niels
  • That seems rather complex and error-prone. What advantages does the current system offer? Why do local copies need to be removed? Why aren't backup clients talking directly to the backup server software? – tgharold Apr 06 '16 at 10:45
  • @tgharold: it's actually just an S3 gateway that compresses and deduplicates for users. It's to take the complicated matters out of the users' hands and just offer them a cross-platform storage medium. – Niels Apr 08 '16 at 06:51

0 Answers