I have a GNU/Linux server running Apache with mod_dav. Users can upload their backups into their respective user folders.
I would like to write a scanner that walks those directories, hands the finished uploads off to a backup program (e.g. HashBackup or the like), and removes the local copies afterwards.
One problem here is concurrency. How would I best write this so it doesn't back up directories that are still (partially) uploading? E.g. /dir could be almost complete except for one 12 GB file that is still uploading.
I could just check that the last modification time is at least 30 minutes old, but large files could be skipped if they take a very long time to upload.
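For reference, a minimal sketch of that mtime idea in Python, with hypothetical helper names (`newest_mtime`, `dir_quiesced`) and a 30-minute threshold chosen as an assumption; it inherits exactly the weakness described above, since it only looks at timestamps:

```python
import os
import time

def newest_mtime(root):
    """Return the most recent modification time (epoch seconds)
    of the directory itself or any file beneath it."""
    latest = os.path.getmtime(root)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            try:
                latest = max(latest, os.path.getmtime(os.path.join(dirpath, name)))
            except FileNotFoundError:
                # File vanished between listing and stat; ignore it.
                pass
    return latest

def dir_quiesced(root, threshold_seconds=1800):
    """True if nothing under root was modified in the last
    threshold_seconds -- the naive 'safe to back up' test."""
    return (time.time() - newest_mtime(root)) >= threshold_seconds
```

Whether this misjudges an in-progress upload depends on whether the filesystem refreshes mtime while mod_dav is still writing the file, which is exactly the uncertainty with large, slow transfers.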
Any ideas or suggestions?