
I have an Ubuntu 12.04 server. The server runs rsyncd (the rsync daemon) and allows an external process to rsync files into a given path, with full control of the file structure under that path.

These files will ultimately be used by multiple environments, in a staged fashion:

  • immediate/live access for "dev", updates multiple times per day
  • frozen copy of "dev" for "stage", updates manually, generally every couple of weeks
  • frozen copy of "stage" for "prod", updates manually, generally every month or so

I'm trying to work out the best process for versioning these files for each "frozen" release. These are binary files, so something like Git isn't ideal. I've heard that hard-linking could be useful in this situation, but I'm not sure whether it fully applies or where to begin looking.

A full solution would be appreciated; failing that, links or documentation pointing to existing software or a general approach would be fine.

EDIT: Just a note: while the full directory tree could grow to several gigabytes, the changes are generally file additions, with occasional (rare) file updates.

Jon L.
1 Answer


I think that Dirvish is what you are looking for. Take a look at rdiff-backup too.
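
Under the hood, Dirvish builds hard-link snapshots with rsync. If you want to see the underlying technique (or roll your own), here is a minimal sketch using rsync's `--link-dest` option; the paths (`/srv/files`, `/srv/snapshots`) are assumptions for illustration, not from your setup:

```bash
#!/bin/sh
# Hypothetical paths -- adjust to your layout.
SRC=/srv/files                      # tree populated by your rsyncd clients
SNAPDIR=/srv/snapshots
NEW="$SNAPDIR/$(date +%Y-%m-%d)"    # one snapshot per "frozen" release

# Most recent existing snapshot (empty if this is the first run).
PREV=$(ls -1d "$SNAPDIR"/* 2>/dev/null | tail -1)

# --link-dest makes files that are unchanged relative to the previous
# snapshot into hard links, so each new snapshot costs disk space only
# for added or modified files -- a good fit for a mostly-additive tree.
if [ -n "$PREV" ]; then
    rsync -a --link-dest="$PREV" "$SRC/" "$NEW/"
else
    rsync -a "$SRC/" "$NEW/"
fi
```

Each directory under `/srv/snapshots` then looks like a complete, independent copy that you can point an environment at, while the shared file data is stored once.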

Mircea Vutcovici
  • How about a solution for environments hosted on the same box, while avoiding duplication of files? (That's why I was thinking of hard links.) Two environments are hosted on one machine; two more are each on separate machines. I'm even contemplating hosting it all from a single box or network share, again using some type of versioning to serve each environment. – Jon L. Mar 30 '13 at 23:49
  • Dirvish uses hard links. If you want a deduplicating filesystem, you can look at `lessfs` and `opendedup`. – Mircea Vutcovici Mar 31 '13 at 02:44
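
For the same-box case raised in the comments, the hard-link idea also works on its own, without extra software: you can "freeze" dev into stage (and stage into prod) by hard-linking the tree rather than copying it. A minimal sketch, with hypothetical `/srv` paths:

```bash
# Freeze the current dev tree into stage without duplicating file data.
# cp -a copies the directory structure; -l hard-links the files, so
# only directory/inode metadata is written, not file contents.
rm -rf /srv/stage.new
cp -al /srv/dev /srv/stage.new

# Swap the new frozen tree into place.
mv /srv/stage /srv/stage.old && mv /srv/stage.new /srv/stage
rm -rf /srv/stage.old
```

One caveat worth knowing: hard-linked files share data, so an in-place modification in dev would also change stage. With rsync-driven updates this is usually safe, because rsync by default (without `--inplace`) writes changes to a temporary file and renames it into place, which breaks the hard link and leaves the frozen copy intact.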