I have a bunch of local copies of dev/production websites. Each copy contains a "files" directory holding files uploaded by site users. Currently I use rsync (over SSH) to synchronize these directories' contents from the remote servers.

There are some annoyances:

  1. I have to run rsync manually each time I want fresh files (this could be automated, of course, but since I have a lot of website copies, it's not a good idea).
  2. Each rsync run takes some time.
  3. Disk space on my laptop is running out.

I think all of this could be solved by some kind of software that works like a proxy:

  1. When I list files, it requests the file list from the remote server and caches the results for some (configurable) time.
  2. When I request a file's contents for the first time, it retrieves the remote file and saves it locally.
  3. When I update a file, it only gets updated locally.
  4. When I save a new file in the "files" directory, it does not go to the remote server.

Of course, the logic of such software would have to be more complex, but I hope the idea is clear: don't waste disk space, download files on demand, make no remote changes. A rough sketch of what I mean follows.
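A minimal Python sketch of that logic, assuming an SSH-accessible server with key-based login and using the paramiko SFTP client (the class and all hosts, users, and paths here are illustrative, not an existing tool):

```python
import os
import posixpath
import time

import paramiko  # assumed SFTP client; any SSH library would do


class LazyFileCache:
    """Fetch remote files on first access; never push anything back."""

    def __init__(self, host, user, remote_root, local_root, list_ttl=300):
        self.remote_root = remote_root
        self.local_root = local_root
        self.list_ttl = list_ttl      # seconds a cached listing stays valid
        self._listings = {}           # rel_dir -> (fetched_at, entries)
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(host, username=user)  # assumes key-based auth
        self.sftp = client.open_sftp()

    def listdir(self, rel_dir=""):
        # Point 1: ask the server for the listing, cache it for list_ttl.
        cached = self._listings.get(rel_dir)
        if cached and time.time() - cached[0] < self.list_ttl:
            return cached[1]
        entries = self.sftp.listdir(posixpath.join(self.remote_root, rel_dir))
        self._listings[rel_dir] = (time.time(), entries)
        return entries

    def path(self, rel_path):
        # Point 2: download a file only the first time it is asked for.
        # Points 3 and 4: nothing is ever uploaded, so local edits and
        # newly created local files stay local.
        local = os.path.join(self.local_root, rel_path)
        if not os.path.exists(local):
            os.makedirs(os.path.dirname(local) or ".", exist_ok=True)
            self.sftp.get(posixpath.join(self.remote_root, rel_path), local)
        return local
```

A website copy would then resolve files through cache.path(...) rather than reading a fully synced tree, so only the files it actually touches end up on disk.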

Is there any software that works like that?


2 Answers


Map a network drive with NFS or sshfs. Make local copies if you really need a file.
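For example, mounting the remote "files" directory with sshfs could look like this (a minimal sketch invoked from Python; the host, user, and paths are hypothetical):

```python
import subprocess

subprocess.run(
    [
        "sshfs",
        "deploy@prod.example.com:/var/www/site/files",  # remote directory
        "/home/me/mnt/site-files",                      # empty local mount point
    ],
    check=True,  # raise if the mount fails
)
```

Unmount with fusermount -u /home/me/mnt/site-files when done.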

  • I don't know in advance which files I need. When I work with a local website copy, the website engine retrieves files from the "files" directory as it needs them, so I can't predict which ones it will request. – Leksat Feb 06 '15 at 12:47

I did not mention it in the question, but I needed this for working with Drupal, and I have now found a Drupal-specific solution: the Stage File Proxy module.

It does exactly what I need: it downloads files from the remote server only when they are requested.
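For anyone wondering how that works: the module intercepts requests for files that are missing locally and transfers them from the origin site first. A rough illustration of the same fetch-on-demand idea (this is not the module's actual code; ORIGIN and FILES_DIR are made-up names):

```python
import os
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

ORIGIN = "https://www.example.com"   # the production site
FILES_DIR = "files"                  # local files directory

class FetchOnDemand(BaseHTTPRequestHandler):
    def do_GET(self):
        local = os.path.join(FILES_DIR, self.path.lstrip("/"))
        if not os.path.exists(local):
            # First request for this file: pull it from the origin and keep it.
            os.makedirs(os.path.dirname(local) or ".", exist_ok=True)
            urllib.request.urlretrieve(ORIGIN + self.path, local)
        with open(local, "rb") as f:
            body = f.read()
        self.send_response(200)
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("localhost", 8000), FetchOnDemand).serve_forever()
```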
