Currently I have a bunch of local copies of dev/production websites. Each copy contains a "files" directory holding files uploaded by site users. I use `rsync` (over `ssh`) to synchronize the directory contents from the remote servers.
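For reference, a sync run looks roughly like this (the host and paths are made-up examples):

```sh
# pull the user-uploaded files for one site copy;
# -a preserves metadata, -z compresses, --delete mirrors remote deletions
rsync -az --delete \
    user@example.com:/var/www/mysite/files/ \
    ~/sites/mysite/files/
```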
There are some annoyances:
- I have to run `rsync` manually each time I want fresh files (this could be automated, of course, but since I have a lot of website copies, it's not a good option).
- The `rsync` execution takes some time.
- Disk space on my laptop is running out.
I think all of this could be solved by some kind of software that works like a proxy:
- When I list files, it requests the file list from the remote server and caches the results for some (configurable) time.
- When I request a file's contents for the first time, it retrieves the remote file and saves it locally.
- When I update a file, it only gets updated locally.
- When I save a new file in the "files" directory, it does not go to the remote server.
Of course, the logic of such software would have to be more complex, but I hope the idea is clear: don't waste disk space, download files on demand, and never push changes back to the remote (see the rough sketch below).
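For illustration only, a behavior close to this can be approximated by layering a writable local directory over a read-only network mount, e.g. `sshfs` plus overlayfs. The host, paths, and directory names here are hypothetical, and overlayfs over a FUSE lower layer needs a reasonably recent kernel:

```sh
# read-only remote view of the uploads; nothing is ever written upstream
sshfs -o ro user@example.com:/var/www/mysite/files /mnt/mysite-remote

# overlay mount: reads fall through to the sshfs layer on demand,
# while updates and new files land only in the local upper directory
sudo mount -t overlay overlay \
    -o lowerdir=/mnt/mysite-remote,upperdir=/home/me/mysite/upper,workdir=/home/me/mysite/work \
    /home/me/mysite/files
```

But this still doesn't cache downloaded file contents persistently or give a configurable listing-cache time, which is why I'm asking: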
Is there any software that works like that?