I have a Django project and I need a workflow in which my git folder is separated from my server.
In other words: I am editing files in
~/git/web-framework
I am running the server (./manage.py runserver) from
~/srv/web-framework
Currently, I use rsync to update the server folder whenever I change the source code, but it takes very long. I think this could be done faster by checking which files were modified with git and then copying only those files to the server location. Could someone help me with a shell script that does this? I am running Ubuntu. Or does someone know another, faster solution?
Extra information: I am editing all files from my Windows host machine, while Ubuntu in a VirtualBox VM runs the server at the same time. If I use a shared folder, the whole server runs incredibly slowly (sharing a folder with VirtualBox just makes it incredibly slow). I do want to be able to keep using git commands inside this folder from both Windows and Ubuntu, though.
rsync usually takes up to two minutes:
real 1m34.494s
user 0m0.218s
sys 0m15.264s
Furthermore, it takes this long even when ALL files are up to date. With the option -vv, this might be some relevant information:
total: matches=0 hash_hits=0 false_alarms=0 data=0
sent 1,054,073 bytes received 3,441,985 bytes 42,616.66 bytes/sec
total size is 336,117,027 speedup is 74.76
With the option -v, the output shows fewer operations (still with no changes in files):
sent 939,612 bytes received 11 bytes 9,838.98 bytes/sec
total size is 336,117,027 speedup is 357.71
During the run, it reports a lot of skipped *.pyc files (because I exclude those) and many "file x is uptodate" messages.