I have a customer running my app along with a local copy of its .git repository, which she uses to revert changes, test patches, etc. But there is no simple way for her to pull from a remote with the latest changes, due to firewall and other rules. So I thought, "just let her download a new, full .git repository, zipped up". But the repository is a big mama to move around comfortably, so I would prefer to send her a zip of just the incremental changes (objects, refs, etc.) to be decompressed inside her local .git dir.
I've tried the following workflow (automated from within my software), based on the file timestamps:
- get the most recent timestamp in her .git/objects directory
- create a zip from my .git dir containing only the files modified on or after that timestamp
- email zip file
- decompress the zip inside the customer's .git (it doesn't matter if some files are overwritten)
- run `git checkout -f master` (or whatever branch/tag she needs)
- run `git clean -d -f`
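For reference, the sender-side step could be sketched like this (the helper name and cutoff handling are my own illustration, not exact code from my software; `-newermt` requires GNU or BSD find):

```shell
#!/bin/sh
# make_incremental_zip REPO CUTOFF OUT packs every file under REPO/.git
# modified after CUTOFF (any date string accepted by find's -newermt)
# into the zip OUT, keeping paths relative to the repo root so the
# archive can be unzipped directly inside the customer's .git dir.
make_incremental_zip() {
    repo=$1 cutoff=$2 out=$3
    ( cd "$repo" &&
      find .git -type f -newermt "$cutoff" -print | zip -q "$out" -@ )
}
```

Here CUTOFF would be the newest mtime reported from the customer's .git/objects directory, e.g. `make_incremental_zip /path/to/my/repo "2024-01-01 00:00:00" /tmp/incremental.zip` (use an absolute output path, since the function cd's into the repo).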
It works fine, since her working tree and .git directory are never modified locally, only checked out from one branch to a tag or whatever. But file timestamps are definitely unreliable, i.e. they may all get touched by some backup job or whatever, so I'm looking for a more elegant way to create this incremental zip using a commit-ish of some sort.
So, how could I programmatically create an incremental zip file of my .git directory, given that I have information from the customer's local .git? I guess this is very similar to what git does internally when pulling from a remote incrementally. What's the algorithm for selecting the correct files? How do I determine the baseline commit in the customer's repository from which to build the incremental zip?