We have a high-value repo that is expensive to disrupt.
It also has large XML files that cause significant problems when merging through the web UI and when browsing git log history, and they are likely making our CI/CD inefficient. In other words, the usual things that push people toward git-lfs.
We want to do this carefully, one file at a time.
I have seen approaches similar to what's listed below:
cp *.xml ~/tmp
git rm *.xml
git commit
git lfs track '*.xml'   # quote the glob so the shell passes the literal pattern to git-lfs
git add .gitattributes
git commit; git push
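One detail worth pinning down in the tracking step: unquoted, `*.xml` is expanded by the shell against whatever XML files remain in the working tree, while quoting delivers the literal pattern that git-lfs records in .gitattributes. A minimal demonstration (temp directory and file names are illustrative, not from our repo):

```shell
# Show the difference between a shell-expanded and a quoted glob.
demo=$(mktemp -d)
cd "$demo"
touch a.xml b.xml
echo "unquoted: $(echo *.xml)"    # shell expands against the working tree
echo "quoted:   $(echo '*.xml')"  # literal pattern, as git-lfs expects
```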
In a fresh directory:
git clone --mirror $remote; cd repo.git   # a mirror clone lands in repo.git, not repo
bfg --delete-files '*.xml'   # BFG protects HEAD by default; the files are already removed from it
git reflog expire --expire=now --all && git gc --prune=now --aggressive
git push
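The reflog-expire plus gc step is what actually drops the blobs the rewrite made unreachable. A toy sketch of that mechanism, independent of BFG (throwaway repo; the `git reset` stands in for the history rewrite, and the file name is illustrative):

```shell
set -e
# Throwaway repo; the identity vars just let the commits run anywhere.
export GIT_AUTHOR_NAME=t GIT_AUTHOR_EMAIL=t@t \
       GIT_COMMITTER_NAME=t GIT_COMMITTER_EMAIL=t@t
cd "$(mktemp -d)"
git init -q .
git commit -q --allow-empty -m init
printf '<root/>\n' > big.xml           # stand-in for a large XML file
blob=$(git hash-object big.xml)
git add big.xml && git commit -q -m 'add big.xml'
git reset -q --hard HEAD~1             # stand-in for the BFG rewrite
git cat-file -e "$blob" && echo "still reachable via reflog"
git reflog expire --expire=now --all
git gc -q --prune=now --aggressive
git cat-file -e "$blob" 2>/dev/null || echo "blob purged"
```

Until the reflog is expired, the blob survives gc because the reflog still reaches the dropped commit; that is why the expire step precedes the prune.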
Back in src:
mv repo repo.bloated
git clone $remote; cd repo
cp ~/tmp/*.xml .
git add *.xml # the tracked *.xml pattern now routes them through LFS
git commit; git push
How can I do something similar, but starting with just one large XML file to mitigate risk during the transition? We would prefer to stay in close contact with the developer maintaining that file, isolate the changes, and crawl here; converting hundreds of files at once could hold up developers and be expensive.
Do we just change *.xml to the specific file name in the example above?
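That is essentially my assumption: pass the one path instead of the glob at each step (cp, git rm, git lfs track, bfg). If so, after tracking a single file, the resulting .gitattributes entry would presumably look like this (the path is hypothetical):

```
data/LargeDataSet.xml filter=lfs diff=lfs merge=lfs -text
```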