
I have a Microsoft Azure / Visual Studio Online repo managed with Git. I am using the Git GUI application to manage it.

I have a couple of files that are 535 MB and 620 MB in size. I would like to add these to the repo.

I have enabled Git large file support, and I have set the global post buffer with the command:

git config --global http.postBuffer 1048576000
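(For completeness, the value is in bytes, so this is roughly 1000 MB. The setting can be read back to confirm it took effect; a minimal sketch:)

```shell
# Set the post buffer and read it back (value in bytes, ~1000 MB here).
git config --global http.postBuffer 1048576000
git config --global --get http.postBuffer   # prints 1048576000
```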

No matter what I do, I cannot seem to add these files. The commit is fine, but when I push to the remote branch, I get:

POST git-receive-pack (547584390 bytes)
error: RPC failed; HTTP 503 curl 22 The requested URL returned error: 503
fatal: the remote end hung up unexpectedly
fatal: the remote end hung up unexpectedly
Everything up-to-date

As far as I know, adjusting the buffer like this should work in this case. What am I missing?

anti
  • It seems that you have added the large files just fine, but that your remote does not accept that amount of data. This is thus not a programming question. – gspr May 10 '20 at 16:38
  • Hi, thanks for your reply. It is a Git question, rather than a programming question. I will adjust the title to more accurately reflect what I am asking. – anti May 10 '20 at 16:54
  • “have enabled Git large file support” How? – matt May 10 '20 at 16:54
  • Thanks for your reply. I followed this: https://git-lfs.github.com/ – anti May 10 '20 at 16:55
  • Ok but that’s for GitHub. You are not using GitHub. – matt May 10 '20 at 17:27
  • 1
    @matt It works for Azure DevOps services too: https://devblogs.microsoft.com/devops/announcing-git-lfs-on-all-vso-git-repos/ – VonC May 10 '20 at 17:28
  • Consider using a universal package feed instead of putting these large files in source control. – Daniel Mann May 10 '20 at 17:29

2 Answers


Activating LFS locally (git-lfs.github.com as you mention) is a good first step.

Check also the prerequisites and limitations in the Azure DevOps documentation: Azure Repos / "Use Git Large File Storage (LFS)".

Finally, if you just added/committed the large file, it is better to reset that commit (assuming you don't have any other work in progress), and then track the file through LFS before re-committing:

git reset @~
git lfs track myLargeFile
git add .gitattributes myLargeFile
git commit -m "Add large file via LFS"
git push
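(For reference, `git lfs track` simply records a filter rule in `.gitattributes`, which must itself be committed; the entry it writes looks like the following, with `myLargeFile` standing in for your file name or pattern:)

```
myLargeFile filter=lfs diff=lfs merge=lfs -text
```

If no matching rule is present there, the push will upload the file as a regular blob and will likely hit the same size limit again.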
VonC

After fighting with Git for weeks, I have finally solved this.

The answer for me was to clone the repo again using SSH instead of HTTPS.
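(Note that re-cloning is not strictly necessary; pointing the existing clone's `origin` at the SSH URL has the same effect. A minimal sketch in a throwaway repo, where the Azure DevOps organization/project/repo names are illustrative placeholders:)

```shell
# Switch origin from HTTPS to SSH without re-cloning.
# MyOrg/MyProject/MyRepo are placeholders; substitute your own.
repo=$(mktemp -d)
git init -q "$repo"
git -C "$repo" remote add origin https://dev.azure.com/MyOrg/MyProject/_git/MyRepo
git -C "$repo" remote set-url origin git@ssh.dev.azure.com:v3/MyOrg/MyProject/MyRepo
git -C "$repo" remote get-url origin   # prints the new SSH URL
```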

anti
  • Interesting. I didn't know there was a size limitation over HTTPS only. – VonC Jun 10 '20 at 12:15
  • It may be to do with my internet connection being flaky / slow. Seems SSH is much more stable for long upload times – anti Jun 12 '20 at 09:36