
I have a file that exceeds the upper limit of 100MB for GitHub enterprise.

It was added to the repository before Git LFS support was installed.

Then I added it to Git LFS and told it to track the file:

matt@ORAC:~/dev$ git lfs ls-files
83274a0d67 * extern/cudnn/bin/libcudnn.so.7.2.1

However, when pushing I still get this error:

remote: error: GH001: Large files detected. You may want to try Git Large File Storage - https://git-lfs.github.com.
remote: error: File extern/cudnn/bin/libcudnn.so.7.2.1 is 274.67 MB; this exceeds GitHub Enterprise's file size limit of 100.00 MB

How do I fix it?

hookenz
  • 36,432
  • 45
  • 177
  • 286

2 Answers

2

Try git add .gitattributes

If that doesn't work, try

git stash
git reset HEAD~  # Or HEAD~X to go back X commits, to a point where the file didn't exist
git stash pop
git add .
git add .gitattributes
git commit -m "Msg"
git push
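After the reset-and-recommit sequence, you can verify that the large file is now stored as an LFS pointer rather than a raw blob. This is a sketch using the path from the question and assumes git-lfs is installed:

```shell
# Should list libcudnn.so.7.2.1 with its LFS object ID
git lfs ls-files

# Inspect what is actually committed at that path; a file managed by
# LFS is a small text pointer, not the 274 MB binary. Its first line is:
#   version https://git-lfs.github.com/spec/v1
git show HEAD:extern/cudnn/bin/libcudnn.so.7.2.1 | head -n 3
```

If `git show` dumps binary data instead of a pointer, the commit still contains the real file and the push will be rejected again.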

Nicholas Pipitone
  • 4,002
  • 4
  • 24
  • 39
  • This is what git lfs is for. Only I forgot to track this file at the beginning and I'm not sure how to make it track it properly. – hookenz Oct 01 '18 at 21:23
  • This might actually work but I can't verify it. I actually ended up starting over. Checking out the repository and adding it all back in correctly and now I don't have this issue. – hookenz Oct 01 '18 at 23:22
0

If you have a lot of different large files checked in before Git LFS was enabled, the most elegant way I have seen is to migrate your repo following the official git-lfs tutorial. The only downside of migration is that the commit history of the repo is rewritten, but the biggest benefit is that the repo ends up much smaller than the original, since the big files are removed from history entirely and moved to the LFS server.
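A migration along those lines can be sketched with `git lfs migrate import`. The include pattern below is an assumption based on the file path in the question; adjust it to match your own large files:

```shell
# Rewrite history so that matching files are stored in LFS instead of
# plain Git. --everything rewrites all local branches and tags.
git lfs migrate import --include="extern/cudnn/bin/*.so.*" --everything

# Confirm the files are now tracked as LFS objects throughout history.
git lfs ls-files

# History was rewritten, so the push must be forced.
git push --force-with-lease origin master
```

Because every rewritten commit gets a new hash, anyone else with a clone of the repo will need to re-clone or hard-reset after the forced push.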

The other post suggests using git reset to go back to the commit before the big files were checked in. That hack works if you have a small repo and know exactly where those big files were committed, but it is not a scalable solution for a big repo.

Zhong Hu
  • 272
  • 2
  • 5