
I have somehow hit GitHub's data quota even though my single remaining large file (138 MB) is now being tracked by Git LFS (GitHub's Large File Storage). I reset to the last commit that went through and made sure that LFS was tracking the only (to my knowledge) problem file. I still get the following error and have no idea what to do.

```
batch response: This repository is over its data quota. Purchase more data packs to restore access.
Uploading LFS objects:   0% (0/1), 0 B | 0 B/s, done
error: failed to push some refs to <repo name>
```

I have used LFS to store multiple large files before with success. I don't know what could be causing this issue.
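
For reference, here's a minimal sketch of the commands I ran, assuming the branch is `master` and using a placeholder filename (the real path differs):

```bash
# roll back to the last commit that pushed successfully
git reset --hard origin/master

# make sure the 138 MB file is handled by Git LFS
git lfs track "data/large-file.bin"
git add .gitattributes data/large-file.bin
git commit -m "Track large file with Git LFS"

# confirm LFS is tracking it, then push again
git lfs ls-files
git push origin master
```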

  • Have you tried contacting [Github's support](https://github.com/contact)? – shim Jun 26 '18 at 21:26
  • I contacted them, I'll update this Q when I get a reply. Thanks for the tip. – Robby Costales Jun 28 '18 at 19:00
  • So what did support say? – Wok Jun 01 '19 at 15:58
  • I had another repository that was using a lot of LFS storage. I deleted the files on my end, but for whatever reason, I was still encountering the error. I emailed GitHub and one of their representatives told me that they had to delete the files on their end (or something, still confused). Once they dealt with it for me, the error was resolved. – Robby Costales Jun 06 '19 at 18:22
  • See also "[How to check the size of each repositories using Git-LFS?](https://stackoverflow.com/a/69046249/6309)" – VonC Sep 03 '21 at 14:09

1 Answer


To be honest, I find the storage terms & conditions on the remote server really weird. GitHub "tracks" your Large File on their infrastructure, with a limit of 1 GB for individuals holding a GitHub Free account.

But there's a caveat: say you initialized your local repository with Git LFS and pushed the 138 MB file to the LFS server, all good & dandy. Now if, for some reason or other, you decide to change some aspect of the file, even if it's a mere one-byte change, Git LFS stores it as a totally new object. Hence, over time, as you keep changing the Large File, it takes up more and more of that precious storage space on the servers.
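
You can see this piling up with `git lfs ls-files --all`, which lists every LFS object reachable from the repository's history rather than just the current checkout (the OIDs and filename below are made up for illustration):

```bash
# every LFS object in the history; each edit of the file shows up as a separate OID
git lfs ls-files --all
# 4d7a2146f1 * big-file.bin    <- the original version
# 9be03c5a22 * big-file.bin    <- the "1 byte" edit, stored as a whole new object
```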

Weirdly enough, not even the official documentation mentions any way to remove previous versions of the Large File to free up precious space.

Here's an example, from the documentation linked below, detailing exactly what I mean:

> If you push a 500 MB file to Git LFS, you'll use 500 MB of your allotted storage and none of your bandwidth. If you make a 1 byte change and push the file again, you'll use another 500 MB of storage and no bandwidth, bringing your total usage for these two pushes to 1 GB of storage and zero bandwidth.

You can read more in the official documentation: *Tracking storage and bandwidth use*.
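
If you want a rough idea of how much LFS data a repository is already dragging along, one sketch (it only sums objects you have fetched locally, so treat it as a lower bound) is:

```bash
# sizes of the LFS objects referenced anywhere in the repository's history
git lfs ls-files --all --size

# rough local total: everything sitting in the local LFS object store
du -sh .git/lfs/objects
```

The number GitHub actually bills against, though, is the usage shown in your account's billing settings.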

Jarmos