I've read quite a bit about the downsides of storing large binary files in Git. The usual recommendation is to use Git LFS or a similar tool, but most of the cautions seem to concern files that change from time to time.
Are there any downsides to storing large binary files in a Git repo when the files will almost never change?
I'm talking about a 3-5 GB repository holding mostly image data that serves as input to unit tests, so it has to stay the same.
The typical use case is that a developer does a one-time pull of the repo and then almost never needs to pull again. A developer adding a new unit test might add a few more images, but that's it. This is how the data has been stored for the past 5 years in our SVN repo, and it works fairly well. If I move it to Git, is there something specific about Git that would make this work significantly worse?
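For concreteness, here's a sketch of the workflow I have in mind, run against a throwaway local bare repo standing in for our real server (all paths, URLs, and file names are placeholders, and the `--depth 1` shallow clone is just one option I'm considering since history barely matters here):

```shell
set -e
tmp=$(mktemp -d)

# Stand-in for the hosted repository.
git init --bare -q "$tmp/test-images.git"

# Seed it with an initial commit so there is something to clone.
git clone -q "$tmp/test-images.git" "$tmp/seed" 2>/dev/null
cd "$tmp/seed"
git -c user.email=dev@example.com -c user.name=dev \
    commit -q --allow-empty -m "initial"
git push -q origin HEAD

# A developer's one-time pull; file:// is needed for --depth to
# take effect on a local path.
git clone -q --depth 1 "file://$tmp/test-images.git" "$tmp/work"

# Occasionally, a developer adds images for a new unit test.
cd "$tmp/work"
mkdir -p images
printf 'fake image bytes' > images/new-test.png  # stand-in for real image data
git add images/
git -c user.email=dev@example.com -c user.name=dev \
    commit -q -m "Add images for new unit test"
git push -q origin HEAD
```

After that, nobody touches the existing images again; the repo only ever grows by whatever a new test adds.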
(I'm not in love with this model, but it would sure make my cloud builds a lot easier...)