My service needs some large files (~100-500 MB) while it is running. These files might change once in a while, and I don't mind rebuilding my container and re-deploying it when that happens.
I'm wondering what the best way is to store them and use them during the build, so that anyone on the team can update the files and rebuild the container.
My best idea so far is to store these large files in Git LFS, with a different branch for each version, so that I can add this to my Dockerfile:
RUN git clone -b 'version_2.0' --single-branch --depth 1 https://...git.git
This way, if these large files change, I just need to change version_2.0 in the Dockerfile and rebuild.
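To make the idea concrete, the build step might look roughly like this (the base image, repo URL, and target directory are placeholders, not a real setup):

FROM python:3.10-slim

# git and git-lfs must be present at build time, or the clone only fetches LFS pointer files
RUN apt-get update && apt-get install -y --no-install-recommends git git-lfs \
    && rm -rf /var/lib/apt/lists/*

# Shallow, single-branch clone so only the pinned version of the weights is fetched
RUN git lfs install \
    && git clone -b 'version_2.0' --single-branch --depth 1 \
       https://example.com/my-org/model-weights.git /opt/weights

One thing to watch: the clone leaves a .git directory in the layer, so deleting it afterwards (or using a multi-stage build and copying only the weights) keeps the image smaller.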
Is there any other recommended way? I also considered storing these files in Dropbox and just fetching them with a direct link using wget during the build.
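For the download approach, something like this is what I had in mind (the URL and checksum are invented; pinning a checksum in the Dockerfile at least guards against a silently changed or truncated download):

# wget is assumed to be available in the base image;
# the sha256 below is a placeholder, replace it with the real hash of the file
RUN mkdir -p /opt/weights \
    && wget -O /opt/weights/model_v2.bin "https://www.dropbox.com/s/abc123/model_v2.bin?dl=1" \
    && echo "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855  /opt/weights/model_v2.bin" \
       | sha256sum -c -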
P.S. - These large files are the weights of a deep network.
Edit - The question is: what is a reasonable way to store large files in a Docker image, such that one developer/team can change a file and the matching code, have the change documented (in git), and have the result easily used and even deployed by another team? (For this reason, just keeping the large files on a local PC is bad, because they would need to be sent to the other team.)
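To illustrate the workflow I'm after: a small, git-tracked "lock" file next to the code would let another team rebuild the exact same image (the file name, URL, and hash here are all invented):

# weights.lock, committed to git next to the code, a single line: "<url> <sha256>"
# e.g. https://storage.example.com/models/model_v2.bin e3b0c442...7852b855

# Dockerfile fragment that consumes it:
COPY weights.lock /tmp/weights.lock
RUN url=$(cut -d' ' -f1 /tmp/weights.lock) \
    && sum=$(cut -d' ' -f2 /tmp/weights.lock) \
    && mkdir -p /opt/weights \
    && wget -O /opt/weights/model.bin "$url" \
    && echo "$sum  /opt/weights/model.bin" | sha256sum -c -

Updating the weights would then be an ordinary commit that changes weights.lock together with the matching code.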