
I want to upload a file to an existing remote repository and use the repo as file storage. The repo will be very big, so I don't want to store it on my PC, and I also don't want to clone it every time I want to upload to it.

I thought I could just force-push a commit with the new file to the remote, but that overwrites everything. And doing a fetch or pull first downloads the whole repo.

Is there any other way to do this, or would I have to reverse engineer the push feature and write a custom Git client?

Felix
  • I suspect you are going to discover that Git is not the best tool for your needs in this case. – TTT Jan 19 '23 at 17:41
  • You could push a new branch for each file, and never fetch (sketched below). That doesn't require the repo knowing [i.e. downloading] what's on the remote before pushing. Then maybe a dedicated server or CI process could merge the unrelated branches. You could also use `clone --depth=1` to limit the bandwidth needed. But in general, it's a misuse of Git. – amphetamachine Jan 19 '23 at 18:51
  • @amphetamachine I thought about pushing to a new branch for every commit, but it would be very unorganized (it did work well, though). To merge the two branches you would need to download them first, right? I don't really want to use Git, but Azure DevOps gives you unlimited LFS storage, so I'm trying to use it as file storage. Obviously not the intended usage :). – Felix Jan 19 '23 at 20:02
  • You could possibly combine the "new branch for each file" technique with a bare repository and the merge techniques from this question: https://stackoverflow.com/q/7984986/237955 – amphetamachine Jan 19 '23 at 20:11
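
Here is a minimal sketch of the "new branch per file" idea from the comments: commit in a fresh local repo and push to a branch that doesn't exist on the remote yet, so nothing needs to be fetched first. The remote URL, file name, and branch naming scheme are all placeholders; for Azure DevOps LFS storage you would additionally need `git lfs install` and `git lfs track` before committing.

```sh
# Start from an empty local repository -- no clone, no history download.
git init upload-tmp
cd upload-tmp

# Hypothetical Azure DevOps repo URL; substitute your own.
git remote add origin https://dev.azure.com/myorg/myproject/_git/storage

# Stage and commit the file to upload.
cp /path/to/bigfile.bin .
git add bigfile.bin
git commit -m "Add bigfile.bin"

# Push the commit to a brand-new remote branch named after the file.
# Since the branch doesn't exist on the remote, no fetch or
# fast-forward check is needed; only the new objects are uploaded.
git push origin HEAD:refs/heads/uploads/bigfile.bin

# The throwaway local repo can be deleted afterwards.
cd .. && rm -rf upload-tmp
```

Each file then lives on its own branch, and pushing never needs to know what else is on the remote. Consolidating those branches into one would require fetching them somewhere, which is why the comments suggest a server-side or CI process (or the bare-repository merge technique from the linked question) rather than doing it on your PC.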

0 Answers