
I'm using an external network drive to back up my git server (a Debian system, git version 1.7.10.4). In short, the external backup drive is mounted with

mount -t cifs //external/backup/drive/path /media/drive/ -o user=a_user,password=a_password

For new repos, my script does

sudo git clone --mirror file:///new_repo

For existing repos

sudo git remote update
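
For reference, the whole pass boils down to something like the following. This is only a simplified sketch; /srv/git stands in for the actual location of the bare repos on the server, and the real script has more error handling.

#!/bin/sh
# Simplified sketch of the backup pass; /srv/git and the layout are placeholders.
SRC=/srv/git          # bare repos on the git server
DST=/media/drive      # CIFS-mounted backup drive

for repo in "$SRC"/*.git; do
    name=$(basename "$repo")
    if [ -d "$DST/$name" ]; then
        # existing backup: fetch whatever changed since the last run
        (cd "$DST/$name" && sudo git remote update)
    else
        # new repo: create a fresh mirror clone on the backup drive
        (cd "$DST" && sudo git clone --mirror "file://$repo")
    fi
done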

Everything used to go fine, but recently I started getting errors on new repos:

fatal: Could not make /media/drive/new_repo.git/refs writable by group

and on some of the old ones

error: insufficient permission for adding an object to repository database ./objects

I have checked that the user (root) read/write permissions are consistent across the various repos, both on the server and on the backup drive:

drwxr-xr-x 0 root root 4096 Nov 10 12:18 repo1.git
drwxr-xr-x 0 root root 4096 Nov 10 12:18 repo2.git
drwxr-xr-x 0 root root 4096 Nov 10 12:20 repo3.git
drwxr-xr-x 0 root root 4096 Nov 10 12:25 repo4.git

and didn't find any issue. I know the files are not group-writable, but since it works for 80% of the repos, I suspect the problem is not there.

As suggested in related topics, I have also checked that core.sharedRepository is set to true, without success.
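
(In case it matters, this is how I check and set it per repository; the repo path is just an example:)

git --git-dir=/media/drive/repo1.git config --get core.sharedRepository
git --git-dir=/media/drive/repo1.git config core.sharedRepository true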

I suspected a mismatch in rights between root (which runs the backup script) and the login used to mount the drive, but then why would I get different behavior for different repos?
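
(For the record, this is the kind of comparison I mean, nothing more sophisticated than:)

# ownership and mode the CIFS mount actually reports
stat -c '%U:%G %a' /media/drive
ls -ld /media/drive/repo1.git/refs

# account running the backup script
id root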

Please bear with me if my question is not original enough; I have searched this site and the rest of the internet intensively without success. This is really starting to drive me nuts.

With best regards

Rheve
  • This question: http://stackoverflow.com/questions/750765/concurrency-in-a-git-repo-on-a-network-shared-folder addresses the main issues with using a network share to host a repository, and offers some solutions, hope it helps. – orbrey Nov 17 '14 at 15:58
  • Thanks orbrey, I saw this one. However my goal is not to host my repos on a shared drive, but to make a backup copy to respect "business continuity" requirements of my customers. I could do a stupid copy (it works) but it takes ages compared to a "git remote update" :) – Rheve Nov 17 '14 at 17:01
  • In that case I'd suggest setting something up in cron - maybe not just a copy (as you say it'd take ages) but maybe an incremental rsync so only the changes are copied over? You could conceivably set up a hook that would kick it off after each commit but daily (or even hourly, depending on how busy the repo is) rsyncs would keep things up to date. That way you wouldn't have to deal with the possible git/cifs issues. – orbrey Nov 17 '14 at 17:09

0 Answers