
I'm trying to back up 66 gigabytes to a NAS by making a tarball, but "tar -cSf ..." quits with a "memory exhausted" error after about 62 gigabytes. My Linux machine has one gigabyte of RAM and one gigabyte of swap space.

(edit) I restarted the backup from around the point where tar gave up, and it quickly failed again, so it looks like it may be having trouble dealing with a particular special file.

This data is surprisingly resistant to being backed up. rsync is about four times slower than tar because the NAS isn't very fast, and it quits partway through with 'connection reset by peer'; cp doesn't work well on the CIFS share because it can't create the special files. Is there a better way?

joeforker
  • What is the makeup of this data? Wide directory tree? Deep? Many thousands of files in a single directory, or many millions of files in total? – Dave Cheney May 15 '09 at 12:16
  • It is an ordinary collection of home directories. – joeforker May 15 '09 at 12:56
  • What does find | wc -l give you? My theory is that an inordinate number of files is causing tar to run out of memory in some internal structure. – Dave Cheney May 15 '09 at 13:16
  • Do any of the directories contain special (block or character device) files? – Tim May 15 '09 at 17:26

3 Answers


I don't know why it's running out of memory, but you could try splitting the backup into a multi-volume archive, something like

tar -c -M -L 31457280 -f piece1.tar -f piece2.tar -f piece3.tar <path>

which will start a new volume every 30 GB (-L is given in units of 1024 bytes). If your data ever grows past 90 GB, you'd need to add a fourth file (-f piece4.tar).

If this still fails, you could try smaller pieces and write a script to generate the command line (a command line with 80 -f arguments would be a pain to type by hand :-) ), along the lines of the sketch below.
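A rough sketch of such a script, assuming a POSIX shell and 10 GB pieces (the piece names, the 8-volume count, and <path> are only illustrative):

#!/bin/sh
# Build one -f argument per 10 GB volume, then run a single multi-volume tar.
# -L is in units of 1024 bytes, so 10485760 means 10 GB per volume.
args=""
i=1
while [ $i -le 8 ]; do
    args="$args -f piece$i.tar"
    i=$((i + 1))
done
tar -c -M -L 10485760 $args <path>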

Vinko Vrsalovic

Try passing the --hard-dereference option. This disables hard-link tracking, which requires memory proportional to the number of inodes being backed up. It might also be interesting to run strace on the tar process while it's attempting to back up the problem file.
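For example (GNU tar; the /mnt/nas mount point and /home source path here are just assumptions):

tar --hard-dereference -cf /mnt/nas/backup.tar /home
strace -f -o /tmp/tar.trace tar -cSf /mnt/nas/backup.tar /home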

bdonlan

-S does extra checking for sparse files (files where not all extents are actually physically allocated on disk), and that bookkeeping could be what is running out of memory. Try running without the -S (add compression instead if you really want to save space) and see if this fixes the problem.

tar cf foo.tar *

or

tar czf foo.tar.gz *
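
If you want to check whether sparse files are actually present before giving up -S, GNU find can report a sparseness ratio per file (values well below 1 usually indicate a sparse file); this is a rough check, and the /home path is just an assumption:

find /home -type f -printf '%S %p\n' | awk '$1 < 0.9'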