I am tarring and then compressing a bunch of files and directories on my Ubuntu Server VPS for a backup. It only has 1 GB of RAM and 128 MB of swap (I can't add more, since OVH uses OpenVZ as their virtualisation software), and every time tar runs it uses a ton of memory for its buffer, causing everything else to get swapped out, even when using nice -n 10.
Is there any way to force tar to use a small buffer and reduce its memory usage? I am worried that once the backup reaches a certain size, my server will go down because tar won't have enough memory for its buffer.
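For reference, the only buffer-related option I can find in GNU tar's man page is the blocking factor (-b / --blocking-factor, counted in 512-byte blocks, default 20). I don't know whether that record buffer is actually the memory hog, but a smaller value would look like this:

# Sketch: -b 4 makes tar's record size 4 x 512 bytes = 2 KiB
# instead of the default 20 x 512 bytes = 10 KiB.
tar -b 4 -cf archive.tar /home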
I am using bzip2 to compress, and I have already limited its memory usage with the -4 option.
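In case the exact compression step matters, it is essentially this (a sketch of my setup; the filename is just an example):

# bzip2's man page gives compression memory as roughly 400k + (8 x block size),
# so -4 (400 kB blocks) needs about 3.6 MB instead of ~7.6 MB at the default -9.
nice -n 10 bzip2 -4 archive.tar    # replaces archive.tar with archive.tar.bz2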
Edit: Here is what htop looks like when I have had tar running for a while:
Here is a link to the full gif
Edit 2: Here is the tar command I am using:
nice -n 20 tar --exclude "*node_modules*" --exclude "*.git/*" --exclude "/srv/www-mail/rainloop/v*" -cf archive.tar /home /var/log /var/mail /srv /etc
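For what it's worth, here is a sketch of how that command could be combined with the bzip2 step in a single pipe; the -b 4 value and the archive.tar.bz2 name are just guesses on my part, not what I currently run:

# Sketch only: stream tar straight into bzip2 -4 so no uncompressed archive.tar
# has to sit on disk; -b 4 shrinks tar's record buffer from 10 KiB to 2 KiB.
nice -n 20 tar --exclude "*node_modules*" --exclude "*.git/*" \
    --exclude "/srv/www-mail/rainloop/v*" \
    -b 4 -cf - /home /var/log /var/mail /srv /etc \
  | bzip2 -4 > archive.tar.bz2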