
I have a linux server with a lot of unused files, but it is low on disk space. I'd like to take a copy of the files before deleting them. There is not enough disk space to store the compressed version before downloading.

I've tried scp -C hostname:/path ., which streams the files down with compression, but there are lots of tiny files, so the copy is taking a long time.

Another thread suggested a command like ssh 10.0.0.4 "cat /tmp/backup.sql | gzip -c1" | gunzip -c > backup.sql, but that only works for a single file.

Are there other methods to achieve it?

NoChecksum

1 Answer


How about

tar cf - /source | ssh 10.0.0.4 "gzip > /destination/foo.tgz"

but anything involving many small files will take a fair bit of time, more so if they're in a shallow directory structure; it's just the nature of the beast.
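Since the question pulls files down rather than pushing them up, here is a hedged sketch of the same idea run from the local machine: tar and gzip stream on the server, and the compressed archive lands locally without ever being written to the server's disk. "hostname" and /path are placeholders for your server and source directory.

```shell
# Run this on the LOCAL machine.
# The remote side streams a tar archive through gzip; the compressed
# stream is written to a local file, so no space is used on the server.
ssh hostname "tar cf - /path | gzip -c1" > backup.tgz
```

Because a single tar stream replaces one scp round trip per file, the per-file overhead that slows scp -C down on many tiny files largely disappears.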

MadHatter