
I am at 98% capacity on an 8 GB SSD, and I have about 3 GB of MySQL data that I need to gzip and/or tar and then download, so that I can delete it.

How can I tar or gzip the SQL (or the MySQL tables) so that I don't have to make a full copy of the 3 GB of data and run into disk-full errors?

Running Debian 6.

ParoX
  • Can you mount some network file system and put it there? – cjc Jan 08 '13 at 21:57
  • You do know that tar doesn't compress the file, right? A gzipped file will be smaller than the original, but a tar file will be bigger (unless the tar file is also gzipped). – Andy Lester Jan 08 '13 at 22:00
  • Yes I am aware, but if I tar the mysql /var/lib folder then I can just mv it to /var/www and wget it from another server. – ParoX Jan 08 '13 at 22:13
  • Since you have access to another server, why don't you just `rsync` or `scp` to that other server without moving files around? (Or creating a new archive.) – Aaron Copley Jan 08 '13 at 22:34
  • Transferring that much data to your average 4 GB stick would probably be painfully slow. An external drive with a decent interface would be better, if the add-a-drive route is preferable. – Bart Silverstrim Jan 11 '13 at 13:56
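
A rough sketch of the rsync approach suggested in the comments (the destination host and path are hypothetical): it copies the files straight to the other server without building an archive on the nearly full disk. The MySQL server should be stopped first so the raw files are consistent.

# stop MySQL so the data files are not changing mid-copy
/etc/init.d/mysql stop
# copy the data directory to the other server, compressing in transit (-z)
rsync -avz /var/lib/mysql/ user@192.168.0.1:/path/to/backup/mysql/
# bring MySQL back up
/etc/init.d/mysql start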

1 Answer


Use the mysqldump utility and direct the output wherever you like:

mysqldump -A -u[username] -p[password] > /path/to/dest/backupname.sql
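
Here -A is short for --all-databases, and -u/-p supply the MySQL credentials. If only a single database needs to be saved, the same command can be scoped to it; a sketch with a hypothetical database name:

mysqldump -u[username] -p[password] your_database > /path/to/dest/backupname.sql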

If you need to, you can pipe the output through gzip:

mysqldump -A -u[username] -p[password] | gzip -c > /path/to/dest/backupname.gz
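
The compressed dump can later be loaded back by piping it into the mysql client; a minimal sketch, assuming the same hypothetical credentials and path:

gunzip -c /path/to/dest/backupname.gz | mysql -u[username] -p[password]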

Further, you can send the output from gzip to another server via ssh:

mysqldump -A -u[username] -p[password] | gzip -c | ssh user@192.168.0.1 'cat > /path/to/dest/backupname.gz'
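
If you would rather copy the raw /var/lib/mysql files (as mentioned in the comments on the question) instead of a logical dump, the same pipe-over-ssh idea avoids writing a local archive; a rough sketch, assuming MySQL is stopped first so the copied files are consistent:

# stop MySQL so the data files are not changing during the copy
/etc/init.d/mysql stop
# stream a gzipped tar of the data directory straight to the other server
tar czf - /var/lib/mysql | ssh user@192.168.0.1 'cat > /path/to/dest/mysql-datadir.tar.gz'
# bring MySQL back up
/etc/init.d/mysql start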
nedm