My goal is to run a daily backup (overwriting the previous day's backup) of specific portions of my database, so that I can easily download it, import it, and restore all my important data if anything goes wrong.
I currently have a 20GB web server (Ubuntu) and my database is ~11GB and growing (slowly), so I know I'll need to fire up a second server to store the backup. (And I'll eventually need to upgrade the primary server once the database approaches 20GB.)
My data is currently set up in a few indexed tables, but I don't need to back up all of it, so I'd like to run a query that selects only what I need and builds a new dump (.sql) from it. That would help keep the size down, but the file is still going to be very large, so I'd also like to compress it. Would gzip be the way to go? It would also neatly package the entire backup into a single file, which is something I need.
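Assuming the database is MySQL/MariaDB, one common approach is to have `mysqldump` export only the tables you need and pipe the output straight into `gzip`, so the uncompressed `.sql` never touches disk. This is a sketch: the database name `mydb`, the tables `users`/`orders`, the `backup` user, and the paths are all placeholders, and the password would normally live in a `~/.my.cnf` rather than the command line.

```shell
# /etc/cron.d/db-backup -- sketch; names and paths are placeholders.
# Listing tables after the database name dumps only those tables;
# --single-transaction takes a consistent InnoDB snapshot without locking;
# --where='...' (not shown) can additionally filter rows.
# Piping into gzip compresses on the fly, and the fixed output filename
# means each run overwrites yesterday's backup.
30 2 * * * root mysqldump --single-transaction -u backup -p'SECRET' mydb users orders | gzip > /backups/mydb-daily.sql.gz
```

Note that `mysqldump` works per table, not per column; to drop columns you'd typically dump a view (or a staging table populated by your `SELECT`) instead of the original table.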
In addition, since I'll probably be using the second server to request the data from the first server, how do I ensure that the request doesn't time out?
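Rather than fetching the dump over HTTP (where a long request can time out), the backup server can pull the finished file over SSH with `rsync` on its own cron schedule. SSH transfers have no request timeout in the HTTP sense, and `rsync --partial` lets an interrupted transfer resume instead of restarting. A sketch, with `primary.example.com` and the paths as placeholders:

```shell
# Cron entry on the *backup* server, scheduled after the dump finishes.
# -a preserves file attributes, -z compresses in transit (mostly a no-op
# here since the file is already gzipped), --partial keeps partial
# transfers so an interrupted copy can resume.
0 4 * * * root rsync -az --partial -e ssh backup@primary.example.com:/backups/mydb-daily.sql.gz /backups/
```

Scheduling the pull a safe margin after the dump's cron time (or having the dump script touch a "done" marker file that the pull checks for) avoids copying a half-written backup.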
TL;DR: I need to run a daily backup of an enormous (10+ GB) database to another server, removing certain tables/columns in the process and compressing the result to optimize disk and bandwidth usage, so that I can easily download and import the backup (a single file) if need be.