I've got a Linux server that I can only access remotely. I want to back it up, and the backup will have to travel over the Internet; I've chosen Google Drive to hold the backups. The only piece of the puzzle I'm still missing is how to package and compress all the files. I want compression because space on Google Drive is limited, and it would also reduce upload times.
I could of course use the standard tar + gzip/bzip2, or zip, or maybe even something fancier like 7z for the best compression ratio.
But here's what I'm wondering about: many of the files that will need to be backed up are things like JPEG images, which hardly compress at all, no matter which compressor I use. It would be faster to copy those files into the target archive as-is rather than compress them. Other files are text files, which compress better with a specialized algorithm (can you tell yet that I'm backing up websites?).
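To make it concrete, here's roughly the behavior I'm after, sketched in Python with the standard `zipfile` module (the paths and the extension list are just placeholder examples, not my actual setup):

```python
import os
import zipfile

# Extensions of files that are already compressed and gain nothing from deflate.
STORE_AS_IS = {".jpg", ".jpeg", ".png", ".gif", ".zip", ".gz", ".mp3"}

def backup(source_dir: str, archive_path: str) -> None:
    with zipfile.ZipFile(archive_path, "w") as archive:
        for root, _dirs, files in os.walk(source_dir):
            for name in files:
                path = os.path.join(root, name)
                ext = os.path.splitext(name)[1].lower()
                # Store already-compressed formats verbatim; deflate everything else.
                method = zipfile.ZIP_STORED if ext in STORE_AS_IS else zipfile.ZIP_DEFLATED
                archive.write(path,
                              arcname=os.path.relpath(path, source_dir),
                              compress_type=method)

if __name__ == "__main__":
    backup("/var/www", "backup.zip")  # hypothetical paths
```

I could script something like this myself, but I'd rather use an existing, battle-tested tool if one exists.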
Is there some kind of archiver that recognizes such files (by file extension would be fine) and applies a different algorithm to them? I think I've seen one somewhere, but I can't remember which one it was or whether it has a Linux version.
Or perhaps I'm overthinking this?