I am attempting to set up automatic backups for a server running Ubuntu. The server hosts a couple of applications that are rather write-heavy and see a good amount of traffic; specifically, a Solr index and a MySQL database are maintained on it and constantly written to. The server currently holds around 20 GB of data, but within a year that should grow to at least 5 TB, so while there isn't much data right now, the solution needs to scale.
I want to back up the whole server, including the Solr and MySQL indexes, but I am unsure how to go about it. So far I've set up Tartarus to do an incremental backup, but I've realised that this tool alone is problematic because the server doesn't have LVM enabled: files are written to while they are being backed up, so I end up with corrupt/inconsistent backups. Is it a correct assumption that what I want is the ability to take an LVM snapshot of the system and then back that snapshot up?
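For context, this is roughly what I mean by "snapshot, then back up". It's only a sketch under the assumption that the filesystem sat on a logical volume; the volume group/LV names (`vg0`, `root`) and paths are made up, since my server doesn't actually have LVM yet:

```python
#!/usr/bin/env python3
"""Rough sketch of an LVM snapshot backup (hypothetical names throughout)."""
import subprocess

VG = "vg0"                  # hypothetical volume group
LV = "root"                 # hypothetical logical volume
SNAP = "backup_snap"
MOUNTPOINT = "/mnt/backup_snap"
ARCHIVE = "/backups/root-snapshot.tar.gz"

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Freeze a point-in-time view of the filesystem.
run(["lvcreate", "--size", "5G", "--snapshot",
     "--name", SNAP, f"/dev/{VG}/{LV}"])
try:
    # 2. Mount the snapshot read-only and archive it; files can keep
    #    changing on the live volume without affecting the backup.
    run(["mkdir", "-p", MOUNTPOINT])
    run(["mount", "-o", "ro", f"/dev/{VG}/{SNAP}", MOUNTPOINT])
    try:
        run(["tar", "-czf", ARCHIVE, "-C", MOUNTPOINT, "."])
    finally:
        run(["umount", MOUNTPOINT])
finally:
    # 3. Drop the snapshot so it doesn't fill up with copy-on-write data.
    run(["lvremove", "-f", f"/dev/{VG}/{SNAP}"])
```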
I've also entertained the idea of splitting the backup into several parts: one responsible for Solr, one for MySQL, and one for the rest of the system. Is this a viable option?
-- UPDATE --
I ended up opting for the approach where I split the backup. I back up Solr using replication and then FTP the result to the backup server. For MySQL I use mysqldump for now and FTP that as well. For the rest of the data I use Tartarus with an incremental backup.
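Roughly, the glue for the Solr and MySQL parts looks like the sketch below. The Solr core name, database name, credentials and backup host are placeholders rather than my real config; the Solr call uses the stock ReplicationHandler `backup` command, and `--single-transaction` keeps the mysqldump consistent for InnoDB tables without locking everything:

```python
#!/usr/bin/env python3
"""Sketch of the split backup: Solr via replication backup, MySQL via mysqldump,
both shipped to the backup server over FTP. All names/credentials are placeholders."""
import subprocess
from ftplib import FTP
from pathlib import Path
from urllib.request import urlopen

DUMP_FILE = Path("/tmp/appdb.sql")

# 1. Ask Solr's replication handler to snapshot the index to disk.
urlopen("http://localhost:8983/solr/mycore/replication"
        "?command=backup&location=/var/backups/solr").read()

# 2. Dump MySQL to a file.
with open(DUMP_FILE, "wb") as out:
    subprocess.run(
        ["mysqldump", "--single-transaction",
         "-u", "backupuser", "-psecret", "appdb"],
        stdout=out, check=True)

# 3. Ship the dump to the backup server over FTP.
with FTP("backup.example.com") as ftp:
    ftp.login("backupuser", "secret")
    with open(DUMP_FILE, "rb") as fh:
        ftp.storbinary(f"STOR {DUMP_FILE.name}", fh)

# The rest of the filesystem is still handled separately by Tartarus.
```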