Backstory:
- The server is running Ubuntu 14.04.
- We want to download everything in Apache's document root (about 67 GB of data) and copy it to an external hard drive, as we are working on a complete redesign of the website.
- A first download was made over FTP, but some files ended up corrupted or missing entirely. We found out the hard way.
I was planning the following course of action (rough sketch after the list):
- First compress the whole document root into one big file (with tar).
- Calculate a checksum of that file (with cksum).
- Download the big file with SSH/SCP.
- Validate the checksum of the received file.
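
For reference, this is roughly what I had in mind; the host, user, and paths below are placeholders for our actual setup:

    # On the server: archive the document root (paths are examples)
    tar -czf /tmp/docroot.tar.gz -C /var/www html

    # Record the CRC and byte count; run from /tmp so the file name
    # stored in the checksum file matches the one recomputed locally
    ( cd /tmp && cksum docroot.tar.gz > docroot.tar.gz.cksum )

    # On the workstation: fetch the archive and its checksum file
    scp user@server:/tmp/docroot.tar.gz user@server:/tmp/docroot.tar.gz.cksum .

    # Recompute locally and compare against the recorded value
    cksum docroot.tar.gz | diff - docroot.tar.gz.cksum && echo OK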
The server is currently in use, and my main worry is that intensive processes like compression or checksum calculation will overwhelm it and cause Apache to hang or crash. Is that worry warranted? What can I do to make sure this does not happen?
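
In case it helps frame answers: the mitigation I had in mind was to wrap the heavy commands in nice and ionice (both ship with Ubuntu 14.04) and to cap SCP's bandwidth with its -l flag. A minimal sketch, again with placeholder paths; would something like this be enough?

    # Lowest CPU priority (nice 19) and idle I/O class (ionice -c3),
    # so Apache's processes get scheduled first
    nice -n 19 ionice -c3 tar -czf /tmp/docroot.tar.gz -C /var/www html
    nice -n 19 ionice -c3 cksum /tmp/docroot.tar.gz

    # On the workstation: limit the transfer to ~10 Mbit/s
    # (scp's -l flag takes a limit in Kbit/s)
    scp -l 10000 user@server:/tmp/docroot.tar.gz .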