
Backstory:

  • The server is running Ubuntu 14.04.
  • We want to download everything in Apache's document root (67 GB of data) and put it on an external hard drive, as we are working on a complete redesign of the web site.
  • A first download was made over FTP, but some files ended up corrupted or missing entirely. We found out the hard way.

I was planning this course of action (rough commands sketched below):

  • First compress the whole document root in one big file (with tar).
  • Calculate a checksum of the one big file (with cksum).
  • Download the big file with SSH/SCP.
  • Validate checksum of received file.
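
Concretely, something like this (a sketch only; the document root path /var/www/html, the hostname, and the file names are placeholders for illustration):

    # On the server: pack the whole document root into one archive
    tar -czf /tmp/docroot.tar.gz /var/www/html

    # Record its checksum (cksum prints: CRC, size in bytes, filename)
    cksum /tmp/docroot.tar.gz

    # From the destination machine: download over SCP
    scp user@server.example.com:/tmp/docroot.tar.gz .

    # Validate: the CRC and size fields should match the server's output
    cksum docroot.tar.gz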

The server is currently running, and my main worry is that intensive processes like compression or checksum calculation will overwhelm it and make Apache hang or crash. Is this worry warranted? What can I do to make sure that does not happen?

lampyridae
1 Answer


Well sure, any activity on the server will have an effect on other processes. Whether or not it causes an adverse effect on your web server is something only you can determine, via testing. To answer your question directly: run the intensive processes with nice and ionice to reduce their priority below that of your Apache processes.
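
For example (a sketch; the tar invocation and paths are illustrative, not a prescription):

    # nice 19 = lowest CPU priority; ionice -c3 = "idle" I/O class, so the
    # archive job only gets disk time when nothing else (e.g. Apache) wants it
    nice -n 19 ionice -c3 tar -czf /tmp/docroot.tar.gz /var/www/html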

Honestly, though, this is completely unnecessary. Just copy the document root via rsync. Rsync can be run multiple times, will pick up where it left off if interrupted, and can use checksums to verify that destination files match source files.
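
Something along these lines (a sketch; user, host, and both paths are placeholders):

    # -a preserves permissions/ownership/timestamps, -v is verbose,
    # -P keeps partial transfers and shows progress; safe to re-run
    rsync -avP user@server.example.com:/var/www/html/ /media/external/docroot/

    # Optional verification pass: -c compares full file checksums instead of
    # just size and mtime (slower, since it reads every file on both ends)
    rsync -avPc user@server.example.com:/var/www/html/ /media/external/docroot/

If load is still a concern, you can combine this with the nice/ionice advice above, or throttle the transfer with rsync's --bwlimit option.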

EEAA