
I have a server that hosts three websites, and I have the following bash script that runs the s3cmd sync command daily:

#!/bin/sh
echo 'Started'
date +'%a %b %e %H:%M:%S %Z %Y'
s3cmd sync --recursive --preserve /etc/apache2 s3://mybucket
s3cmd sync --recursive --preserve /var/www/website1.com/public_html s3://mybucket
s3cmd sync --recursive --preserve /var/www/website2.com/public_html s3://mybucket
s3cmd sync --recursive --preserve /var/www/website3.com/public_html s3://mybucket
s3cmd sync --recursive --preserve /var/db_backups s3://mybucket
dpkg --get-selections > dpkg.list
s3cmd sync --recursive --preserve dpkg.list s3://mybucket
date +'%a %b %e %H:%M:%S %Z %Y'
echo 'Finished'

The problem, though, is that the s3cmd sync command creates a single public_html folder in the bucket and puts all three websites into that folder.
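
After a run the bucket ends up looking roughly like this (the file names are just examples):

s3://mybucket/public_html/index.html
s3://mybucket/public_html/css/style.css

so the files from website1.com, website2.com and website3.com all land under the same public_html prefix and overwrite each other.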

What I want to do is have the three sites backed up into separate folders. How can I achieve this with the above commands?

mickburkejnr

1 Answer


You could save each website into a different location within your bucket:

s3cmd sync --recursive --preserve /var/www/website1.com/public_html s3://mybucket/website1/
s3cmd sync --recursive --preserve /var/www/website2.com/public_html s3://mybucket/website2/
s3cmd sync --recursive --preserve /var/www/website3.com/public_html s3://mybucket/website3/
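
If I remember s3cmd's rsync-like handling of trailing slashes correctly, the commands above will still create a public_html level inside each prefix (e.g. s3://mybucket/website1/public_html/...). A minimal sketch of the relevant part of the script, assuming you want the site files directly under each prefix, would be:

# a trailing slash on the source syncs the contents of public_html, not the folder itself
s3cmd sync --recursive --preserve /var/www/website1.com/public_html/ s3://mybucket/website1/
s3cmd sync --recursive --preserve /var/www/website2.com/public_html/ s3://mybucket/website2/
s3cmd sync --recursive --preserve /var/www/website3.com/public_html/ s3://mybucket/website3/

You could give /etc/apache2, /var/db_backups and dpkg.list their own prefixes in the same way (for example s3://mybucket/apache2/ and s3://mybucket/db_backups/, names are up to you) so everything stays separated in the bucket.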
Frederic Henri