
I'm using s3cmd to back up my databases to Amazon S3, but I'd also like to archive a certain folder and back it up as well.

Here is the part of my script that successfully backs up the databases to S3:

# Loop the databases
for db in $databases; do

  # Define our filenames
  filename="$stamp - $db.sql.gz"
  tmpfile="/tmp/$filename"
  object="$bucket/$stamp/$filename"

  # Feedback
  echo -e "\e[1;34m$db\e[00m"

  # Dump and zip
  echo -e "  creating \e[0;35m$tmpfile\e[00m"
  mysqldump -u root -p$mysqlpass --force --opt --databases "$db" | gzip -c > "$tmpfile"

  # Upload
  echo -e "  uploading..."
  s3cmd put "$tmpfile" "$object"

  # Delete
  rm -f "$tmpfile"

done
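One caveat worth noting about the loop above (an aside, not part of the original script): the shell reports only gzip's exit status for the `mysqldump | gzip` pipeline, so a failed dump can still produce, and upload, a truncated file. A minimal sketch of the fix, assuming bash:

```shell
#!/bin/bash
# Sketch: with pipefail set, a failure in any pipeline stage (here
# simulated with `false` standing in for a failing mysqldump) shows
# up in $?, so the upload step can be skipped on error.
set -o pipefail

false | gzip -c > /tmp/pipefail-demo.gz
echo "pipeline status: $?"   # non-zero because the first stage failed
```

Without `set -o pipefail`, the same pipeline would report status 0, since gzip itself succeeds.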

How can I add another section to archive a certain folder, upload it to S3, and then delete the local archive?

Max

1 Answer


Untested and basic, but this should get the job done with some minor tweaks:

# Work from /tmp so the archive file is created there; note that tar
# strips the leading "/" from absolute member paths by default
cd /tmp || exit 1

# Create a timestamped gzip archive of /path/to/directory/to/archive
tar -czf "$stamp-archivename.tar.gz" /path/to/directory/to/archive

# Upload the archive to the S3 bucket 'BucketName'
s3cmd put "/tmp/$stamp-archivename.tar.gz" s3://BucketName/

# Remove the local archive
rm -f "/tmp/$stamp-archivename.tar.gz"
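If you would rather not rely on tar stripping the leading "/", its -C flag changes directory before archiving, so the stored paths come out relative. A self-contained sketch of that variant on a throwaway directory (all paths and names here are illustrative, not from your script):

```shell
#!/bin/sh
# Build a disposable demo tree to archive (hypothetical paths)
demo="/tmp/tar-demo"
mkdir -p "$demo/site"
echo "hello" > "$demo/site/index.html"

stamp=$(date +%Y-%m-%d)
archive="/tmp/$stamp-site.tar.gz"

# -C changes into $demo before archiving, so entries are stored as
# "site/..." instead of "/tmp/tar-demo/site/..."
tar -czf "$archive" -C "$demo" site

# List the stored member paths to confirm they are relative
tar -tzf "$archive"
```

The s3cmd put and rm -f steps stay exactly as above; only the tar invocation differs.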