
I am using s3cmd to back up my files from S3. I run the following command from my terminal to sync the files in my_bucket with a local folder called workfiles in my home directory:

s3cmd sync s3://my_bucket/ /home/mody/workfiles/ --delete-removed

This is a manual method, and it will be painful if the volume is big, so I am wondering if it's possible to do some kind of automatic backup to a server like DigitalOcean or some other cloud service, and how to do that. Any code snippet would be appreciated! Thanks

medBouzid
  • What about setting it up as a cron job? – Uzbekjon Jun 19 '16 at 15:33
  • @Uzbekjon yeah, but I don't want to back up files to my local disk! I am asking how to do backups to another web server or another cloud storage service – medBouzid Jun 19 '16 at 15:35
  • Just setup a cron job on that server and you're done. Or, am I missing something? – Uzbekjon Jun 19 '16 at 15:40
  • @medBo as part of your question you asked if it was possible to "backup to a server like digitalocean" and setting up a cron job on a digitalocean server to pull from S3 would be the way you would accomplish that. Not sure why you took the suggestion of using a "cron job" as an answer that only applies to "local disk". – Mark B Jun 19 '16 at 15:40
  • @Uzbekjon oh sorry, I don't have any idea, as this is the first time I am going to do such a thing. I heard about cron jobs before but I've never used them :) so from what I understand it will be like I am using a local computer! – medBouzid Jun 19 '16 at 15:43
  • @MarkB I thought it would be different from doing it locally!! I have no previous experience with this stuff, really!! – medBouzid Jun 19 '16 at 15:44

2 Answers


Cherry Software has a cloud solution for running storage backups that might be of use to you. It does Azure to S3 and back, I believe.

https://www.cherrysafe.com/Home/Features#storageBackup

I have not tried it myself, but it is a spin-off company from Redgate, so it has a good pedigree.

JamesKn
  • Actually, as the original developer of Cherry Safe at Redgate and now its owner, I can confirm that it can do S3 to Azure blob backups too. There are a few restrictions around how metadata is translated, as Azure metadata is more restrictive than S3. – Richard Mitchell Jun 20 '16 at 06:53

As per our discussion, you may set up a cron job to schedule tasks on your local/remote server. Different Linux distributions may have different crontab files set up for you. For example, according to this article, DigitalOcean's Ubuntu instances come with hourly, daily, etc. crontab folders set up for your convenience.

So, you can create a file with your command in the /etc/cron.daily folder:

s3cmd sync s3://my_bucket/ /path/to/backupfolder/ --delete-removed

Make sure to install s3cmd (from s3tools) on your remote machine as well.
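
For example, a minimal sketch of such a file (the script name s3-backup and the local path /home/mody/workfiles/ are placeholders; s3cmd must already be configured, e.g. via s3cmd --configure, for the root user, since cron.daily runs as root):

#!/bin/sh
# /etc/cron.daily/s3-backup -- pull the bucket down once a day and
# delete local copies of files that were removed from S3
s3cmd sync s3://my_bucket/ /home/mody/workfiles/ --delete-removed

Make the file executable and, on Ubuntu, leave the extension off, since run-parts skips file names containing dots:

sudo chmod +x /etc/cron.daily/s3-backup

If you need a precise schedule instead of "roughly once a day", an entry in the root crontab (crontab -e) such as 0 3 * * * s3cmd sync s3://my_bucket/ /home/mody/workfiles/ --delete-removed would run the sync at 03:00 every night.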

Uzbekjon
  • Thank you, I got it. I will also be able to set up an HTTP server, so if S3 goes down I will just replace the path to my images with my DigitalOcean URL. For example, if an image URL is `s3-eu-west-1.amazonaws.com/images/image1.png` then it will become `http://myserver.com/images/image1.png` in an emergency, right? – medBouzid Jun 19 '16 at 16:22
  • Yep, but don't forget to install and set up a webserver on that machine as well (nginx would be a good choice for your case). – Uzbekjon Jun 19 '16 at 16:32
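
Following up on the comment above, here is a minimal sketch of that webserver setup, assuming an Ubuntu droplet with nginx's default sites-available/sites-enabled layout; myserver.com, the site name s3-mirror, and the path /home/mody/workfiles are placeholders:

sudo apt-get install -y nginx

# write a server block that serves the synced folder as static files
sudo tee /etc/nginx/sites-available/s3-mirror > /dev/null <<'EOF'
server {
    listen 80;
    server_name myserver.com;      # placeholder domain
    root /home/mody/workfiles;     # folder kept in sync by the cron job
}
EOF

sudo ln -s /etc/nginx/sites-available/s3-mirror /etc/nginx/sites-enabled/s3-mirror
sudo nginx -t && sudo systemctl reload nginx

With this in place, a file synced from s3://my_bucket/images/image1.png would be reachable at http://myserver.com/images/image1.png, as long as the cron job keeps the folder up to date.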