I have set up an Auto Scaling group (from a launch configuration) with a customised Debian 9 image in AWS, where I have preinstalled software packages such as Apache, Memcache and Chrony, along with the PHP codebase and other required packages. This works fine until I change the PHP codebase, which is modified manually on the server at least once every day.

Whenever auto scaling is triggered by the predefined metrics (memory load, CPU and so on), one more server is spun up. This newly created server must have the updated codebase so that both servers are in sync.

Question: How can this be achieved without simply copying files from one server to another once the second server is up and running?

One way would be to rsync the files under /var/www/* from the old server to the newly created one, but I don't believe that is the best solution. Taking snapshots every hour is not an option either, since it would increase operational costs.

What would be the best way to update the custom image whenever the PHP codebase changes? Could someone suggest or recommend an approach? I believe there are experts in this community who have done similar work in AWS.

bhordupur

3 Answers


Put your PHP code on Amazon S3. Stop updating the EC2 instance manually.

When your EC2 instance launches, have it download the PHP code from S3. This way, all newly launched EC2 instances will have updated code.
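
For example, the download can happen in the instance's user data. A minimal sketch, assuming a hypothetical bucket named my-app-code, an instance profile that allows s3:GetObject and s3:ListBucket on it, and the AWS CLI baked into the AMI:

```bash
#!/bin/bash
# User-data sketch: pull the current PHP code from S3 on first boot.
# "my-app-code" is a placeholder bucket name.
aws s3 sync s3://my-app-code/current/ /var/www/html/ --delete
chown -R www-data:www-data /var/www/html
systemctl restart apache2
```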

When you update your PHP code:

  1. Update your PHP code in S3,
  2. Launch a new EC2 instance (it will get the new code),
  3. Delete the old EC2 instance.
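
A rough sketch of that release flow from the CLI, again with the hypothetical my-app-code bucket and a placeholder instance ID:

```bash
# 1. Upload the new code to S3.
aws s3 sync /path/to/release/ s3://my-app-code/current/ --delete

# 2. Terminate the old instance without decrementing desired capacity;
#    the Auto Scaling group launches a replacement, which pulls the
#    new code via its user-data script.
aws autoscaling terminate-instance-in-auto-scaling-group \
    --instance-id i-0123456789abcdef0 \
    --no-should-decrement-desired-capacity
```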

Another option is to use Elastic Beanstalk for your PHP application instead of managing the EC2 instances yourself.
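
With Elastic Beanstalk the deploy step becomes a single EB CLI call. A sketch, assuming the EB CLI is installed and run from the project root (application name, environment name and region are placeholders):

```bash
eb init my-app --platform php --region eu-west-1   # one-time setup
eb create my-app-env                               # one-time environment creation
eb deploy                                          # packages and deploys the current code on each release
```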

Matt Houser

If you have the code in a repo, then it makes sense to get the code directly from the repo. This approach has a benefit over keeping the code on S3: you pull only the changes made since the AMI was created, instead of the whole project directory. Faster and cheaper.
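
For instance (the asker mentions SVN in the comments below), a user-data sketch with a placeholder repository URL; if the working copy is already baked into the AMI, only an update is needed:

```bash
#!/bin/bash
# User-data sketch: refresh the working copy at boot.
# The repository URL is a placeholder; credentials must be handled
# separately (e.g. a read-only account baked in securely).
if [ -d /var/www/html/.svn ]; then
    svn update /var/www/html   # AMI already has a working copy: pull only the changes
else
    svn checkout https://svn.example.com/repos/my-app/trunk /var/www/html
fi
chown -R www-data:www-data /var/www/html
systemctl restart apache2
```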

Putnik
  • I have the code in a repo in SVN, but no automated build is in place yet. Planning to build a CI/CD pipeline; right now doing a PoC before moving production to AWS. – bhordupur Jan 15 '18 at 22:06
  • @bhordupur then don't forget to check out CodePipeline; it may come in handy. – Putnik Jan 16 '18 at 13:46

There are quite a few ways you could do this:

  • Put all your code on EFS, and mount that on your servers (easiest, but performance may suffer if you are using a large/complex app that includes a lot of modules; you can use PHP's OPcache to mitigate this; see the mount sketch after this list).
  • Bake your code into a new AMI on each release
  • Do a git pull on bootup
  • Set up CodeBuild to build the code and upload it to S3, and use CodePipeline to deploy it to your EC2 instances. You can automate this to run on every GitHub push (here's an article I've written on using CodeBuild with PHP).
  • Use Docker and ECS rather than plain EC2 (especially the new ECS Fargate, which lets you ditch the EC2 instances entirely)
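
For the EFS option, mounting is a single NFS mount. A sketch with a placeholder file system ID and region; the instance's security group must allow NFS (TCP port 2049) to the mount target:

```bash
# Mount the EFS file system at /var/www; fs-12345678 and us-east-1
# are placeholders.
sudo mount -t nfs4 -o nfsvers=4.1,rsize=1048576,wsize=1048576 \
    fs-12345678.efs.us-east-1.amazonaws.com:/ /var/www
```

To persist across reboots, the same mount would go in /etc/fstab.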

If your app uses sessions, you'll need to share them between your instances; the best way in our experience is to use DynamoDB.

womble