
I would like to clarify that I am not a sysadmin pro in any way, and maybe my question has already been answered, but as far as I have searched on Google/Server Fault/SO, I didn't find anything similar.

Let's say we have two single files that we want to update on one production server. We can do that with SSH and Git. AFAIK, the files get uploaded one by one. Assume now that we want these two files to be uploaded at once, and that by no means do we want a user to face the unlucky combination of file1UpdatedVersion and file2OldVersion.

So I assume we have to shut down the server, upload the files, run some tests to confirm that everything is working properly, and start the server again.

Is there any automation software to which we can send a list of files to upload, feed it some test cases that define when the update is considered successful, and have it shut down, update, and restart the server, giving us minimal downtime?

Thanks for your time!

Themis Beris

1 Answer


There are a lot of ways to automate deployments with minimal downtime. Depending on your current setup, different methods can be applied. A good start could be to make your actual release step the change of a symlink.

Let's say you start off with your app v1 located in /var/www/app/v1, and a symlink named /var/www/app/current pointing to the v1 folder. You configure your web server to use the current folder as its document root. When you're about to release v2, you upload it to /var/www/app/v2 and then change the current symlink to point to v2, as sketched below. Using this method you will get minimal (if any) downtime, and users won't end up in a state where different versions are served at the same time.
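The key detail is that the symlink swap itself must be atomic, otherwise there is a brief window where current doesn't exist. Here is a minimal sketch of that idea in Python; the paths follow the /var/www/app layout above, and the switch_release helper name is just for illustration. It relies on rename(2) (exposed as os.replace) atomically replacing the destination path on POSIX systems.

```python
import os

APP_ROOT = "/var/www/app"  # layout from the example above

def switch_release(version: str) -> None:
    """Atomically repoint the 'current' symlink at a new release folder."""
    target = os.path.join(APP_ROOT, version)    # e.g. /var/www/app/v2
    current = os.path.join(APP_ROOT, "current")
    tmp = current + ".tmp"

    # Build the new symlink under a temporary name first.
    if os.path.lexists(tmp):
        os.remove(tmp)
    os.symlink(target, tmp)

    # Then rename it over 'current'. rename(2) is atomic on POSIX, so
    # every request sees either the old symlink or the new one, never a
    # missing path, and never a mix of v1 and v2 files.
    os.replace(tmp, current)

if __name__ == "__main__":
    switch_release("v2")
```

This also covers the testing part of your question: since v2 sits in its own folder before the swap, you can run your test cases against it in place (for example via a second vhost pointing at /var/www/app/v2) and only perform the swap once they pass. Rolling back is just another swap, back to v1.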

To avoid doing this manually, there are tools available which do this for you; Capistrano, for example, is a well-known tool built around exactly this releases/current symlink layout.

If you want to take your deployment process a step further, I would recommend looking into the immutable server concept or tools such as Docker.

Bazze