Our current deploy process goes something like this:
- Use `grunt` to create production assets, create a datestamp, and point the files at our CDN (e.g. `/scripts/20140324142354/app.min.js`); a simplified sketch of this step follows the list. (Sidenote: I've heard this process called "versioning" before, but I'm not sure if that's the proper term.)
- Commit the build to GitHub.
- Run `git pull` on the web servers to retrieve the new code from GitHub.
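For concreteness, the build step looks roughly like this (a stripped-down, hypothetical Gruntfile; the real one has more tasks, and the source/destination paths here are made up), using `grunt-contrib-uglify` for minification:

```
module.exports = function (grunt) {
  // One timestamp per build, e.g. 20140324142354 (yyyymmddHHMMss).
  var stamp = grunt.template.today('yyyymmddHHMMss');

  grunt.initConfig({
    uglify: {
      app: {
        src: ['src/**/*.js'],
        // Every deploy writes to a fresh, datestamped directory.
        dest: 'public/scripts/' + stamp + '/app.min.js'
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.registerTask('build', ['uglify']);
};
```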
This is a node.js site and we are using `forever -w` to watch for file changes and restart the app accordingly. We have a route set up in our app to serve the latest version of the app via `/scripts/*/app.min.js`.
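Simplified, that route is something like the following (assuming Express here; the directory layout and lookup logic are illustrative):

```
var express = require('express');
var fs = require('fs');
var path = require('path');

var app = express();
var SCRIPTS_DIR = path.join(__dirname, 'public/scripts');

// The datestamp segment only exists to bust caches, so ignore it:
// whatever stamp the CDN asks for, answer with the newest build on disk.
app.get('/scripts/*/app.min.js', function (req, res) {
  var builds = fs.readdirSync(SCRIPTS_DIR).sort(); // stamps sort lexicographically
  res.sendFile(path.join(SCRIPTS_DIR, builds[builds.length - 1], 'app.min.js'));
});

app.listen(3000);
```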
We version like this because our CDN is set to cache JavaScript files indefinitely; the new path purposely creates a cache miss so that the updated code propagates to the CDN (and to our users' browsers).
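Concretely, each page references the stamped URL, something like this (the CDN hostname here is made up):

```
// BUILD_STAMP is baked in at build time; a new deploy means a new stamp,
// so the URL below has never been cached by the CDN or any browser.
var BUILD_STAMP = '20140324142354';
var scriptTag = '<script src="//cdn.example.com/scripts/' +
                BUILD_STAMP + '/app.min.js"></script>';
```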
This works fine most of the time, but it breaks down when one of the servers lags behind in checking out the new code.
Sometimes a client hits the page while a deploy is in progress and requests the new JavaScript file from the CDN. The CDN tries to fetch it but hits a server that hasn't finished checking out the new code yet, and it caches an old or partial file, causing all sorts of problems.
This problem is exacerbated by the fact that our CDN has many edge locations and so the problem isn't always immediately visible to us from our office. Some edge locations may have pulled down old/bad code while others may have pulled down new/good code.
Is there a better way to do these deployments that will avoid this issue?