Need to set up infrastructure for a new project. Previously I have used standalone Puppet with Jenkins, but now I'm thinking about incorporating Docker builds, so that I could push from dev to staging to production without triggering a build, simply by fetching existing Docker images that have already been built.
The app:
- Java web app with a REST API, backed by PostgreSQL, Neo4j and Elasticsearch
- Client-side app written in Angular that talks to the Java app through the REST API
- Code stored in Git repositories
Envs:
- Dev server (building, dev + test environments) - a 32 GB Linux machine
- Test server (AWS)
- Production (AWS)
Setup:
So basically I was thinking of something like this:
- Separate Docker images for the Java + client-side app, PostgreSQL, Elasticsearch and Neo4j that talk to each other and store their data on the host through Docker volumes, or via Docker data containers (I have not decided on the approach yet; see the first sketch after this list)
- Jenkins building all the code and creating Docker images that would be pushed to a private internal registry (see the second sketch after this list)
- Integration tests run with the Puppet Docker module on the dev server
- Push to production with Jenkins via Puppet, using Docker
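
For the data question, the two approaches I'm weighing look roughly like this (image names and paths are placeholders, not a settled design):

    # Host-volume approach: data lives on the host, so the container
    # can be removed and recreated freely.
    docker run -d --name postgres \
        -v /srv/myapp/pgdata:/var/lib/postgresql/data \
        postgres

    # Data-container approach: an inert container owns the volume,
    # and the real database container mounts it with --volumes-from.
    docker run --name pgdata -v /var/lib/postgresql/data busybox true
    docker run -d --name postgres --volumes-from pgdata postgres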
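
The Jenkins build step would then boil down to something like this (the registry address and image name are made up):

    # Build once, tag with the Jenkins build number so any version
    # can be pulled back later, and push to the internal registry.
    docker build -t registry.internal:5000/myapp:${BUILD_NUMBER} .
    docker tag registry.internal:5000/myapp:${BUILD_NUMBER} registry.internal:5000/myapp:latest
    docker push registry.internal:5000/myapp:${BUILD_NUMBER}
    docker push registry.internal:5000/myapp:latest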
Why should I use Docker?
- Big dev machine - could easily run multiple instances of my app without full virtualization (could have an unstable dev, stable dev, SIT, etc.)
- Ease of deployment (use Docker with the Puppet Docker module) and rollback (simply retrieve the previous version from the Docker registry; see the sketch after this list)
- Quick migration and the ability to spawn new instances
- Preparation for easy scaling of different parts of the system (e.g. clustering Elasticsearch)
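
So rollback, as I picture it, is just pulling and starting an earlier tag (names, tag and port made up):

    # Roll back by swapping in the previous known-good build.
    docker pull registry.internal:5000/myapp:42
    docker stop myapp && docker rm myapp
    docker run -d --name myapp -p 8080:8080 registry.internal:5000/myapp:42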
Questions:
- Does this look reasonable?
- I'm thinking about using this Puppet module: https://github.com/garethr/garethr-docker. How would I update my environments with it? Do I have to stop the Docker container, do a docker rm, and then a docker run with the new image? (see the first sketch below)
- We're using Liquibase for database update management. I guess this should run separately from Docker for updates/rollbacks? (see the second sketch below)
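
To make the first question concrete, this is roughly the sequence I imagine per update (registry host, container and image names are made up). My understanding is that garethr-docker's docker::run resource manages containers like this declaratively, but I'm not sure how it handles re-deploying a new image tag:

    # Replace a running container with a newer image.
    docker pull registry.internal:5000/myapp:43
    docker stop myapp
    docker rm myapp
    docker run -d --name myapp -p 8080:8080 registry.internal:5000/myapp:43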
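
And for Liquibase, the rough plan would be a separate deploy step that runs before the new container is started (a sketch; connection details and changelog path are placeholders, and the JDBC driver/classpath would come from liquibase.properties):

    # Apply pending changesets before starting the new app version.
    liquibase --changeLogFile=db/changelog.xml \
        --url=jdbc:postgresql://localhost:5432/myapp \
        --username=myapp --password=secret \
        update

    # Roll back the last changeset if the deployment has to be reverted.
    liquibase --changeLogFile=db/changelog.xml \
        --url=jdbc:postgresql://localhost:5432/myapp \
        --username=myapp --password=secret \
        rollbackCount 1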
Any suggestions welcome, thank you.