I have a CodeBuild project whose buildspec requires a database password to operate. I want this buildspec to be environment-agnostic, but each environment requires a different database password. The password for each environment is stored in the SSM Parameter Store under its own key.
What would be the best approach for passing the database password to the CodeBuild project in this scenario?
Using CodeBuild's env.parameter-store
It seems that the recommended approach is to use CodeBuild's built-in solution (env.parameter-store), but then I would have to load the passwords for every environment and select one of them in the build script:
# Supported Variables
#---------------------
# - ENVIRONMENT
#
version: 0.2
env:
  parameter-store:
    DB_PASSWORD_PROD: "/acme/prod/DB_PASSWORD"
    DB_PASSWORD_STAGE: "/acme/stage/DB_PASSWORD"
    DB_PASSWORD_QA: "/acme/qa/DB_PASSWORD"
phases:
  build:
    commands:
      - |-
        case "${ENVIRONMENT}" in
          "prod")  DB_PASSWORD="${DB_PASSWORD_PROD}" ;;
          "stage") DB_PASSWORD="${DB_PASSWORD_STAGE}" ;;
          "qa")    DB_PASSWORD="${DB_PASSWORD_QA}" ;;
        esac
      - echo "Doing something with \$DB_PASSWORD…"
This requires three requests to SSM and makes the buildspec more complex. Such an approach looks sub-optimal to me.
Maybe there is a way to construct the SSM key from the ENVIRONMENT variable inside env.parameter-store?
Pass SSM parameters from CodePipeline
The other approach would be to pass the password from CodePipeline as an environment variable directly to the CodeBuild project. This would dramatically simplify the buildspec, but is it safe from a security perspective?
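To make the question concrete, this is roughly what I mean, sketched as a CloudFormation fragment for the pipeline's build action (stage, action, and project names are hypothetical):

```yaml
# Hypothetical fragment of a per-environment pipeline definition.
# EnvironmentVariables in the CodeBuild action configuration is a
# JSON string of {name, value, type} objects.
- Name: BuildProd
  Actions:
    - Name: Build
      ActionTypeId:
        Category: Build
        Owner: AWS
        Provider: CodeBuild
        Version: "1"
      Configuration:
        ProjectName: my-build-project  # hypothetical project name
        # The value set here would presumably be visible to anyone who
        # can read the pipeline definition (e.g. via get-pipeline),
        # which is the source of my security concern.
        EnvironmentVariables: '[{"name":"DB_PASSWORD","value":"<resolved password>","type":"PLAINTEXT"}]'
      InputArtifacts:
        - Name: SourceOutput
```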
Get SSM parameters manually in CodeBuild script
Would it be better to call SSM manually from the build script to load the required value?
# Supported Variables
#---------------------
# - ENVIRONMENT
#
version: 0.2
phases:
  build:
    commands:
      - >-
        DB_PASSWORD=$(
        aws ssm get-parameter
        --name "/acme/${ENVIRONMENT}/DB_PASSWORD"
        --with-decryption
        --query "Parameter.Value"
        --output text
        )
      - echo "Doing something with \$DB_PASSWORD…"
Would this approach be more secure?