
My use case is as follows:

  1. GitHub repositories A, B and C have the source code for the Lambda function. A and B are libraries and C is the application that uses them.
  2. GitHub repository X has the AWS CDK source code.
  3. Each time a git commit is made to any of A, B, C or X, application C needs to be rebuilt in a Docker container and the artifacts (the application plus the compiled libraries) packaged into a production Docker image - the usual multi-stage Docker build.
  4. The production image needs to be pushed to ECR.
  5. Any infra changes caused by changes in X must also now be cdk-synthesised and cdk-deployed.
  6. Finally, Lambda needs to use this updated image from ECR, which I suppose will happen automatically after the cdk-deploy above (a rough sketch of how I picture this wiring follows the list).
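
For concreteness, this is roughly how I picture step 6 wired up in the CDK app in X. The construct and repository names below are placeholders of my own:

from aws_cdk import aws_ecr as ecr, aws_lambda as lambda_

# Inside the stack: the function's code is the production image that the
# pipeline pushed to ECR ("application-c" is a made-up repository name;
# from_ecr defaults to the "latest" tag).
repo = ecr.Repository.from_repository_name(self, "AppRepo", "application-c")
handler = lambda_.DockerImageFunction(self, "AppFunction",
    code=lambda_.DockerImageCode.from_ecr(repo),
)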

I want this done via a self-mutating CodePipeline defined in CDK.

I'm unable to figure out the details of how to set up this multi-repository CDK CodePipeline. There is the newer aws_cdk.pipelines module with its synth property. Am I supposed to specify all the build steps sequentially, in the order given above, inside a single ShellStep? A ShellStep only lets me specify one primary input source, whereas I would need all four (A, B, C and X).

E.g. (Python):

aws_cdk.pipelines.CodePipeline(self, 'Pipeline',
    ...,
    synth=aws_cdk.pipelines.ShellStep("<...>",
        input=...,  # only one repo allowed to be specified here
    ),
)
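
What I would like to express is four sources feeding the one pipeline, something like the sketch below. CodePipelineSource.git_hub and the additional_inputs parameter are taken from the aws_cdk.pipelines docs, the repository names are made up, and I am not sure this is the intended way to make commits to A, B and C trigger the pipeline:

repo_a = aws_cdk.pipelines.CodePipelineSource.git_hub("my-org/library-a", "main")
repo_b = aws_cdk.pipelines.CodePipelineSource.git_hub("my-org/library-b", "main")
repo_c = aws_cdk.pipelines.CodePipelineSource.git_hub("my-org/application-c", "main")
repo_x = aws_cdk.pipelines.CodePipelineSource.git_hub("my-org/cdk-infra-x", "main")

aws_cdk.pipelines.CodePipeline(self, 'Pipeline',
    synth=aws_cdk.pipelines.ShellStep("Synth",
        input=repo_x,                # primary input: the CDK app in X
        additional_inputs={          # extra repos, checked out into subdirectories
            "a": repo_a,
            "b": repo_b,
            "c": repo_c,
        },
        commands=["pip install -r requirements.txt", "npx cdk synth"],
    ),
)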

Q1) What is the correct way to achieve the code-build and self-mutating steps mentioned above?

Q2) Since cdk synth and the other builds will likely be run locally before committing to GitHub, I was thinking of creating a Docker image containing the build environment, so that as part of a local build I could build and test all the required repositories. This would give everyone an environment where compilation and artifact generation just work. This dev image would then be pushed to ECR. In that case, CodePipeline would just read the Dockerfile (present in, say, X), build A, B and C inside a container, create the production Docker image with the artifacts as usual (multi-stage Docker build), push it to ECR, then start the dev container again, this time to run cdk synth + cdk deploy, passing in the credentials as environment variables to the container. Will this method work? And how do I obtain the credentials that CodePipeline should pass to the Docker container?
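
For illustration, this is roughly the build step I have in mind. CodeBuildStep, build_environment and role_policy_statements come from the aws_cdk.pipelines and aws_codebuild docs; the commands, the $ACCOUNT/$REGION placeholders and the repository name are mine:

from aws_cdk import aws_codebuild as codebuild, aws_iam as iam, pipelines

build_step = pipelines.CodeBuildStep("BuildAndPush",
    input=repo_x,  # the repo holding the Dockerfile, as defined earlier
    build_environment=codebuild.BuildEnvironment(
        privileged=True,  # needed so the CodeBuild container can run docker build
    ),
    commands=[
        # Log in to ECR with the build role's temporary credentials, run the
        # multi-stage build, and push the production image.
        "aws ecr get-login-password | docker login --username AWS --password-stdin $ACCOUNT.dkr.ecr.$REGION.amazonaws.com",
        "docker build -t $ACCOUNT.dkr.ecr.$REGION.amazonaws.com/application-c:latest .",
        "docker push $ACCOUNT.dkr.ecr.$REGION.amazonaws.com/application-c:latest",
    ],
    role_policy_statements=[
        # Let the build job's IAM role authenticate against ECR (push permissions
        # on the repository would be needed too); no long-lived credentials have
        # to be injected by hand.
        iam.PolicyStatement(actions=["ecr:GetAuthorizationToken"], resources=["*"]),
    ],
)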

KnowSQL
  • Pipelines is not set up to be multi-repo source as far as I am aware. You can, however, try putting an S3 bucket in between. Basically, have a series of individual CodeBuild projects that add their components to a common S3 bucket, each build just being an artifact creator from one repo to the S3 bucket. That bucket could then be the source for your pipeline. It would be able to build your images from there as a single step before your synth – lynkfox Oct 26 '21 at 22:25
  • Also, there is no need to run cdk synth locally: either cdk deploy or the simple synth stage does this automatically as part of its actions. And yes, it would be many different stages to enact everything together, then run your final simple synth. – lynkfox Oct 26 '21 at 22:28
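
For reference, the S3-source arrangement lynkfox describes would look roughly like this in the pipelines API (the bucket and key names are invented; CodePipelineSource.s3 is from the aws_cdk.pipelines docs):

from aws_cdk import aws_s3 as s3, pipelines

# One bucket that the per-repo CodeBuild projects write their artifacts into;
# the pipeline then watches a single object in it as its only source.
artifact_bucket = s3.Bucket.from_bucket_name(self, "Artifacts", "my-artifact-bucket")
source = pipelines.CodePipelineSource.s3(artifact_bucket, "combined/source.zip")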

0 Answers