
I have an ApplicationStack that creates an S3 bucket:

import * as Cdk from "aws-cdk-lib";
import * as S3 from "aws-cdk-lib/aws-s3";
import { Construct } from "constructs";

export class ApplicationStack extends Cdk.Stack {
  public readonly websiteBucket: S3.Bucket;

  constructor(scope: Construct, id: string, props: ApplicationStackProps) {
    super(scope, id, props);

    // Amazon S3 bucket to host the store website artifact
    this.websiteBucket = new S3.Bucket(this, "eCommerceWebsite", {
      // account and region come from the stack itself
      bucketName: `${props.websiteDomain}-${this.account}-${this.region}`,
      websiteIndexDocument: "index.html",
      websiteErrorDocument: "error.html",
      removalPolicy: Cdk.RemovalPolicy.DESTROY,
      autoDeleteObjects: true,
      accessControl: S3.BucketAccessControl.PRIVATE,
      encryption: S3.BucketEncryption.S3_MANAGED,
      publicReadAccess: false,
      blockPublicAccess: S3.BlockPublicAccess.BLOCK_ALL,
    });

    // Create a dummy export so removing a consumer doesn't break the
    // cross-stack reference.
    // https://www.endoflineblog.com/cdk-tips-03-how-to-unblock-cross-stack-references
    this.exportValue(this.websiteBucket.bucketArn);
    ...
  }
}

I also define an ApplicationStage that contains the ApplicationStack above:

export class ApplicationStage extends Cdk.Stage {
  public readonly websiteBucket: S3.Bucket;

  constructor(scope: Construct, id: string, props: ApplicationStageProps) {
    super(scope, id, props);

    const applicationStack = new ApplicationStack(this, `eCommerceDatabaseStack-${props.stageName}`, {
      stageName: props.stageName,
      websiteDomain: props.websiteDomain,
    });
    this.websiteBucket = applicationStack.websiteBucket;
  }

  public getWebsiteBucket() {
    return this.websiteBucket;
  }
}

In my pipeline stack, I want to create an application stage for each stage that needs the website artifact deployed to its corresponding S3 bucket. This is a cross-account CI/CD pipeline, and I have three separate AWS accounts (Alpha, Gamma, Prod) for this website. Whenever I ship code, the pipeline should deploy the new artifact to Alpha, then Gamma, then Prod, so that alpha.ecommerce.com, gamma.ecommerce.com, and ecommerce.com are updated in that order. The problem happens when referencing the S3 bucket in the S3DeployAction below:

export class CodePipelineStack extends Cdk.Stack {
  constructor(scope: Cdk.App, id: string, props: CodePipelineStackProps) {
    super(scope, id, props);
    ...

    // Here the pipelineStageInfoList contains Gamma and Prod information.
    pipelineStageInfoList.forEach((pipelineStage: PipelineStageInfo) => {
      const applicationStage = new ApplicationStage(this, pipelineStage.stageName, {
        stageName: pipelineStage.stageName,
        pipelineName: props.pipelineName,
        websiteDomain: props.websiteDomain,
        env: {
          account: pipelineStage.awsAccount,
          region: pipelineStage.awsRegion,
        },
      });
      const stage = pipeline.addStage(applicationStage);

      // Here is where it goes wrong: the S3DeployAction references the
      // bucket construct that lives inside the stage.
      stage.addAction(
        new codepipeline_actions.S3DeployAction({
          actionName: "Deploy-Website",
          input: outputWebsite,
          bucket: applicationStage.getWebsiteBucket(),
        })
      );
    });
  }
  ...
}

Running cdk synthesize gives the error below:

/Users/yangliu/Projects/eCommerce/eCommerceWebsitePipelineCdk/node_modules/aws-cdk-lib/core/lib/deps.ts:39
    throw new Error(`You cannot add a dependency from '${source.node.path}' (in ${describeStage(sourceStage)}) to '${target.node.path}' (in ${describeStage(targetStage)}): dependency cannot cross stage boundaries`);
          ^
Error: You cannot add a dependency from 'eCommerceWebsitePipelineCdk-CodePipeline-Stack' (in the App) to 'eCommerceWebsitePipelineCdk-CodePipeline-Stack/ALPHA/eCommerceDatabaseStack-ALPHA' (in Stage 'eCommerceWebsitePipelineCdk-CodePipeline-Stack/ALPHA'): dependency cannot cross stage boundaries

I think this means that I'm not passing the S3 bucket reference the right way here.

How do I fix it?

Update with my solution 2022-09-06

Based on matthew-bonig@'s advice, I was able to get this to work.

I now have a separate stack that is deployed to each account to create the S3 bucket and its required CloudFront distribution. My pipeline stack then just focuses on tracking my GitHub repository and deploying the new artifact to the S3 buckets whenever a commit is pushed.
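
Here is a simplified sketch of the idea (the WebsiteInfraStack name and its props are illustrative, not my exact code). The key point is that the bucket name is deterministic, so the pipeline stack can look the bucket up by name instead of holding a construct reference across the stage boundary:

import * as Cdk from "aws-cdk-lib";
import * as S3 from "aws-cdk-lib/aws-s3";
import { Construct } from "constructs";

// Illustrative props for the per-account infrastructure stack.
interface WebsiteInfraStackProps extends Cdk.StackProps {
  websiteDomain: string;
}

// Deployed once to each account (Alpha, Gamma, Prod), outside the pipeline.
export class WebsiteInfraStack extends Cdk.Stack {
  constructor(scope: Construct, id: string, props: WebsiteInfraStackProps) {
    super(scope, id, props);
    new S3.Bucket(this, "eCommerceWebsite", {
      // Deterministic name that the pipeline stack can reconstruct.
      bucketName: `${props.websiteDomain}-${this.account}-${this.region}`,
    });
    // The CloudFront distribution for the bucket is created here too.
  }
}

// In the pipeline stack, reference the bucket by its conventional name:
const websiteBucket = S3.Bucket.fromBucketName(
  this,
  `WebsiteBucket-${pipelineStage.stageName}`,
  `${props.websiteDomain}-${pipelineStage.awsAccount}-${pipelineStage.awsRegion}`
);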

Yang Liu

2 Answers


This usually occurs because your pipeline is running in a different account/region than the stacks created from your pipelineStageInfoList.

If they aren't in the same account/region, then the simplest route is to set the S3 bucket names manually via a property on your 'InfoList' and forgo references like the one you're trying to use. You'd have to deploy everything first, then come back with an update afterwards that sets those values.
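
For example, a sketch of that approach (the websiteBucketName field is illustrative; it would be whatever property you add to your 'InfoList' type):

// Illustrative extension of the stage info type:
interface PipelineStageInfo {
  stageName: string;
  awsAccount: string;
  awsRegion: string;
  websiteBucketName: string; // filled in once the buckets exist
}

// In the pipeline stack, import the bucket by name instead of passing
// the construct across the stage boundary:
stage.addAction(
  new codepipeline_actions.S3DeployAction({
    actionName: "Deploy-Website",
    input: outputWebsite,
    bucket: S3.Bucket.fromBucketName(
      this,
      `WebsiteBucket-${pipelineStage.stageName}`,
      pipelineStage.websiteBucketName
    ),
  })
);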

If they are in the same account/region, then you can try setting the pipeline stack's account/region directly, like you do with the other stacks, and that might help.
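
For example, something like this where the pipeline stack is instantiated (the account/region values are placeholders):

const app = new Cdk.App();
new CodePipelineStack(app, "eCommerceWebsitePipelineCdk-CodePipeline-Stack", {
  pipelineName: "eCommerceWebsitePipeline", // placeholder
  websiteDomain: "ecommerce.com",           // placeholder
  env: {
    account: "111111111111", // placeholder: same account as the stages
    region: "us-west-2",     // placeholder: same region as the stages
  },
});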

Matthew Bonig

In addition to @Matthew's answer, this can also happen if you've inadvertently included application logic in your CDK pipeline logic. When the pipeline is deployed, the pipeline (which is created first) is deployed with references to resources it does not have access to, because the application has not been deployed by the pipeline yet. In other words, the pipeline steps are "out of order."

For instance, this could happen if you've created a post step that includes logic from your stage (like passing a Vpc that gets created in your stage into your post step).

Even doing this implicitly (for example, creating a construct that contains multiple CDK services and passes variables between them, which are later passed into stage and post-stage steps in your pipeline) will throw dependency cannot cross stage boundaries errors.

At first, this seems counterintuitive, because it appears it should be simple (and logical) to pass a resource from your stage into your post-stage step.

But, because of the way CDK Pipelines deploys your pipeline and application, this does not work as you might expect.

Instead, you have three options (that I'm aware of):

  1. Use SSM parameters or secrets to pass data between stages. Note: one potential drawback is that you'll need to deploy those secrets or parameters before you deploy and run your pipeline. (Please let me know if I'm mistaken here – I've experienced this for SSM parameters.)

  2. Export the data using CfnOutput and import it using Fn.importValue (see the sketch after this list).

  3. Create a Custom Resource. If you need a long-running async implementation to perform a task before your next stage can occur, use the Provider framework. Note: as of this writing, the framework only supports 1-hour timeouts. If you need something that runs longer, you'll need to create a StateMachine yourself.
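
For example, a minimal sketch of option 2 (the export-name scheme is illustrative; note that CloudFormation exports can only be imported within the same account and region):

import * as Cdk from "aws-cdk-lib";
import * as S3 from "aws-cdk-lib/aws-s3";

// In the producing stack (inside the stage):
new Cdk.CfnOutput(this, "WebsiteBucketNameOutput", {
  value: this.websiteBucket.bucketName,
  exportName: `WebsiteBucketName-${props.stageName}`, // illustrative naming scheme
});

// In a consuming stack deployed later to the same account/region:
const importedName = Cdk.Fn.importValue(`WebsiteBucketName-${props.stageName}`);
const importedBucket = S3.Bucket.fromBucketName(this, "ImportedWebsiteBucket", importedName);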


bwl1289