7

I currently have a DynamoDB table that's been in use for a couple of years, originally created in the console. It contains lots of valuable data. It uses a stream with a Lambda trigger to periodically send a snapshot of the table to S3 for analytics. The table itself is heavily used by end users to access their data.

I want to migrate my solution into CDK. The options I want to explore:

  1. When you use the Table.fromTableArn construct, you don't get access to the table stream ARN, so it's impossible to attach a Lambda trigger. Is there a way around this?
  2. Is there a way to clone the DynamoDB table contents in my CDK stack so that my copy will start off in exactly the same state as the original? Then I can add and manage the stream etc. in CDK no problem.
  3. It's worth checking my assumption that these are the only 2 options.
monkey
  • Based on my experience working with RDS, I would suggest creating a snapshot/backup of your DynamoDB table and then creating a new DynamoDB table based on that snapshot from AWS CDK. Maybe then you would get more control over the DynamoDB. – elbik Mar 18 '21 at 06:24
  • Just hit this same thing myself, and it looks like you can use `dynamodb.Table.fromTableAttributes` and pass in `tableStreamArn`. Then the construct will know that streaming is enabled. – Ilkka Apr 21 '21 at 11:39
  • Reg. 1: Indeed, `Table.fromTableName` does not give you access to the stream, you have to use `Table.fromTableAttributes` and specify `tableStreamArn`. Here's a comment in the corresponding Github issue that might be helpful: https://github.com/aws/aws-cdk/issues/14046#issuecomment-818556904 – dwytrykus Aug 22 '22 at 10:13
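
Per the comments above, importing the table with `dynamodb.Table.fromTableAttributes` and passing `tableStreamArn` lets the construct see the stream, so a trigger can be attached the usual way. A minimal sketch, assuming CDK v1 imports, a hypothetical `existingHandler` function defined elsewhere in the stack, and ARNs matching the example used in the answer below:

import * as dynamodb from "@aws-cdk/aws-dynamodb";
import * as lambda from "@aws-cdk/aws-lambda";
import { DynamoEventSource } from "@aws-cdk/aws-lambda-event-sources";

// Import the existing table together with its stream ARN so the construct
// knows streaming is enabled.
const table = dynamodb.Table.fromTableAttributes(this, "ImportedTable", {
  tableArn: "arn:aws:dynamodb:us-east-1:110011001100:table/test",
  tableStreamArn:
    "arn:aws:dynamodb:us-east-1:110011001100:table/test/stream/2021-03-18T06:25:21.904",
});

// existingHandler is a hypothetical lambda.Function defined elsewhere in the stack.
existingHandler.addEventSource(
  new DynamoEventSource(table, {
    startingPosition: lambda.StartingPosition.TRIM_HORIZON,
    batchSize: 5,
  })
);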

1 Answer

9
  1. Subscribing a Lambda to an existing DynamoDB table:

We don't need to have the table created within the same stack. We can't use `addEventSource` on the Lambda, but we can use `addEventSourceMapping` and add the necessary policies to the Lambda ourselves, which is what `addEventSource` does behind the scenes.

import * as lambda from "@aws-cdk/aws-lambda";
import * as iam from "@aws-cdk/aws-iam";

// ARN of the existing table's stream (from the console or `aws dynamodb describe-table`).
const streamsArn =
  "arn:aws:dynamodb:us-east-1:110011001100:table/test/stream/2021-03-18T06:25:21.904";

const myLambda = new lambda.Function(this, "my-lambda", {
  code: new lambda.InlineCode(`
  exports.handler = (event, context, callback) => {
    console.log('event', event);
    callback(null, '10');
  };
    `),
  handler: "index.handler",
  runtime: lambda.Runtime.NODEJS_10_X,
});

// Wire the existing stream to the Lambda directly via an event source mapping.
const eventSource = myLambda.addEventSourceMapping("test", {
  eventSourceArn: streamsArn,
  batchSize: 5,
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
  bisectBatchOnError: true,
  retryAttempts: 10,
});

// Grant the Lambda the stream-read permissions that addEventSource would normally add.
myLambda.addToRolePolicy(
  new iam.PolicyStatement({
    actions: [
      "dynamodb:DescribeStream",
      "dynamodb:GetRecords",
      "dynamodb:GetShardIterator",
      "dynamodb:ListStreams",
    ],
    resources: [streamsArn],
  })
);
  2. Importing an existing DynamoDB table into CDK:

We re-write the DynamoDB table with the same attributes in CDK, run cdk synth to generate the CloudFormation template, and use CloudFormation resource import to bring the existing resource into the stack, as sketched below. Here is an SO answer.
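
A minimal sketch of what the re-declared table might look like before running cdk synth; the partition key, billing mode, and stream view type here are assumptions and must be adjusted to match the live table exactly, and the resource needs a Retain deletion policy for CloudFormation's resource import:

import * as cdk from "@aws-cdk/core";
import * as dynamodb from "@aws-cdk/aws-dynamodb";

// Re-declare the existing table so the synthesized template matches the live resource.
const table = new dynamodb.Table(this, "ExistingTable", {
  tableName: "test", // must match the name of the table being imported
  partitionKey: { name: "id", type: dynamodb.AttributeType.STRING }, // assumed key schema
  billingMode: dynamodb.BillingMode.PAY_PER_REQUEST, // assumed
  stream: dynamodb.StreamViewType.NEW_AND_OLD_IMAGES, // so CDK manages the stream afterwards
  removalPolicy: cdk.RemovalPolicy.RETAIN, // keeps the real data safe and satisfies the import requirement
});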

Balu Vyamajala
  • Awesome! So no need to touch the DynamoDB. That's so well described, even I can understand it. – monkey Mar 18 '21 at 07:47