
I have a DynamoDB table in Account A and an AWS Lambda function in Account B. I need to trigger the Lambda function when there are changes in the DynamoDB table.

I came across aws lambda - It is possible to Access AWS DynamoDB streams accross accounts? - Stack Overflow which says it is not possible. But again I found amazon web services - Cross account role for an AWS Lambda function - Stack Overflow which says it is possible. I am not sure which one is correct.

Has somebody tried the same scenario as I am trying to achieve?

John Rotenstein
user2803056
  • I answered that second link and I would have tested it so I'd say, yes, it does work. Have you tried it? What problems are you experiencing? – John Rotenstein Jul 30 '18 at 21:40
  • Please do not take me wrong. I just wanted to confirm before I spend some time on it. The question you answered was about interaction between Lambda and Route 53, and the other link points to official AWS documentation which says it is not possible. That is why I posted this question. – user2803056 Jul 31 '18 at 17:41
  • Ah! I just re-read the 2nd link. It is _not_ related to your scenario. It is offering a way that a Lambda function can access resources in a different account. It does _not_ explain how to have DynamoDB trigger a Lambda function in a different account. – John Rotenstein Aug 01 '18 at 03:36

2 Answers


The first link is correct: triggers from stream-based events to Lambda are limited to the same AWS account and the same region.


However, there is a way to achieve your goal.


Pre-requisite: I assume you already have a DynamoDB (DDB) table (let's call it Table_A) created in AWS account A, and a processing Lambda (let's call it Processing_Lambda) in AWS account B.


Steps:

  1. Create a new proxy Lambda (let's call it Proxy_Lambda) in account A. This Lambda will broadcast the events that it processes.
  2. Enable a DynamoDB stream on table Table_A. This stream will contain all update/insert/delete events on the table.
  3. Create a Lambda trigger for Proxy_Lambda to read events from Table_A's stream.
  4. Create a new SNS topic (let's call it AuditEventFromTableA) in AWS account A.
  5. Add code in Proxy_Lambda to publish the events read from the stream to the SNS topic AuditEventFromTableA.


  6. Create an AWS SQS queue (can also be a FIFO queue if your use case requires sequential events) in AWS account B. Let's call this queue AuditEventQueue-TableA-AccountA.

  7. Subscribe the SQS queue AuditEventQueue-TableA-AccountA in AWS account B to the SNS topic AuditEventFromTableA in AWS account A. This allows all the SNS events from account A to be received in the SQS queue of account B.
  8. Create a trigger for Processing_Lambda in AWS account B to consume messages from the SQS queue AuditEventQueue-TableA-AccountA.
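Steps 1–5 above (the Proxy_Lambda side) can be sketched roughly like this. The topic ARN, environment variable name, and message fields are assumptions for illustration, not part of the answer:

```python
import json
import os

# ARN of the SNS topic created in step 4 (assumed to arrive via an
# environment variable; the name and region are illustrative).
TOPIC_ARN = os.environ.get(
    "AUDIT_TOPIC_ARN",
    "arn:aws:sns:us-east-1:111111111111:AuditEventFromTableA")


def record_to_message(record):
    """Serialize one DynamoDB stream record into an SNS message body."""
    ddb = record.get("dynamodb", {})
    return json.dumps({
        "eventName": record.get("eventName"),  # INSERT / MODIFY / REMOVE
        "keys": ddb.get("Keys"),
        "newImage": ddb.get("NewImage"),
    })


def handler(event, context):
    """Triggered by Table_A's stream (step 3); fans records out to SNS (step 5)."""
    import boto3  # imported lazily so record_to_message stays locally testable
    sns = boto3.client("sns")
    records = event.get("Records", [])
    for record in records:
        sns.publish(TopicArn=TOPIC_ARN, Message=record_to_message(record))
    return {"published": len(records)}
```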

Result: This way you will be able to trigger the Lambda in account B based on changes in the DynamoDB table of account A.
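Step 7 is the cross-account piece: the SQS queue in account B needs a queue policy that allows the SNS topic in account A to deliver to it. A minimal sketch of that policy (account IDs and region are placeholders):

```python
import json

# Placeholder account IDs -- substitute your own.
ACCOUNT_A = "111111111111"
ACCOUNT_B = "222222222222"
TOPIC_ARN = f"arn:aws:sns:us-east-1:{ACCOUNT_A}:AuditEventFromTableA"
QUEUE_ARN = f"arn:aws:sqs:us-east-1:{ACCOUNT_B}:AuditEventQueue-TableA-AccountA"


def queue_policy():
    """Queue policy letting the SNS topic in account A send to the queue in B."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "sns.amazonaws.com"},
            "Action": "sqs:SendMessage",
            "Resource": QUEUE_ARN,
            # Restrict delivery to this specific topic.
            "Condition": {"ArnEquals": {"aws:SourceArn": TOPIC_ARN}},
        }],
    })

# The subscription itself (step 7) would then be created with:
#   boto3.client("sns").subscribe(TopicArn=TOPIC_ARN, Protocol="sqs",
#                                 Endpoint=QUEUE_ARN)
```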

Note: if your use case demands strict ordering of events, you may prefer publishing update events from Proxy_Lambda directly to an AWS Kinesis stream in account B instead of the SNS-SQS path.
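For that Kinesis variant, using the item's DynamoDB key as the Kinesis partition key keeps events for the same item in order within a shard. A rough sketch (the stream name is an assumption):

```python
import json


def kinesis_record(stream_record, stream_name="AuditStream-TableA"):
    """Build the arguments for a Kinesis put_record call from a DDB stream record.

    Using the item's key as the partition key routes all events for the same
    item to the same shard, preserving their relative order.
    """
    keys = stream_record.get("dynamodb", {}).get("Keys", {})
    partition_key = json.dumps(keys, sort_keys=True)
    return {
        "StreamName": stream_name,
        "PartitionKey": partition_key,
        "Data": json.dumps(stream_record).encode("utf-8"),
    }

# Inside Proxy_Lambda, after assuming a role in account B that is allowed to
# write to the stream (via sts.assume_role), you would call:
#   kinesis.put_record(**kinesis_record(record))
```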

mango
  • Why is SQS needed? Can I simply attach the Lambda to the SNS topic? – Wilfred Wee Aug 06 '18 at 22:47
  • SNS is located in AWS Account **A** and SQS is located in AWS account **B**. And due to the existing limitation of the way Lambda works, you will not be able to create a trigger for lambda from any different AWS account – mango Aug 08 '18 at 00:56
  • Is this still the best way in 2021 to trigger a lambda function for cross-account ddb stream? – bappak Apr 27 '21 at 00:17

Simple! Create a proxy Lambda A in account A and permit it to invoke a target Lambda B in account B. The DDB stream triggers Lambda A, and Lambda A invokes Lambda B.
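This amounts to a cross-account Invoke: Lambda B's resource-based policy must allow account A (e.g. via `aws lambda add-permission`), and Lambda A's execution role needs `lambda:InvokeFunction` on B's ARN. A minimal sketch of Lambda A (the target ARN is a placeholder):

```python
import json

# Placeholder ARN of the target function in account B.
TARGET_ARN = "arn:aws:lambda:us-east-1:222222222222:function:Lambda_B"


def build_payload(event):
    """Forward the raw DynamoDB stream records to Lambda B as JSON."""
    return json.dumps({"records": event.get("Records", [])}).encode("utf-8")


def handler(event, context):
    import boto3  # imported lazily so build_payload stays locally testable
    boto3.client("lambda").invoke(
        FunctionName=TARGET_ARN,
        InvocationType="Event",  # asynchronous fire-and-forget
        Payload=build_payload(event),
    )
```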

Freeman