
I have a DynamoDB table, and whenever a new record is added I want to archive the old data to S3, so I thought I could use AWS Lambda. The Lambda function will receive the record that was newly added/modified, but I also want to pass it an additional parameter: the S3 path to which the record should be uploaded.

One way is to store whatever I want to pass to the Lambda function in another table or in S3. But this parameter changes as each record is inserted into the main table, so I can't reliably read it from my Lambda function: by the time the function executes for the first inserted record, a few more records would already have been inserted.

Is there a way to pass parameters to the Lambda function?

P.S.: I want to execute the Lambda asynchronously.

Thanks...

Thiyagu

2 Answers


Why not add this parameter (the S3 path) to your DynamoDB table itself? Store it on the new row as it is inserted, in the same table the Lambda is listening on, rather than in a separate table.
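A minimal sketch of this approach, assuming a hypothetical `s3_path` attribute stored on each item: since the path travels with the row, the Lambda can read it straight out of the stream record's `NewImage` without any second lookup.

```python
# Sketch: read a per-record "s3_path" attribute (hypothetical name)
# out of a DynamoDB Streams record inside the Lambda function.

def s3_path_from_record(record):
    """Return the s3_path attribute of the inserted/modified item, or None."""
    new_image = record.get("dynamodb", {}).get("NewImage", {})
    attr = new_image.get("s3_path")            # DynamoDB attribute-value map
    return attr.get("S") if attr else None     # "S" = string-typed attribute


# Example stream record, shaped like what Lambda receives:
sample = {
    "eventName": "INSERT",
    "dynamodb": {
        "NewImage": {
            "id": {"S": "item-1"},
            "s3_path": {"S": "archive/item-1.json"},
        }
    },
}
print(s3_path_from_record(sample))  # archive/item-1.json
```

Because the value is part of the item, it stays correct per record even when many inserts happen before the function runs.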

Eyal Ch

You can now accomplish this by:

  1. Attach a DynamoDB Stream to your DynamoDB table with a view type of NEW_AND_OLD_IMAGES
  2. Create an event source on your Lambda function to read the DynamoDB stream
  3. Add an environment variable to your Lambda function to indicate where in S3 to write the data

You'll still have to derive the details of where to store the record from the record itself, but you can indicate the bucket or table name in the environment variable.
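The steps above can be sketched as a handler along these lines. The `ARCHIVE_BUCKET` environment variable name and the key layout are assumptions, not anything from the question; the boto3 upload is kept separate from the key-building logic so the latter stands on its own.

```python
import json
import os


def archive_key(record):
    """Derive an S3 key from the stream record itself (layout is an assumption)."""
    keys = record["dynamodb"]["Keys"]
    # Flatten the attribute-value map, e.g. {"id": {"S": "item-1"}} -> "item-1"
    key_part = "-".join(next(iter(v.values())) for v in keys.values())
    return "archive/{}.json".format(key_part)


def handler(event, context):
    import boto3                              # imported lazily; needs the AWS runtime
    bucket = os.environ["ARCHIVE_BUCKET"]     # set in the function's configuration
    s3 = boto3.client("s3")
    for record in event["Records"]:
        if record["eventName"] not in ("INSERT", "MODIFY"):
            continue
        # OldImage is only present with the NEW_AND_OLD_IMAGES view type
        old_image = record["dynamodb"].get("OldImage")
        if old_image is None:
            continue                          # first insert: no old data to archive
        s3.put_object(
            Bucket=bucket,
            Key=archive_key(record),
            Body=json.dumps(old_image),
        )
```

With this layout the bucket comes from the environment variable while the object key is derived per record, which matches the split described above.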

Ryan Gross