4

I have set up a DynamoDB table, enabled a stream on it, and enabled TTL (time to live) on one of its attributes. I also have a Lambda function that pulls entries from the DynamoDB stream.

Now whether I add, delete, or edit an item, or the TTL expires, all of these cause a Lambda invocation.

I am not interested in add or edit events; I only want the stream to receive the deleted and TTL-expired entries. Is this possible?

Also, I can definitely put a check in my Lambda code and process the record only when the event type is "delete", but the Lambda invocations for add and edit events will still take place regardless. Kindly guide.

Unbreakable

3 Answers

2

You can't control the DynamoDB stream; it will always post events for any change that happens to your table. However, you can control the Lambda invocation by adding the "FilterCriteria" property to your "EventSourceMapping".

https://docs.aws.amazon.com/lambda/latest/dg/invocation-eventfiltering.html

    FilterCriteria: {
      "Filters": [
        {
          "Pattern": "{ \"userIdentity\": { \"type\": [ \"Service\" ], \"principalId\": [ \"dynamodb.amazonaws.com\" ] } }"
        }
      ]
    }

With the above filter, your Lambda will only be invoked when a TTL expiry event is posted to the DynamoDB stream.
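
For reference, here is a minimal sketch (not part of the original answer) of attaching that filter to an existing event source mapping with the AWS SDK for JavaScript v3. The mapping UUID and region are placeholders you would replace with your own values; the same filter can equally be set from the console, the AWS CLI, or an IaC template.

    const { LambdaClient, UpdateEventSourceMappingCommand } = require('@aws-sdk/client-lambda')

    const lambda = new LambdaClient({ region: 'us-west-2' })

    // Only records written by the DynamoDB TTL service principal will reach the function.
    const ttlOnlyPattern = JSON.stringify({
      userIdentity: { type: ['Service'], principalId: ['dynamodb.amazonaws.com'] }
    })

    async function applyTtlOnlyFilter () {
      // Placeholder UUID: the identifier of your stream-to-Lambda event source mapping.
      await lambda.send(new UpdateEventSourceMappingCommand({
        UUID: 'YOUR-EVENT-SOURCE-MAPPING-UUID',
        FilterCriteria: { Filters: [{ Pattern: ttlOnlyPattern }] }
      }))
    }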

m4c_4rthur
1

Sadly, you can't make a DynamoDB stream deliver only deletions or expirations of items. Everything is streamed, and it's up to your Lambda function to filter for the events of interest.

For TTL-expired items, your function needs to check for:


        "userIdentity":{
            "type":"Service",
            "principalId":"dynamodb.amazonaws.com"
        }

An alternative is to have a second table containing only TTL markers. This can be useful if your primary table experiences a lot of updates and modifications. That way, the stream on your second table would invoke your function only twice per item, i.e. on creation and on TTL expiration, rather than for all the updates you are not interested in.
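
To illustrate that pattern, here is a minimal sketch (an assumption, not code from the answer) of writing a marker item into a hypothetical `ttl-markers` table; the table name, the `pk` key, and the `expiresAt` TTL attribute are all placeholders for whatever your second table actually uses.

    const { DynamoDBClient } = require('@aws-sdk/client-dynamodb')
    const { DynamoDBDocumentClient, PutCommand } = require('@aws-sdk/lib-dynamodb')

    const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({ region: 'us-west-2' }))

    // Write a marker that mirrors the primary item's key and carries the TTL timestamp.
    // 'ttl-markers', 'pk' and 'expiresAt' are placeholders; 'expiresAt' must be the
    // attribute configured as the table's TTL attribute (epoch seconds).
    async function putTtlMarker (primaryKey, expiresAtEpochSeconds) {
      await ddb.send(new PutCommand({
        TableName: 'ttl-markers',
        Item: { pk: primaryKey, expiresAt: expiresAtEpochSeconds }
      }))
    }

The marker table's own stream then only ever sees the INSERT and the TTL REMOVE for each item.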

Marcin
  • Good idea, but Lambda invocations will still take place, right? All my tables are connected to a single Lambda. I will make sure that anything apart from delete and expiry is a no-op, but everything will still count toward my Lambda invocations. – Unbreakable Oct 29 '20 at 15:40
  • 1
    @Unbreakable Yes, your function will be invoked for every change in the table. But you have 1 million free invocations per month, so depending on your workload, that may be enough for your use case and it will not cost you anything? – Marcin Oct 29 '20 at 22:16
  • One last question: how can I distinguish between a manually deleted item and an item that was deleted via TTL? Is there some flag I can check? – Unbreakable Oct 30 '20 at 14:59
  • 1
    @Unbreakable The TTL-deleted records will have `userIdentity` set as given in my answer, with `"principalId":"dynamodb.amazonaws.com"`. – Marcin Oct 30 '20 at 22:09
0

I haven't found a way to trigger Lambdas only on TTL expiry, but you can recognize a TTL deletion from the event record payload you receive by checking the eventName and userIdentity properties, as said above:

 Records: [
    {
      eventID: '36df5f15e7429cc986999f68349e6fef',
      eventName: 'REMOVE',
      eventVersion: '1.1',
      eventSource: 'aws:dynamodb',
      awsRegion: 'us-west-2',
      dynamodb: [Object],
      userIdentity: [Object],
      eventSourceARN: 'arn:aws:dynamodb:us-west-2:XXXXX:table/table-with-ttl/stream/2021-09-27T17:03:42.812'
    }
  ]

const { unmarshall } = require('@aws-sdk/util-dynamodb') // assuming AWS SDK v3; converts the stream's AttributeValue map to a plain object

exports.handler = async (event) => {
  for (const record of event.Records) {
    if (record.eventName === 'REMOVE') {
      // Check if deletion was done manually or triggered by the TTL timer.
      // https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/time-to-live-ttl-streams.html
      if (
        record.userIdentity &&
        record.userIdentity.type === 'Service' &&
        record.userIdentity.principalId === 'dynamodb.amazonaws.com'
      ) {
        // OldImage is only present if the stream uses OLD_IMAGE or NEW_AND_OLD_IMAGES.
        const itemThatWasRemoved = unmarshall(record.dynamodb.OldImage)
        // Your code that only runs for TTL removals here
      }
    }
  }
}
lele