
I am trying to have events in a DynamoDB table trigger a Lambda function that moves the events into Kinesis Data Firehose. Firehose then batches the records and delivers them to an S3 bucket. The Lambda function I am using as the trigger fails.

This is the Lambda code for the trigger:


```
import json
import boto3

firehose_client = boto3.client('firehose')

def lambda_handler(event, context):
    # Build one newline-delimited JSON string from all records in the batch
    resultString = ""
    for record in event['Records']:
        parsedRecord = parseRawRecord(record['dynamodb'])
        resultString = resultString + json.dumps(parsedRecord) + "\n"
    print(resultString)
    # Send the whole batch to Firehose as a single record
    response = firehose_client.put_record(
        DeliveryStreamName="OrdersAuditFirehose",
        Record={
            'Data': resultString
        }
    )

def parseRawRecord(record):
    result = {}
    result["orderId"] = record['NewImage']['orderId']['S']
    result["state"] = record['NewImage']['state']['S']
    result["lastUpdatedDate"] = record['NewImage']['lastUpdatedDate']['N']
    return result
```

[Screenshots: DynamoDB trigger, DynamoDB table, Kinesis Data Firehose. Edit: CloudWatch log]

The goal is for the Lambda function, triggered by events in DynamoDB, to move those events into Kinesis.

Edit 2: [CloudWatch log screenshot]

BRE-ZUSES

1 Answer


I'm going to post this as my initial answer, and will edit when you return with the exception from your Lambda logs.

Edit

The issue is that you are looking for a key in a dict which does not exist:

result["lastUpdatedDate"] = record['NewImage']['lastUpdatedDate']['N']

lastUpdatedDate is not a key inside record['NewImage'], so the lookup raises a KeyError. It may be useful to check the contents of the dict by logging it:

print(record['NewImage'])
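
A defensive version of the parsing function makes this kind of mismatch visible instead of crashing the whole batch. This is only a sketch, not necessarily your exact fix; the attribute names orderId, state and lastUpdatedDate are taken from your code and must match what the table actually stores:

```
import json

def parse_raw_record(record):
    # NewImage is only present for INSERT/MODIFY events when the stream's
    # view type includes the new image.
    new_image = record.get('NewImage', {})
    # Log the raw image so CloudWatch shows the real attribute names.
    print(json.dumps(new_image))

    # .get() returns None instead of raising KeyError on a missing attribute.
    return {
        "orderId": new_image.get('orderId', {}).get('S'),
        "state": new_image.get('state', {}).get('S'),
        "lastUpdatedDate": new_image.get('lastUpdatedDate', {}).get('N'),
    }
```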


There is no need to use Lambda when you want integration between DynamoDB and Firehose. Instead of DynamoDB Streams, you can use Kinesis Data Streams, which integrates directly with Firehose without the need for extra code.

DynamoDB -> Kinesis Stream -> Kinesis Firehose -> S3

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/kds.html
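
Turning on that integration is a one-time call to enable Kinesis streaming for the table. Here is a minimal boto3 sketch; the table name and stream ARN are placeholders for your own resources:

```
import boto3

dynamodb = boto3.client('dynamodb')

# Placeholder names: substitute your table and an existing Kinesis Data Stream.
response = dynamodb.enable_kinesis_streaming_destination(
    TableName='Orders',
    StreamArn='arn:aws:kinesis:us-east-1:123456789012:stream/OrdersAuditStream',
)
# The destination activates asynchronously; status starts as ENABLING.
print(response['DestinationStatus'])
```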

If you really want to use DynamoDB Streams, then you can also avoid the Lambda code by using EventBridge Pipes:

DynamoDB -> EventBridge Pipe -> Kinesis Firehose -> S3

https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-pipes.html#pipes-targets
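
A pipe like this can be created with a single boto3 call. Treat the names, ARNs and role below as placeholders, and check the linked docs for the exact parameters your setup needs:

```
import boto3

pipes = boto3.client('pipes')

# All ARNs and names here are placeholders for your own resources.
pipes.create_pipe(
    Name='OrdersAuditPipe',
    RoleArn='arn:aws:iam::123456789012:role/OrdersAuditPipeRole',
    Source='arn:aws:dynamodb:us-east-1:123456789012:table/Orders/stream/2023-01-01T00:00:00.000',
    SourceParameters={
        'DynamoDBStreamParameters': {
            'StartingPosition': 'LATEST',
        }
    },
    Target='arn:aws:firehose:us-east-1:123456789012:deliverystream/OrdersAuditFirehose',
)
```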

Both of the above solutions result in no-code delivery of DynamoDB events to Firehose.

Leeroy Hannigan
  • Thanks for getting back with me, Lee. I have only been using AWS for a couple of weeks, and the point of me doing these exercises is to get to know how to use AWS. I do not know how to access the logs. If you tell me how, I will get the information for you. Thank you. – BRE-ZUSES Jan 01 '23 at 18:36
  • This should help: https://docs.aws.amazon.com/lambda/latest/dg/monitoring-cloudwatchlogs.html – Leeroy Hannigan Jan 01 '23 at 18:55
  • I think I found what you were talking about. Please see my other edit. Thanks. – BRE-ZUSES Jan 01 '23 at 19:34
  • Thanks for the detailed answer. I will come back to this post the next time I design something that moves data around. Something strange did happen, and I am wondering if you have ever experienced anything like this before. I went back and fixed the key name (attribute name in DynamoDB), but the Lambda kept failing. Then I came back a day later and, without doing a thing, it was working. Do you know why that is? – BRE-ZUSES Jan 05 '23 at 01:38
  • Yes. Because you have not set the number of retries on your event source mapping, Lambda will keep retrying the broken item until the stream record expires after 24 hours, which is known as a poison pill. Configure a set number of retries and a dead-letter queue to avoid this; see the sketch below. – Leeroy Hannigan Jan 05 '23 at 08:43
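
A minimal boto3 sketch of that retry/DLQ configuration; the mapping UUID and queue ARN are placeholders, and the on-failure destination for stream sources is set via DestinationConfig:

```
import boto3

lambda_client = boto3.client('lambda')

# Placeholder UUID and ARN: use your own event source mapping and SQS queue.
lambda_client.update_event_source_mapping(
    UUID='aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee',
    MaximumRetryAttempts=2,           # stop retrying a failing batch after 2 attempts
    BisectBatchOnFunctionError=True,  # split the batch to isolate the bad record
    DestinationConfig={
        'OnFailure': {
            'Destination': 'arn:aws:sqs:us-east-1:123456789012:OrdersAuditDLQ'
        }
    },
)
```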