
Purpose: I created an SQS queue in the hope of storing bad data in a dead-letter queue. My goal is to connect SQS to SNS so that subscribers can be notified of the bad data stored in the dead-letter queue.

However, I now understand that data cannot be stored permanently in SQS because of its retention period. I also understand that SNS is push-based, not pull-based, so it cannot poll the SQS messages and email them to subscribers.

So my question is: what is a way to keep the messages from SQS permanently? My goal is to store those messages somewhere safe for future use. I have also looked into storing the messages in S3.

ssji
    Well, S3 or DynamoDB or any persistent storage should work. But you seem to know that already. Is there a particular question here? – jarmod Aug 02 '22 at 21:20
  • If you want to pull from SQS and publish to SNS then see [here](https://stackoverflow.com/questions/48798751/can-aws-sqs-publish-to-sns-or-is-polling-sqs-required). – jarmod Aug 02 '22 at 21:26
  • @jarmod Thank you! I've been confused about how to send those SQS messages into S3. After googling, I got a bit confused as well. So, I wanted to confirm on here whether this is a possible solution. Also, I'll try that link you sent – ssji Aug 02 '22 at 21:40
  • There's no AWS-native capability to move messages from SQS to S3 afaik, but if I'm reading your question correctly, it sounds like your messages may not originally be in SQS at all. Is that right? They're somewhere else but you were thinking of sending them to SQS for persistence but that won't work so now you're thinking of sending them to SQS and from there either to S3 or to subscribers via SNS. Yes, you could do both quite easily but you could also just send them to S3 directly from wherever they are currently (in your app?) at the point you detect they are bad. – jarmod Aug 03 '22 at 00:11
  • @jarmod Okay, that's interesting! My messages are currently only in SQS, however I am trying to move the messages somewhere (S3 or SNS). My goal is to treat the S3 or SNS notification as a dead letter queue. – ssji Aug 03 '22 at 03:25
  • @jarmod But now I'm thinking that maybe I shouldn't use SQS for the purpose of a dead letter queue? Would it be easier to directly send the bad data to an S3 bucket? I assumed it was best practice to use an AWS service such as SQS to keep the bad data. – ssji Aug 03 '22 at 03:32
  • Note that DLQ is a supported feature of SQS. You don't typically explicitly create a queue for the purpose of making a DLQ (though you can, if you want). Instead you simply configure an existing SQS queue to have an associated DLQ and then messages that fail to be processed N times from the regular SQS queue will be automatically moved to the DLQ. The max retention period for SQS queues is 14 days. Wherever your messages are currently (in an SQS queue or not) you can write code to move them to S3 (or elsewhere). The earlier link included some options. – jarmod Aug 03 '22 at 13:05
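The redrive behavior described in the comments above can be sketched with boto3. This is a minimal sketch, not a full setup: the queue URL and DLQ ARN are placeholders, while `RedrivePolicy`, `deadLetterTargetArn`, and `maxReceiveCount` are the actual SQS attribute names.

```python
import json


def redrive_policy(dlq_arn: str, max_receives: int = 5) -> str:
    """Build the value for a queue's RedrivePolicy attribute: after
    max_receives failed receives, SQS automatically moves the message
    to the dead-letter queue."""
    return json.dumps({
        "deadLetterTargetArn": dlq_arn,
        "maxReceiveCount": max_receives,
    })


def attach_dlq(queue_url: str, dlq_arn: str) -> None:
    """Configure an existing queue as the DLQ of another queue."""
    import boto3  # imported here so the policy helper works without the SDK
    sqs = boto3.client("sqs")
    sqs.set_queue_attributes(
        QueueUrl=queue_url,
        Attributes={"RedrivePolicy": redrive_policy(dlq_arn)},
    )
```

Note that this only redirects failed messages; the DLQ itself is still subject to the 14-day maximum retention period mentioned above.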

1 Answer


You can permanently persist all messages sent to an SQS queue by adding an SNS topic in front of it that has an additional subscription: a Kinesis Data Firehose subscription with an S3 bucket configured as its destination.

That way, any message published to the SNS topic is sent both to the S3 bucket and to the SQS queue. Your SQS DLQ would work the same way as usual, and you could provide a way for clients to query failed messages from the S3 bucket at a later date (potentially much later than the maximum retention period SQS allows).
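A minimal boto3 sketch of that fan-out follows. All ARNs are placeholders; the `firehose` protocol and the `SubscriptionRoleArn` attribute are the documented way to subscribe a Kinesis Data Firehose delivery stream to an SNS topic, and the role must allow SNS to write into the stream.

```python
def fanout_subscriptions(topic_arn, queue_arn, stream_arn, role_arn):
    """Describe the two subscriptions: one delivering to the SQS queue,
    one delivering to a Firehose stream that writes to S3."""
    return [
        {"TopicArn": topic_arn, "Protocol": "sqs", "Endpoint": queue_arn},
        {
            "TopicArn": topic_arn,
            "Protocol": "firehose",
            "Endpoint": stream_arn,
            # IAM role that lets SNS put records into the Firehose stream.
            "Attributes": {"SubscriptionRoleArn": role_arn},
        },
    ]


def create_fanout(topic_arn, queue_arn, stream_arn, role_arn):
    import boto3  # imported here so the helper above is testable offline
    sns = boto3.client("sns")
    for kwargs in fanout_subscriptions(topic_arn, queue_arn, stream_arn, role_arn):
        sns.subscribe(ReturnSubscriptionArn=True, **kwargs)
```

The Firehose delivery stream itself (with the S3 bucket as its destination) would be created separately, e.g. in the console or via the Firehose API.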

See the links below for more information:

bjrnt