I have an AWS Lambda function that subscribes to a DynamoDB stream and is configured with an SQS dead letter queue (DLQ). I can see that the correct queue is configured in the Management Console. I also took care to give my function permissions…
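For reference, a minimal boto3 sketch of attaching an SQS DLQ to a function and granting the execution role sqs:SendMessage on it; the function name, queue ARN, and role name below are placeholders. It is worth noting that the function-level DLQ only catches failures from asynchronous invocations; failures from a stream event source mapping are routed to its on-failure destination instead.

import json
import boto3

FUNCTION_NAME = "my-stream-consumer"                    # placeholder
DLQ_ARN = "arn:aws:sqs:us-east-1:123456789012:my-dlq"   # placeholder
ROLE_NAME = "my-stream-consumer-role"                   # placeholder

lambda_client = boto3.client("lambda")
iam = boto3.client("iam")

# Point the function's dead-letter config at the SQS queue.
lambda_client.update_function_configuration(
    FunctionName=FUNCTION_NAME,
    DeadLetterConfig={"TargetArn": DLQ_ARN},
)

# The execution role also needs sqs:SendMessage on that queue.
iam.put_role_policy(
    RoleName=ROLE_NAME,
    PolicyName="allow-dlq-send",
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{"Effect": "Allow", "Action": "sqs:SendMessage", "Resource": DLQ_ARN}],
    }),
)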
My use case is as follows:
I need to push some data into an AWS SQS queue using the Java SDK with the help of an IAM role (not a credentials provider implementation).
Is there any way to do that?
Thanks in advance for any help.
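In practice, the default credentials chain picks up an attached IAM role automatically when no credentials are configured in code. A minimal sketch of the idea, illustrated with boto3 rather than the Java SDK; the queue URL and region are placeholders:

import boto3

# No access keys are passed: the SDK's default credential chain resolves the
# attached IAM role (EC2 instance profile, ECS task role, Lambda execution role, ...).
sqs = boto3.client("sqs", region_name="us-east-1")  # region is a placeholder

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"  # placeholder
sqs.send_message(QueueUrl=QUEUE_URL, MessageBody="hello from an IAM role")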
I have an SQS queue from which messages are read by multiple hosts. I want to run a job (business logic) after all the messages in the queue have been processed.
How can I check that the queue is empty?
Yes, I can check for…
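One common approach, sketched here with boto3 and a hypothetical queue URL, is to poll GetQueueAttributes and treat the queue as drained when both the visible and the in-flight counts are zero; both numbers are approximate, so checking them several times in a row is usual before declaring the queue empty.

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"  # placeholder

def queue_looks_empty(queue_url: str) -> bool:
    # Both attributes are approximate, so a single zero reading is only a hint.
    attrs = sqs.get_queue_attributes(
        QueueUrl=queue_url,
        AttributeNames=[
            "ApproximateNumberOfMessages",
            "ApproximateNumberOfMessagesNotVisible",
        ],
    )["Attributes"]
    return (
        int(attrs["ApproximateNumberOfMessages"]) == 0
        and int(attrs["ApproximateNumberOfMessagesNotVisible"]) == 0
    )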
My application sends messages to an AWS SQS queue for jobs that need some sort of background processing. My processing daemon receives and processes the messages like this:
$result = $sqsClient->receiveMessage([
    'QueueUrl' =>…
I have a small set of messages in an SQS queue that are not deleted even though the deletion request sent to the AWS endpoint returns a 200 response. The messages are processed by my application fine, and the deletion request is sent fine…
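When deleting in batches, a 200 at the HTTP level does not mean every entry succeeded: individual deletions can still fail (for example because a receipt handle has gone stale). One way to surface that, sketched with boto3 and placeholder names, is to inspect the per-entry Failed list of DeleteMessageBatch:

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"  # placeholder

def delete_batch(messages):
    # messages: the list returned by receive_message()["Messages"]
    entries = [
        {"Id": str(i), "ReceiptHandle": m["ReceiptHandle"]}
        for i, m in enumerate(messages)
    ]
    resp = sqs.delete_message_batch(QueueUrl=QUEUE_URL, Entries=entries)
    # The call can succeed overall while individual entries fail; log those explicitly.
    for failure in resp.get("Failed", []):
        print("delete failed:", failure["Id"], failure["Code"], failure["Message"])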
I am looking for a sample .NET application that continuously checks Amazon SQS for new messages and, when one is found, performs an action and removes the message from the queue.
My goal is to have an app running on EC2 that watches my SQS queue for new…
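The question asks for .NET; as a language-agnostic illustration of the same loop, here is a minimal boto3 poller (the queue URL and the per-message action are placeholders) that uses long polling, acts on each message, and deletes it only after the action succeeds:

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"  # placeholder

def perform_action(body: str) -> None:
    # Placeholder for whatever the worker should actually do.
    print("processing:", body)

while True:
    # Long polling: wait up to 20 seconds per request instead of busy-polling.
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20
    )
    for message in resp.get("Messages", []):
        perform_action(message["Body"])
        # Delete only after the action succeeds; otherwise the message becomes
        # visible again after the visibility timeout and is retried.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])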
I could have sworn that there was an easy way to change the batch size of an SQS queue that is configured as a Lambda trigger, but as of July 2020 I can no longer find where this happens. This may have something to do with the "new SQS console…
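Even when the console hides it, the batch size lives on the event source mapping and can be changed through the API or CLI. A boto3 sketch, where the function name is a placeholder and the first mapping is assumed to be the SQS trigger:

import boto3

lambda_client = boto3.client("lambda")

# Find the mapping between the function and its SQS trigger (names are placeholders).
mappings = lambda_client.list_event_source_mappings(
    FunctionName="my-function"
)["EventSourceMappings"]
uuid = mappings[0]["UUID"]  # assumes the first mapping is the SQS one

# For SQS triggers the batch size is 1-10 (larger values require a batching window).
lambda_client.update_event_source_mapping(UUID=uuid, BatchSize=10)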
I'm building a Serverless app that defines an SQS queue in the resources as follows:
resources:
  Resources:
    TheQueue:
      Type: "AWS::SQS::Queue"
      Properties:
        QueueName: "TheQueue"
I want to send messages to this queue from within…
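One common pattern, sketched below and by no means the only option, is to expose the queue URL to the handler through an environment variable wired up with a CloudFormation Ref in serverless.yml; the variable name and handler shape here are assumptions, and the function role still needs sqs:SendMessage on the queue (for example via iamRoleStatements).

import os
import boto3

sqs = boto3.client("sqs")

# Assumes serverless.yml exposes the queue URL to the function, e.g.:
#   provider:
#     environment:
#       THE_QUEUE_URL: { Ref: TheQueue }   # Ref on an SQS queue resolves to its URL
QUEUE_URL = os.environ["THE_QUEUE_URL"]

def handler(event, context):
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody="work item")
    return {"statusCode": 200}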
I use Spark 2.2.0.
How can I feed an Amazon SQS stream into a Spark Structured Streaming query using PySpark?
This question tries to answer it for non-structured streaming and for Scala by creating a custom receiver.
Is something similar possible in PySpark?…
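There is no built-in SQS source for Structured Streaming in PySpark. One workaround, sketched below under the assumptions that message bodies are JSON and that an S3 staging prefix is acceptable, is to drain the queue into files with boto3 (run separately or on a schedule) and let a file-based stream pick them up; all names and the schema are placeholders.

import uuid
import boto3
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"  # placeholder
BUCKET = "my-bucket"                                                     # placeholder
PREFIX = "sqs-staging/"                                                  # placeholder

sqs = boto3.client("sqs")
s3 = boto3.client("s3")

def drain_once():
    # Move up to 10 messages from SQS into a single JSON-lines object in S3.
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20)
    messages = resp.get("Messages", [])
    if not messages:
        return
    body = "\n".join(m["Body"] for m in messages)  # assumes each body is a JSON object
    s3.put_object(Bucket=BUCKET, Key=PREFIX + str(uuid.uuid4()) + ".json", Body=body.encode())
    for m in messages:
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=m["ReceiptHandle"])

# The Spark job then reads the staging prefix as an ordinary file stream source.
spark = SparkSession.builder.appName("sqs-via-s3").getOrCreate()
schema = StructType([  # assumed message schema
    StructField("id", StringType()),
    StructField("payload", StringType()),
])
stream = spark.readStream.schema(schema).json("s3a://" + BUCKET + "/" + PREFIX)
query = stream.writeStream.format("console").start()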
I know that it is possible to consume an SQS queue using multiple threads. I would like to guarantee that each message will be consumed only once. I know that it is possible to change the visibility timeout of a message, e.g., setting it equal to my processing time…
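A fixed visibility timeout only caps how long a message stays hidden. A common refinement, sketched with boto3 and placeholder names and with no claim of true exactly-once delivery, is a heartbeat that keeps extending the timeout while processing is still running, so other threads do not see the message again mid-work:

import threading
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"  # placeholder

def process_with_heartbeat(message, work):
    receipt = message["ReceiptHandle"]
    stop = threading.Event()

    def heartbeat():
        # Every 20 s, push the visibility timeout another 60 s into the future.
        while not stop.wait(20):
            sqs.change_message_visibility(
                QueueUrl=QUEUE_URL, ReceiptHandle=receipt, VisibilityTimeout=60
            )

    t = threading.Thread(target=heartbeat, daemon=True)
    t.start()
    try:
        work(message["Body"])  # placeholder for the actual processing
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=receipt)
    finally:
        stop.set()
        t.join()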
I want to setup the following event/notification configuration in an AWS S3 bucket:
Upon receipt of a file (s3:ObjectCreated:*), two targets shall be triggered:
SQS: to queue the file for detailed post-processing, with a retention period of a couple…
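For the SQS leg of that fan-out, a boto3 sketch follows; the bucket name and queue ARN are placeholders, the queue also needs a policy allowing s3.amazonaws.com to send to it, and the second target from the question is omitted here.

import boto3

s3 = boto3.client("s3")

s3.put_bucket_notification_configuration(
    Bucket="my-upload-bucket",  # placeholder
    NotificationConfiguration={
        "QueueConfigurations": [
            {
                "QueueArn": "arn:aws:sqs:us-east-1:123456789012:post-processing-queue",
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
        # The second target (e.g. a LambdaFunctionConfigurations or
        # TopicConfigurations entry) would be added alongside this one.
    },
)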
I have a process that creates SQS messages and places them on an SQS queue and another process that reads those messages and performs certain logic based on the contents of the body and attributes of the message.
I can successfully create a message…
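For the attribute side of this, a boto3 sketch with placeholder names that sets a typed message attribute on send and asks for it back on receive; note that attributes are only returned when MessageAttributeNames is specified.

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"  # placeholder

# Send a message with a body and a typed attribute.
sqs.send_message(
    QueueUrl=QUEUE_URL,
    MessageBody='{"job": "resize", "key": "uploads/cat.jpg"}',
    MessageAttributes={"JobType": {"DataType": "String", "StringValue": "resize"}},
)

# Attributes are only returned when explicitly requested.
resp = sqs.receive_message(
    QueueUrl=QUEUE_URL,
    MessageAttributeNames=["All"],
    MaxNumberOfMessages=1,
)
for m in resp.get("Messages", []):
    job_type = m.get("MessageAttributes", {}).get("JobType", {}).get("StringValue")
    print(job_type, m["Body"])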
What do people think are the most important issues when developing an application that will allow users to upload video and images to a server, have them transcoded by FFmpeg, and stored in Amazon S3? I have a couple of options:
1) install…
In Amazon Web Services, their queues allow you to post messages with a visibility delay up to 15 minutes. What if I don't want messages visible for 6 months?
I'm trying to come up with an elegant solution to the poll/push problem. I can write code…
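One workaround people use for delays beyond the 15-minute cap, sketched below with boto3 (the attribute name and queue URL are assumptions), is to stamp each message with its due time and keep re-enqueueing it with the maximum delay until that time arrives. Each pending message then costs one round trip per 15 minutes, so for multi-month delays a scheduled store, such as a table scanned by a cron job, is often more practical.

import time
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"  # placeholder
MAX_DELAY = 900  # SQS caps DelaySeconds at 15 minutes

def schedule(body, due_epoch):
    # Stamp the due time on the message and delay it as far as SQS allows.
    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=body,
        DelaySeconds=min(MAX_DELAY, max(0, int(due_epoch - time.time()))),
        MessageAttributes={"DueAt": {"DataType": "Number", "StringValue": str(int(due_epoch))}},
    )

def handle(message):
    # The consumer must receive with MessageAttributeNames=["All"] for DueAt to be present.
    due = int(message["MessageAttributes"]["DueAt"]["StringValue"])
    if time.time() < due:
        schedule(message["Body"], due)               # not due yet: re-enqueue with another delay
    else:
        print("due, processing:", message["Body"])   # placeholder for the real work
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])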
I would like to connect an SQS queue to an SNS topic that is in a different account, using the CDK (TypeScript). Below is the code (this code is in a stack) that I think should work, but I have some doubts, listed below the code (I have not deployed this…