I have exposed my S3 bucket to a third-party client that uploads files into it. I want to rate-limit the bucket so that, if there is an anomaly on their end, I don't receive a flood of upload requests. The bucket is connected to an SQS queue and DynamoDB, so a flood would cause throttling at both the queue and the database, and I would also be charged heavily. How do I prevent this?
- How large are these files? – Marcin Jun 04 '20 at 08:59
1 Answer
It is not possible to configure a rate limit on Amazon S3. However, in some situations Amazon S3 might itself impose a rate limit when the request rate on a bucket spikes suddenly (returning 503 Slow Down errors).
A way to handle this would be to process all uploads through API Gateway and your back-end service. However, this might lead to more overhead and costs than you are trying to save.
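If you do put a back-end service in front of the uploads, it can apply its own throttling before ever touching S3. A minimal sketch, assuming a simple token-bucket policy (the rate and burst numbers here are illustrative, not from the original answer):

```python
import time

class TokenBucket:
    """Token bucket: allows `rate` requests/second with bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate              # tokens added per second
        self.capacity = capacity     # maximum burst size
        self.tokens = capacity       # start full
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Example: allow sustained 2 uploads/second, bursts of up to 5.
bucket = TokenBucket(rate=2, capacity=5)
results = [bucket.allow() for _ in range(8)]  # 8 back-to-back requests
```

Requests beyond the burst capacity are rejected until the bucket refills, which is exactly the behaviour you would want when the third party misbehaves.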
You could configure an AWS Lambda function to be triggered when a new object is created, then store information in a database to track the upload rate, but this again would involve more complexity and (a little) expense.
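A rough sketch of what such a Lambda handler could look like. In a real deployment the timestamps would live in DynamoDB (e.g. a counter item per time window); here a module-level deque stands in for that store, and the window size and threshold are assumptions for illustration:

```python
import time
from collections import deque

# Stand-in for a DynamoDB-backed store of recent upload timestamps.
_recent_uploads = deque()

WINDOW_SECONDS = 60   # assumed sliding-window size
MAX_UPLOADS = 100     # assumed threshold before flagging an anomaly

def handler(event, context=None):
    """Sketch of an S3-triggered Lambda that tracks the upload rate.

    Returns True while the rate is within limits, False once the
    threshold is breached (at which point you might, say, detach the
    uploader's IAM policy or raise an alert).
    """
    now = time.monotonic()
    # Each s3:ObjectCreated event can carry one or more records.
    for _record in event.get("Records", []):
        _recent_uploads.append(now)
    # Evict timestamps that have aged out of the window.
    while _recent_uploads and now - _recent_uploads[0] > WINDOW_SECONDS:
        _recent_uploads.popleft()
    return len(_recent_uploads) <= MAX_UPLOADS
```

Note that this only detects the anomaly after the objects have landed in the bucket; stopping further uploads still requires acting on permissions, as discussed in the comments below.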

John Rotenstein
- By using Lambda and tracking the upload rate, how can I stop the upload on S3 then? – Saksham Agarwal Jun 04 '20 at 12:04
- Good question! I guess it would need to block access somehow, such as changing the permissions associated with the user. How are these users authenticating and how are they uploading? (eg via pre-signed URLs?) – John Rotenstein Jun 04 '20 at 12:12
- The uploads to the bucket would be happening programmatically using API calls. – Saksham Agarwal Jun 07 '20 at 11:38
- Going via API Gateway might be the only way to truly control access. There are throttling controls built into API Gateway. – John Rotenstein Jun 08 '20 at 01:01
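For reference, those API Gateway throttling controls can be set through a usage plan. A hedged sketch via the AWS CLI, where the plan name, API ID, stage name, and limits are all placeholders:

```shell
# Create a usage plan that throttles clients to a steady 50 requests/second
# with bursts of up to 100, attached to one API stage.
aws apigateway create-usage-plan \
    --name "upload-throttle" \
    --throttle burstLimit=100,rateLimit=50 \
    --api-stages apiId=a1b2c3d4e5,stage=prod
```

Clients exceeding those limits receive 429 Too Many Requests responses from API Gateway instead of the traffic ever reaching S3, SQS, or DynamoDB.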