I am building a Telegram bot in C#, deployed with AWS Lambda. The bot and the Lambda are connected via a webhook and work fine. I need to schedule deletion of one of the bot's messages a few minutes later without blocking the bot: it must keep accepting and processing new requests.
For now the only solution I see is using Task.Delay inside the handler (see the sketch below). However, the instance AWS creates to execute the Lambda doesn't seem to scale out, and users have to wait until the delay ends before the next request in the queue is handled.
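This is roughly what my handler looks like, simplified to show the problem (I'm using the Telegram.Bot client library; the webhook wiring and JSON deserialization are omitted, and the message text is just an example):

```csharp
using System;
using System.Threading.Tasks;
using Amazon.Lambda.Core;
using Telegram.Bot;
using Telegram.Bot.Types;

public class Function
{
    private static readonly ITelegramBotClient _botClient =
        new TelegramBotClient(Environment.GetEnvironmentVariable("BOT_TOKEN"));

    // Webhook entry point: receives one Telegram update per invocation.
    public async Task FunctionHandler(Update update, ILambdaContext context)
    {
        // Reply to the incoming message.
        var sent = await _botClient.SendTextMessageAsync(
            chatId: update.Message.Chat.Id,
            text: "This message will self-destruct in 2 minutes.");

        // The problem: the handler doesn't return until the delay completes,
        // so this invocation (and its billed duration) lasts the whole 2 minutes.
        await Task.Delay(TimeSpan.FromMinutes(2));

        await _botClient.DeleteMessageAsync(sent.Chat.Id, sent.MessageId);
    }
}
```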
From the official documentation:
The first time you invoke your function, AWS Lambda creates an instance of the function and runs its handler method to process the event. When the function returns a response, it stays active and waits to process additional events. If you invoke the function again while the first event is being processed, Lambda initializes another instance, and the function processes the two events concurrently. As more events come in, Lambda routes them to available instances and creates new instances as needed. When the number of requests decreases, Lambda stops unused instances to free up scaling capacity for other functions.
The default regional concurrency quota starts at 1,000 instances.
As far as I understand, the whole point of Lambda is to delegate concurrent execution to AWS: if a handler takes some time to fulfil a request, AWS automatically creates a second instance to process the next request. Isn't that right?
How can I implement concurrency, configure the Lambda, or rewrite the code so that multiple bot events are handled at the same time?
I've already looked at AWS Step Functions and EventBridge as a way to solve this (a rough sketch of what I have in mind is below), but before diving deeper into them it would make sense to confirm that there isn't a simple and straightforward solution I've missed.
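For example, from skimming the EventBridge Scheduler docs, the idea would be to create a one-time schedule from the webhook handler that invokes a separate "delete message" Lambda a couple of minutes later, so the handler itself returns immediately. This is only a sketch of what I understood, not tested code: the ARNs are placeholders, `DeletionScheduler` and `ScheduleDeleteAsync` are names I made up, and I haven't verified the exact SDK surface of the AWSSDK.Scheduler package.

```csharp
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Amazon.Scheduler;
using Amazon.Scheduler.Model;

public class DeletionScheduler
{
    private static readonly IAmazonScheduler _scheduler = new AmazonSchedulerClient();

    // Create a one-time EventBridge schedule that invokes a separate
    // "delete message" Lambda two minutes from now.
    public async Task ScheduleDeleteAsync(long chatId, int messageId)
    {
        var fireAt = DateTime.UtcNow.AddMinutes(2);

        await _scheduler.CreateScheduleAsync(new CreateScheduleRequest
        {
            Name = $"delete-{chatId}-{messageId}",
            // One-time schedule expression: at(yyyy-mm-ddThh:mm:ss)
            ScheduleExpression = $"at({fireAt:yyyy-MM-ddTHH:mm:ss})",
            FlexibleTimeWindow = new FlexibleTimeWindow { Mode = FlexibleTimeWindowMode.OFF },
            Target = new Target
            {
                Arn = "arn:aws:lambda:...:function:delete-message", // placeholder ARN
                RoleArn = "arn:aws:iam::...:role/scheduler-invoke",  // placeholder ARN
                Input = JsonSerializer.Serialize(new { chatId, messageId })
            }
        });
    }
}
```

If this (or Step Functions with a Wait state) really is the intended way to do a delayed action from Lambda, I'm happy to go that route; I just want to make sure I'm not overcomplicating it.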
P.S. Please keep in mind that this is my first experience building a Telegram bot and using AWS Lambda. The problem may lie completely outside AWS and the Telegram Bot API.