
I'm considering using Step Functions for scenarios like retrieving data from a few sources, composing an email with it, and sending that email, with appropriate retries. The state machine is triggered from API Gateway, i.e. by calling "StartExecution".
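
For context, here is a minimal sketch of that trigger, assuming a Lambda proxy integration behind API Gateway and a hypothetical state machine ARN (a direct API Gateway service integration would count against the same limit):

```python
import json
import boto3

sfn = boto3.client("stepfunctions")

# Hypothetical state machine ARN; substitute your own.
STATE_MACHINE_ARN = "arn:aws:states:eu-west-1:123456789012:stateMachine:ComposeAndSendMail"

def handler(event, context):
    # Every incoming API request results in exactly one StartExecution call,
    # so each request consumes one token from the StartExecution bucket.
    response = sfn.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        input=json.dumps({"recipient": (event.get("queryStringParameters") or {}).get("recipient")}),
    )
    return {
        "statusCode": 202,
        "body": json.dumps({"executionArn": response["executionArn"]}),
    }
```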

It works great, but I notice that the soft limit for StartExecution is set to 2 executions/second, with a bucket size of 100. I'm expecting tens or maybe hundreds of requests per second...

Am I right to understand that every call to the API counts towards that StartExecution soft limit? Does this low default limit indicate that Step Functions may not be the right tool for the job?

Free Willaert

3 Answers


You are correct that each API call counts towards this limit. The fact that the bucket size is 100 leads me to believe that higher rates can be supported, but I do not work on the Step Functions team.
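
To make the token bucket concrete, here is a rough sketch (the 2/second refill and bucket size of 100 are the defaults mentioned above; the incoming request rate is just an assumed example) of how long a burst can run before StartExecution calls start getting throttled:

```python
def seconds_until_throttled(request_rate, refill_rate=2.0, bucket_size=100):
    """Rough estimate of how long a sustained burst can run before the
    StartExecution token bucket is empty and calls start being throttled."""
    if request_rate <= refill_rate:
        return float("inf")  # refill keeps up with demand; no throttling expected
    return bucket_size / (request_rate - refill_rate)

# Example: at 50 requests/second the default bucket drains in roughly 2 seconds,
# after which only ~2 executions/second are accepted until the burst subsides.
print(seconds_until_throttled(50))  # ~2.08
```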

It is a soft limit, so I would encourage you to request a limit increase. The Step Functions team will be able to tell you if they can support your use case.

Bob Kinney

If your application requires a higher limit, you should contact AWS Support and request an increase.

If they are not able to increase the limit, you could deploy to multiple regions and load balance between them. These limits are per region.
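
A sketch of that idea, assuming the same state machine is deployed under a hypothetical name in each region and that simple round-robin is an acceptable way to spread the load:

```python
import itertools
import json
import boto3

# Hypothetical regions and state machine name; substitute your own deployments.
REGIONS = ["eu-west-1", "us-east-1"]
ARN_TEMPLATE = "arn:aws:states:{region}:123456789012:stateMachine:ComposeAndSendMail"

# One Step Functions client per region, cycled round-robin so each region's
# StartExecution bucket only sees a fraction of the total request rate.
_targets = itertools.cycle(
    [(boto3.client("stepfunctions", region_name=r), ARN_TEMPLATE.format(region=r))
     for r in REGIONS]
)

def start_execution(payload):
    client, arn = next(_targets)
    return client.start_execution(stateMachineArn=arn, input=json.dumps(payload))
```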

Murali Allada

Eventually I contacted AWS Support, asking:

Can we assume that a future Service Limit request to increase the bucket or refill size, e.g. first to 100 executions/second, later to maybe 500 executions/second, will be fulfilled? Would there be any concerns?

The answer:

Yes, we do support 500 executions/s, however since the load profiles of different types of executions vary so widely it's hard to be specific about this customer's executions. Furthermore, as Step Functions grows and each customer has less impact on the larger service, it will be easier to increase limits.

So we're good :)

Free Willaert