I have a Durable Functions app running on an Elastic Premium plan in Azure, in which I:
- (a) perform a one-off task that returns a potentially large number of results
- (b) run some independent processing on each result from part (a)
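In case it helps, each orchestration instance has roughly this shape (a rough sketch in Python; the activity names are placeholders, and many orchestration instances can be running at once, which is where the concurrent calls to (a) come from):

```python
import azure.durable_functions as df


def orchestrator_function(context: df.DurableOrchestrationContext):
    # (a) one-off activity call that queries the external database
    results = yield context.call_activity("GetResultsFromExternalDb", None)

    # (b) fan out: independent processing of each result, as wide as possible
    tasks = [context.call_activity("ProcessResult", r) for r in results]
    yield context.task_all(tasks)


main = df.Orchestrator.create(orchestrator_function)
```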
Part (a) relies on an external database, which starts rejecting requests once I exceed a certain number of concurrent calls to it.
Part (b) doesn't have such a third-party dependency and should theoretically be able to scale indefinitely.
I'm aware of the ability to place limits on:
- The maximum number of instances my service plan will scale out to
- The number of concurrent requests per instance
- The number of concurrent activity functions
However, using any of these options to limit (a) would also limit (b), which I'd like to leave as concurrent as possible.
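For reference, the knobs I mean are the plan's maximum scale-out (burst) limit plus host.json settings along the lines of the snippet below. The values here are only placeholders, and as far as I can tell they all apply host-wide on each instance rather than to a single activity function:

```json
{
  "version": "2.0",
  "extensions": {
    "http": {
      "maxConcurrentRequests": 100
    },
    "durableTask": {
      "maxConcurrentActivityFunctions": 10
    }
  }
}
```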
Is there a way I can limit the number of concurrent invocations of activity function (a), without placing restrictions on the number of invocations of (b)?
(If all else fails I can track the number of in-flight executions myself in storage as part of running activity (a), but I'd much prefer to configure this, or to drive it from the Durable Functions framework if possible, since the framework is already tracking the number of queued activity functions of each type.)
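For illustration, the fallback I have in mind is something like the sketch below: an optimistic-concurrency counter in Table storage that activity (a) would increment before calling the database and decrement when it finishes. All of the names, the table layout, and the limit are made up for the example:

```python
from azure.core import MatchConditions
from azure.core.exceptions import (
    ResourceExistsError,
    ResourceModifiedError,
    ResourceNotFoundError,
)
from azure.data.tables import TableClient, UpdateMode

MAX_CONCURRENT_A = 10  # whatever the external database can tolerate


def try_acquire_slot(table: TableClient) -> bool:
    """Atomically increment a shared counter, refusing once the limit is hit."""
    try:
        entity = table.get_entity(partition_key="throttle", row_key="activity-a")
    except ResourceNotFoundError:
        try:
            table.create_entity(
                {"PartitionKey": "throttle", "RowKey": "activity-a", "Count": 0}
            )
        except ResourceExistsError:
            pass  # another worker created it first
        entity = table.get_entity(partition_key="throttle", row_key="activity-a")

    if entity["Count"] >= MAX_CONCURRENT_A:
        return False  # caller should back off and retry later

    entity["Count"] += 1
    try:
        # The ETag match makes the increment safe against concurrent updates.
        table.update_entity(
            entity,
            mode=UpdateMode.REPLACE,
            etag=entity.metadata["etag"],
            match_condition=MatchConditions.IfNotModified,
        )
        return True
    except ResourceModifiedError:
        return False  # lost the race; caller should retry


# Activity (a) would call try_acquire_slot before hitting the database and
# run a matching decrement (with the same ETag pattern) in a finally block.
```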