
Using Google Cloud Functions, is there a way to manage execution concurrency the way AWS Lambda does? (https://docs.aws.amazon.com/lambda/latest/dg/concurrent-executions.html)

My intent is to design a function that consumes a file of tasks and publishes those tasks to a work queue (Pub/Sub). I then want a second function that consumes tasks from the work queue (Pub/Sub) and executes each task.
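
To make this concrete, here is roughly what I have in mind for the producer side (a Python sketch with made-up project and topic names; the consumer would be a second function subscribed to the same topic):

# Producer sketch: read a file of tasks and publish each line to Pub/Sub.
# "my-project" and "task-queue" are placeholder names.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "task-queue")

def publish_tasks(task_file_path):
    with open(task_file_path) as f:
        for line in f:
            task = line.strip()
            if task:
                # Every published message triggers one execution of the consumer function.
                publisher.publish(topic_path, data=task.encode("utf-8"))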

The above could result in a large number of nearly concurrent executions. My downstream consumer service is slow and cannot handle many concurrent requests at a time. In all likelihood, it would return HTTP 429 responses to try to slow down the producer.

Is there a way to limit the concurrency for a given Google Cloud Function the way it is possible to do in AWS?

simon

4 Answers


This functionality is not available for Google Cloud Functions. Instead, since you want to control the pace at which the system opens concurrent tasks, Task Queues are the solution.

Push queues dispatch requests at a reliable, steady rate. They guarantee reliable task execution. Because you can control the rate at which tasks are sent from the queue, you can control the workers' scaling behavior and hence your costs.

In your case, you can control the rate at which the downstream consumer service is called.
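
For example, a minimal queue.yaml sketch (the queue name and numbers below are made up) shows the knobs that matter here:

# queue.yaml - hypothetical push queue that throttles calls to the slow consumer
queue:
- name: downstream-tasks
  rate: 5/s                   # dispatch at most 5 tasks per second
  max_concurrent_requests: 2  # allow at most 2 tasks in flight at once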

Tudormi
  • Thanks. Also along those lines I found the following for reference: https://firebase.googleblog.com/2017/03/how-to-schedule-cron-jobs-with-cloud.html?m=1 – simon Feb 10 '18 at 13:45
  • This answer is outdated now. See the other answer below: https://stackoverflow.com/a/56118766/3082178. – AKd Sep 08 '20 at 06:49

This is now possible with the current gcloud beta! You can set a maximum number of instances that can run at once:

gcloud beta functions deploy FUNCTION_NAME --max-instances 10 FLAGS...

See docs https://cloud.google.com/functions/docs/max-instances

zackify

You can set the number of "Function invocations per second" with quotas. It's documented here: https://cloud.google.com/functions/quotas#rate_limits

The documentation tells you how to increase it, but you can also decrease it to achieve the kind of throttling that you are looking for.

brendan
  • There's nothing you can set. That page just documents the hard limits. – Doug Stevenson Feb 12 '19 at 01:07
  • @Doug - scroll down to the end of the page and you'll see a link to increase quotas. On that page you can also decrease your quota. – brendan Feb 13 '19 at 08:03
  • 1
    The level of concurrency is not a settable quota. It's a hard limit. Go to the page and see for yourself. – Doug Stevenson Feb 13 '19 at 13:54
  • You are right @DougStevenson that the concurrent background invocations is not settable. I pasted that name in error. What you can do (as I am sure you know) is set the max requests per second - which might have the right effect. I will update my previous response. – brendan Feb 13 '19 at 14:56

You can control the pace at which cloud functions are triggered by controlling the triggers themselves. For example, if you have set "new file creation in a bucket" as the trigger for your cloud function, then by controlling how many new files are created in that bucket you can manage concurrent execution. Such solutions are not perfect, though, because sometimes a cloud function fails and gets restarted automatically (if you've configured your cloud function that way) without you having any control over it. In effect, the number of active instances of cloud functions will sometimes be higher than you planned. What AWS offers is a neat feature, though.
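
As an illustration of that idea (the bucket name and delay below are placeholders, not a tested recipe), the producer can simply pace its uploads so the file-creation trigger fires at a bounded rate:

# Throttled producer sketch: upload task files one at a time, pausing between
# uploads so the bucket-triggered function is not invoked faster than the
# downstream service can handle. "my-task-bucket" is a placeholder name.
import time
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-task-bucket")

def upload_tasks(tasks, delay_seconds=1.0):
    for i, task in enumerate(tasks):
        blob = bucket.blob(f"tasks/task-{i}.txt")
        blob.upload_from_string(task)
        time.sleep(delay_seconds)  # crude rate limit on trigger events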