
I have multiple servers/workers going through a task queue doing API requests (Django with Memcached, and Celery for the queue). The API requests are limited to 10 requests a second. How can I rate limit it so that the total number of requests (across all servers) doesn't exceed the limit?

I've looked through some of the related rate-limiting questions, but I'm guessing they are focused on a more linear, non-concurrent scenario. What sort of approach should I take?

RS7

3 Answers


Have you looked at RateLimiter from the Guava project? This class was introduced in one of the more recent releases and seems to partially satisfy your needs.

Granted, it won't calculate the rate limit across multiple nodes in a distributed environment, but what you could do is configure the rate limit dynamically based on the number of nodes that are running (i.e., for 5 nodes you'd have a rate limit of 2 API requests a second per node).
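The divide-the-limit-by-node-count idea above can be sketched without Guava. Here is a minimal Python version (class and parameter names are mine, not from any library): each node independently enforces its share of the global rate by spacing out its own requests.

```python
import time

class PerNodeRateLimiter:
    """Give each node an equal share of a global request rate.

    With global_rate=10 and node_count=5, each node limits itself
    to 2 requests/second, so the cluster stays at or under 10/s.
    (Hypothetical sketch; Guava's RateLimiter works similarly for
    a single JVM.)
    """

    def __init__(self, global_rate, node_count):
        self.rate = global_rate / node_count      # this node's req/s
        self.min_interval = 1.0 / self.rate       # seconds between requests
        # Pretend the previous request happened one interval ago,
        # so the first acquire() does not block.
        self.last = time.monotonic() - self.min_interval

    def acquire(self):
        """Block until this node is allowed to send its next request."""
        now = time.monotonic()
        wait = self.last + self.min_interval - now
        if wait > 0:
            time.sleep(wait)
        self.last = time.monotonic()
```

The obvious drawback, as the answer notes, is that an idle node's share is wasted; the count of live nodes also has to be kept up to date somewhere (e.g., in Memcached) for the dynamic reconfiguration to work.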

Petro Semeniuk

I have been working on an open-source project to solve this exact problem, called Limitd. Although I don't have clients for technologies other than Node yet, the protocol and the idea are simple.

Your feedback is very welcome.

José F. Romaniello

I solved that problem, though unfortunately not for your technology: bandwidth-throttle/token-bucket.

If you want to implement it, here's the idea of the implementation:

It's a token bucket algorithm which converts the contained tokens into a timestamp of when the bucket was last completely empty. Every consumption updates this timestamp (under a lock) so that each process shares the same state.
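The timestamp trick described above could be sketched in Python like this (names are mine, and a local lock stands in for whatever shared storage and locking a real distributed version would use, e.g. Memcached with CAS). The only state is the moment the bucket was last empty; the current token count is derived from it.

```python
import threading
import time

class TimestampTokenBucket:
    """Token bucket whose entire state is one timestamp: the instant
    the bucket was last completely empty.

    Available tokens = (now - empty_ts) * rate, capped at capacity.
    Consuming n tokens just advances empty_ts by n/rate, so a single
    shared, locked value is enough for all processes.
    (Illustrative sketch of the idea, not the library's actual code.)
    """

    def __init__(self, rate, capacity):
        self.rate = float(rate)       # tokens added per second
        self.capacity = capacity      # maximum stored tokens
        self.lock = threading.Lock()  # stand-in for a shared lock
        # Start full: the bucket was "empty" capacity/rate seconds ago.
        self.empty_ts = time.monotonic() - capacity / self.rate

    def consume(self, tokens=1):
        """Take tokens, sleeping until enough have accumulated."""
        with self.lock:
            now = time.monotonic()
            # Cap stored tokens by pulling the timestamp forward.
            self.empty_ts = max(self.empty_ts,
                                now - self.capacity / self.rate)
            # Advancing the timestamp by tokens/rate removes the tokens.
            self.empty_ts += tokens / self.rate
            wait = self.empty_ts - now  # positive means we overdrew
        if wait > 0:
            time.sleep(wait)
```

Because updates are a single read-modify-write of one value under a lock, every worker that consumes from the same stored timestamp observes the same global rate, which is exactly what the multi-server scenario in the question needs.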

Markus Malkusch