
I'm having an issue using a throttler for the Telegram API.

The problem is that when the number of requests exceeds my throttler's limit, the queued messages get sent in a random order once the period elapses.

Here's the code for the throttler I'm using (found it in a GitHub repo):

import asyncio
import time
from collections import deque


class Throttler:
    def __init__(self, rate_limit, period=1.0, retry_interval=0.01):
        self.rate_limit = rate_limit          # max requests per period
        self.period = period                  # sliding window length in seconds
        self.retry_interval = retry_interval  # polling interval while waiting

        self._task_logs = deque()             # timestamps of recent requests

    def flush(self):
        # Drop timestamps that have fallen out of the sliding window.
        now = time.time()
        while self._task_logs:
            if now - self._task_logs[0] > self.period:
                self._task_logs.popleft()
            else:
                break

    async def acquire(self):
        # Poll until there is room in the current window, then record
        # this request's timestamp.
        while True:
            self.flush()
            if len(self._task_logs) < self.rate_limit:
                break
            await asyncio.sleep(self.retry_interval)

        self._task_logs.append(time.time())

    async def __aenter__(self):
        await self.acquire()
        return self

    async def __aexit__(self, exc_type, exc, tb):
        pass
I use it as follows:
throttler = Throttler(rate_limit=30, period=10)

async with throttler:
    await sendmessage(message)
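
For context, here is roughly how the problem shows up when I fire many sends concurrently; sendmessage stands in for my actual Telegram call:

import asyncio

throttler = Throttler(rate_limit=30, period=10)

async def send_throttled(i):
    async with throttler:
        await sendmessage(f"message {i}")

async def main():
    # Fire well over the rate limit at once: the first 30 go out
    # immediately, the rest block in acquire() and all wake up
    # around the same time once the period elapses, in whatever
    # order asyncio happens to resume them.
    await asyncio.gather(*(send_throttled(i) for i in range(100)))

asyncio.run(main())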

1 Answer

It turned out that the best way around this was to use a different algorithm for the throttler.

The throttler above delivers messages in a random order because, after the initial burst, the remaining messages sit waiting in acquire(); once the period elapses, all of the waiting coroutines wake up at roughly the same time and asyncio resumes them in no particular order.

The best way around this is to use what's called a leaky bucket algorithm, which spaces the messages out evenly instead of releasing them in bursts. I implemented a leaky bucket myself based on this answer: https://stackoverflow.com/a/45502319/7055234
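
In case it helps, here is a rough sketch of the idea (not the exact code from the linked answer): acquisitions are serialized behind an asyncio.Lock, which wakes waiters in FIFO order, and each acquisition waits until at least period / rate_limit seconds have passed since the previous send, so messages go out in order at a steady rate.

import asyncio
import time


class LeakyBucketThrottler:
    def __init__(self, rate_limit, period=1.0):
        # Minimum spacing between two consecutive sends.
        self._interval = period / rate_limit
        self._lock = asyncio.Lock()   # waiters are woken in FIFO order
        self._last_send = 0.0

    async def acquire(self):
        async with self._lock:
            now = time.monotonic()
            wait = self._last_send + self._interval - now
            if wait > 0:
                await asyncio.sleep(wait)
            self._last_send = time.monotonic()

    async def __aenter__(self):
        await self.acquire()
        return self

    async def __aexit__(self, exc_type, exc, tb):
        pass

It drops into the same async with usage as the original throttler.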
