
I have a use case where I need to poll an API every second (basically an infinite while loop). Polling is initiated dynamically by a user through an external system, so there can be multiple polls running at the same time. A poll is complete when the API returns 400. My current implementation looks something like this:

  1. Flask APP deployed on heroku.
  2. Flask APP has an endpoint which external system calls to start polling.
  3. That Flask endpoint adds a message to a queue, and as soon as a worker picks it up, it starts polling. I am using the Heroku Redis To Go add-on; under the hood it uses python-rq and Redis.

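Stripped of the Flask and rq plumbing, the polling job from step 3 can be sketched as below. This is a simplified illustration, not the asker's actual code: `poll_until_400` is a hypothetical name, and the HTTP call is injected as a `fetch` callable (in production it might be `functools.partial(requests.get, url)`) so the loop itself has no network dependency.

```python
import time

def poll_until_400(fetch, interval=1.0, sleep=time.sleep):
    """Poll `fetch` every `interval` seconds until it returns HTTP 400.

    `fetch` is any zero-argument callable returning an object with a
    `status_code` attribute (e.g. requests.get bound to the target URL).
    Returns the number of polls performed.
    """
    polls = 0
    while True:
        response = fetch()
        polls += 1
        if response.status_code == 400:
            return polls
        sleep(interval)  # wait one interval before the next poll
```

An rq worker would run this function as its job body; the job occupies the worker for the entire lifetime of the poll, which is exactly why a single worker blocks all other queued polls.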
The problem is that when one polling job runs for a long time, the other jobs just sit in the queue. I want all of the polling to run concurrently.

What's the best approach to tackle this problem? Fire up multiple workers?

What if there could potentially be more than 100 concurrent polling processes?
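Since each poll spends most of its one-second interval sleeping, the work is I/O-bound, so one process with many threads (or gevent greenlets) can plausibly service 100+ pollers without 100 worker dynos. A minimal sketch, assuming each polling job is wrapped as a zero-argument callable (`run_pollers` is a hypothetical helper, not part of rq):

```python
from concurrent.futures import ThreadPoolExecutor

def run_pollers(jobs, max_workers=100):
    """Run many blocking polling jobs concurrently in one process.

    Each job is a zero-argument callable that blocks until its polling
    loop finishes; the pool lets up to `max_workers` of them run at
    once instead of queueing behind a single long-running job.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(lambda job: job(), jobs))
```

With rq specifically, an alternative along the same lines is starting the worker with a gevent- or thread-based job execution model so one dyno handles many concurrent jobs.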

Vaibhav Jadhav
Joyfulgrind

1 Answer


You could implement a "weighted"/priority queue. There may be multiple ways of implementing this, but the simplest example that comes to mind is a min- or max-heap.

You should keep track of how many events are in the queue for each process; as the number of events for one process grows, the weight of newly inserted events should decrease. Every time an event finishes processing, you start processing the pending event with the greatest weight.
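The bookkeeping above could be sketched with Python's `heapq` like this. The weighting scheme is one assumption among many possible ones: each new event's weight is minus the number of events that process already has queued, so busy processes cannot starve the others.

```python
import heapq
from collections import defaultdict

class WeightedQueue:
    """Pop events in order of greatest weight; events from processes
    with more pending work get progressively lower weights."""

    def __init__(self):
        self._heap = []
        self._counts = defaultdict(int)  # pending events per process
        self._seq = 0                    # tie-breaker for equal weights

    def push(self, process_id, event):
        # Weight decreases as this process enqueues more events.
        weight = -self._counts[process_id]
        self._counts[process_id] += 1
        # heapq is a min-heap, so store the negated weight.
        heapq.heappush(self._heap, (-weight, self._seq, process_id, event))
        self._seq += 1

    def pop(self):
        # Return the pending event with the greatest weight.
        _, _, process_id, event = heapq.heappop(self._heap)
        self._counts[process_id] -= 1
        return process_id, event
```

Pushing three events for process "a" and then one for "b" pops in the order a1, b1, a2, a3: "b"'s first event outweighs "a"'s second and third.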

P.S. More workers will also speed up the work.

Mihail Feraru