I need to query a service that handles concurrent requests very poorly, so I need to limit the number of concurrent requests by implementing locking/queueing on my side.
I have N processes that run in parallel and independently (each kicked off separately), so a `multiprocessing` lock doesn't work for me:
python send_requests.py
python send_requests.py
python send_requests.py
...
Each process can submit many requests to the service sequentially. The lock needs to be lightweight, and I don't want to use a database. It doesn't need to be super accurate.
One way I can think of is to use the file system. When a request is submitted, the process creates a signal file; when the request finishes, it removes the file. Before submitting each request, I'll check whether the number of signal files is <= the limit, and wait in a loop if it isn't. If it's still above the limit when a timeout expires, I'll submit the request anyway.
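A minimal sketch of this idea in Python (the directory path, limit, and timeout values below are placeholders, and the count-then-create step is deliberately not race-free, since strict accuracy isn't required):

```python
import os
import glob
import time
import uuid

# Hypothetical settings for illustration.
LOCK_DIR = "/tmp/service_slots"
LIMIT = 4          # max concurrent requests across all processes
TIMEOUT = 30.0     # seconds to wait before submitting anyway
POLL = 0.2         # seconds between re-checks

def acquire_slot():
    """Wait until fewer than LIMIT signal files exist, then create one.

    Returns the path of the created signal file. Two processes can pass
    the count check at the same moment, so the limit is approximate.
    """
    os.makedirs(LOCK_DIR, exist_ok=True)
    deadline = time.monotonic() + TIMEOUT
    while time.monotonic() < deadline:
        if len(glob.glob(os.path.join(LOCK_DIR, "*.slot"))) < LIMIT:
            break
        time.sleep(POLL)
    # On timeout we fall through and create the file anyway,
    # matching the "submit the request anyway" behavior.
    path = os.path.join(LOCK_DIR, f"{uuid.uuid4().hex}.slot")
    open(path, "w").close()
    return path

def release_slot(path):
    """Remove the signal file; tolerate an external cleanup having removed it."""
    try:
        os.remove(path)
    except FileNotFoundError:
        pass
```

Each process would wrap every request as `slot = acquire_slot()`, then call the service inside a `try`/`finally` that ends with `release_slot(slot)`, so a failed request still frees its slot. One remaining hazard: if a process is killed hard, its signal file leaks, so stale files older than some age would need to be cleaned up.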
Is there a better way to do this?