I have a Python program that starts N subprocesses (clients) which send requests to, and listen for responses from, the main process (server). Interprocess communication goes through `multiprocessing.Queue` objects (which use pipes under the hood) according to the following scheme, with one queue per consumer: one shared request queue and N response queues:
```
                1 req_queue
                                <-- Process-1
MainProcess <-- =============   <-- …
                                <-- Process-N

                N resp_queues
            --> ============= --> Process-1
MainProcess --> ============= --> …
            --> ============= --> Process-N
```
The (simplified) program:
```python
import multiprocessing

def work(event, req_queue, resp_queue):
    while not event.is_set():
        name = multiprocessing.current_process().name
        x = 3
        req_queue.put((name, x))
        print(name, 'input:', x)
        y = resp_queue.get()
        print(name, 'output:', y)

if __name__ == '__main__':
    event = multiprocessing.Event()
    req_queue = multiprocessing.Queue()
    resp_queues = {}
    processes = {}
    N = 10
    for _ in range(N):  # start N subprocesses
        resp_queue = multiprocessing.Queue()
        process = multiprocessing.Process(
            target=work, args=(event, req_queue, resp_queue))
        resp_queues[process.name] = resp_queue
        processes[process.name] = process
        process.start()
    for _ in range(100):  # handle 100 requests
        (name, x) = req_queue.get()
        y = x ** 2
        resp_queues[name].put(y)
    event.set()  # stop the subprocesses
    for process in processes.values():
        process.join()
```
The problem I am facing is that the execution of this program (under Python 3.11.2) sometimes never stops: it hangs at the line `y = resp_queue.get()` in some subprocess once the main process notifies the subprocesses to stop at the line `event.set()`. The problem is the same if I use the `threading` library instead of the `multiprocessing` library.
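If I understand the hang correctly, it is a race: a subprocess can pass the `event.is_set()` check and enqueue a request just before the main process stops serving requests and sets the event, so it then blocks in `resp_queue.get()` on a response that will never come. The following minimal `threading` sketch (an illustration with the same loop shape as my program, not the program itself) choreographs that interleaving so the hang reproduces every time:

```python
import queue
import threading

# Same shape as the real program, but the main thread serves exactly
# one request, so the unlucky interleaving happens deterministically.
req_queue = queue.Queue()
resp_queue = queue.Queue()
event = threading.Event()

def work():
    while not event.is_set():   # the check passes...
        req_queue.put('request')
        resp_queue.get()        # ...but this can then block forever

worker = threading.Thread(target=work, daemon=True)  # daemon so the script can exit
worker.start()

req_queue.get()             # serve exactly one request
resp_queue.put('response')
req_queue.get()             # worker has looped and sent a second request,
event.set()                 # so it is already past the event check...
worker.join(timeout=2)      # ...and stuck in resp_queue.get(): join times out
print('worker alive:', worker.is_alive())  # prints: worker alive: True
```

I believe the same interleaving is what happens nondeterministically in the `multiprocessing` version.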
How can I reliably stop the subprocesses?