I'm writing a script that uses Python's multiprocessing and threading modules.
To give you some context: I spawn as many processes as there are CPU cores, and inside each process I start e.g. 25 threads.
Each thread consumes from an input_queue and produces to an output_queue. For the queue objects I use multiprocessing.Queue.
After my first tests I got a deadlock because the thread responsible for feeding and flushing the queue was hanging. After a while I found that I can use Queue().cancel_join_thread() to work around this problem.
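To illustrate the workaround (a minimal sketch; the large payload size and the producer function are my own illustrative assumptions, not from my actual script): a child process that puts more data on a multiprocessing.Queue than the pipe buffer holds will block on exit until the feeder thread has flushed everything, which deadlocks if nobody consumes. Calling cancel_join_thread() lets the process exit immediately, at the cost of possibly losing the buffered items:

```python
import multiprocessing

def producer(q):
    # A payload larger than the pipe buffer, so it sits in the
    # queue's internal feeder-thread buffer when the process exits.
    q.put("x" * (1 << 20))
    # Don't wait for the feeder thread to flush on exit.
    # Without this call, the child would block here (deadlock, since
    # the parent never consumes); with it, the item may be lost.
    q.cancel_join_thread()

def run_demo():
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=producer, args=(q,))
    p.start()
    p.join(timeout=5)  # returns promptly thanks to cancel_join_thread()
    return p.exitcode

if __name__ == "__main__":
    print(run_demo())  # 0 -> the child exited cleanly, data possibly lost
```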
But because of the possibility of data loss I would rather use multiprocessing.Manager().Queue().
Now to my actual question: is it better to use one manager object per queue, or should I create one manager and get both queues from the same manager object?
# One manager for all queues
import multiprocessing
manager = multiprocessing.Manager()
input_queue = manager.Queue()
output_queue = manager.Queue()
...Magic...
# As many managers as queues
import multiprocessing
manager_in = multiprocessing.Manager()
queue_in = manager_in.Queue()
manager_out = multiprocessing.Manager()
queue_out = manager_out.Queue()
...Magic...
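In case it helps to see the whole setup, here is a minimal, runnable sketch of the pipeline I described above, using the single-manager variant. The worker counts, the sentinel protocol, and the square() "work" are placeholder assumptions standing in for my real workload:

```python
import multiprocessing
import threading

N_PROCESSES = 2   # in my real script: number of CPU cores
N_THREADS = 3     # in my real script: e.g. 25 per process
SENTINEL = None   # tells a thread to stop consuming

def thread_worker(input_queue, output_queue):
    # Each thread consumes from input_queue and produces to output_queue.
    while True:
        item = input_queue.get()
        if item is SENTINEL:
            break
        output_queue.put(item * item)  # placeholder for the real work

def process_worker(input_queue, output_queue):
    threads = [
        threading.Thread(target=thread_worker, args=(input_queue, output_queue))
        for _ in range(N_THREADS)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

def run_pipeline(items):
    # One manager serves both queues; its proxies can be passed to
    # child processes and shared between threads.
    manager = multiprocessing.Manager()
    input_queue = manager.Queue()
    output_queue = manager.Queue()
    for item in items:
        input_queue.put(item)
    # One sentinel per thread so every thread terminates.
    for _ in range(N_PROCESSES * N_THREADS):
        input_queue.put(SENTINEL)
    procs = [
        multiprocessing.Process(target=process_worker, args=(input_queue, output_queue))
        for _ in range(N_PROCESSES)
    ]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    results = []
    while not output_queue.empty():
        results.append(output_queue.get())
    return results

if __name__ == "__main__":
    print(sorted(run_pipeline(range(5))))  # [0, 1, 4, 9, 16]
```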
Thank you for your help.