
I have a multiprocessing.Queue() that several of my processes write to. If I don't .get() from the Queue() before .join()-ing the processes, the Queue() fills up and I get a deadlock. So I have to read from the queue while some processes might still be using it. To do this, I currently use

import multiprocessing

output = []
q = multiprocessing.Queue()
# start processes that write to q
while not q.empty():
    msg = q.get()
    output.append(msg)
print("Finished while loop")
for process in processes:
    process.join()

However, I noticed that some processes write to the queue after the loop finishes, which is quite understandable: it may well be that the reading is faster than the writing, so the queue becomes empty before all the processes finish. How can I make sure the queue gets read continuously (so I don't get a deadlock), BUT the reading from the queue only stops once the processes are finished? Joining the processes before the while loop is not an option, because that way the queue fills up and I get a deadlock.
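For reference, one pattern I've seen suggested for this situation (a sketch, not my current code; the worker function and counts are made up for illustration) is to have each worker put a sentinel value on the queue when it is done, and keep reading until one sentinel per worker has been seen. Only then are the processes joined, so the queue is fully drained first:

```python
import multiprocessing

def worker(q, n):
    # hypothetical worker: produce n messages, then signal completion
    for i in range(n):
        q.put(f"msg {i}")
    q.put(None)  # sentinel: this worker is finished

def run(num_workers=4, msgs_per_worker=100):
    q = multiprocessing.Queue()
    processes = [multiprocessing.Process(target=worker, args=(q, msgs_per_worker))
                 for _ in range(num_workers)]
    for p in processes:
        p.start()

    output = []
    finished = 0
    while finished < num_workers:
        msg = q.get()          # blocks until an item is available
        if msg is None:
            finished += 1      # one more worker has signalled completion
        else:
            output.append(msg)

    for p in processes:
        p.join()               # safe now: the queue has been drained
    return output

if __name__ == "__main__":
    print(len(run()))  # 400
```

Since each worker's sentinel is enqueued after all of its messages, seeing every sentinel guarantees every message has been read, and joining afterwards cannot deadlock.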

  • Read the example code from this answer: http://stackoverflow.com/questions/42597341/python-threadingis-it-okay-to-read-write-multiple-mutually-exclusive-parts-of-a/42597521?noredirect=1#comment72326167_42597521 – stovfl Mar 05 '17 at 18:37
  • Well, they use multiTHREADING, I use multiPROCESSING, they're not quite the same. Will try to implement though. – lte__ Mar 07 '17 at 14:41
  • The OP asks about Threading, the Answer Example uses Processes. – stovfl Mar 07 '17 at 16:33

0 Answers