I have an application that uses the producer-consumer pattern: multiple producers, one consumer. The catch is that inside each producer, after submitting the task, I want to wait for and retrieve the result of that task inside the same producer process:
from multiprocessing import Process, JoinableQueue

def producer(queue, work_item):
    queue.put(work_item)
    # ??? How to wait for the work_item to be done, and get back the result of processing work_item ???
    # "result" is the result of the consumer processing this item. I want to print the result INSIDE THE PRODUCER PROCESS
    print("the result of processing {} is {}".format(work_item, result))

def consumer(queue):
    while True:
        item = queue.get()
        # process the item (process() stands for the actual work)
        result = process(item)
        # ??? How to pass this result back to the producer ???
        # processing done
        queue.task_done()

if __name__ == "__main__":
    queue = JoinableQueue()
    p1 = Process(target=producer, args=(queue, 1))
    p2 = Process(target=producer, args=(queue, 2))
    p3 = Process(target=producer, args=(queue, 3))
    c1 = Process(target=consumer, args=(queue,))
    p1.start()
    p2.start()
    p3.start()
    c1.start()
Why do I want to put task submission and result retrieval inside the same producer? Because in my actual application:
- the producer could be a web request handler in a Flask web server,
- the consumer could be a TensorFlow image classifier.
In each web request, a user submits an image for classification. Classification is a time-consuming task, so the web request handler (my producer) delegates it to the TensorFlow image-classifying process (my consumer), but the request handler must then wait for the result and pass it back to the user.
So: how do I wait for and retrieve a result in the producer-consumer pattern?
P.S. I have already tried Celery, but here I am more interested in a core-Python implementation (or a general concurrency pattern), if there is one.
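For reference, here is a sketch of one idea I have been experimenting with (not sure it is idiomatic): give each producer its own one-way Pipe, tag every submitted task with the producer's id, and have the consumer send the result back down the matching pipe. classify() is just a stand-in for the real time-consuming work:

```python
from multiprocessing import Pipe, Process, Queue

def classify(item):
    # stand-in for the real work (e.g. TensorFlow inference)
    return item * 10

def producer(queue, work_item, producer_id, reply_recv):
    queue.put((producer_id, work_item))  # tag the task with its sender
    result = reply_recv.recv()           # block until the consumer replies
    print("the result of processing {} is {}".format(work_item, result))

def consumer(queue, reply_sends, n_tasks):
    # a real consumer would loop forever; bounded here so the demo exits
    for _ in range(n_tasks):
        producer_id, item = queue.get()
        reply_sends[producer_id].send(classify(item))

if __name__ == "__main__":
    queue = Queue()
    # one one-way pipe per producer: Pipe(duplex=False) -> (recv end, send end)
    pipes = {i: Pipe(duplex=False) for i in (1, 2, 3)}
    send_ends = {i: conn[1] for i, conn in pipes.items()}
    c1 = Process(target=consumer, args=(queue, send_ends, 3))
    producers = [Process(target=producer, args=(queue, i, i, pipes[i][0]))
                 for i in (1, 2, 3)]
    c1.start()
    for p in producers:
        p.start()
    for p in producers:
        p.join()
    c1.join()
```

Each producer blocks on its private reply channel, so results cannot be delivered to the wrong request handler. I am not sure whether this per-request pipe approach scales to a web server's request volume, which is part of why I am asking.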