Based on my experiments, I'm guessing the answer to this is no. But perhaps it might be possible with some changes to the concurrent.futures module.
I would like to submit a worker that itself creates an executor and submits work, and then return that second future to the main process. I have this MWE, which does not work, presumably because the f2 object becomes disassociated from its parent executor when it is sent back over multiprocessing. (It does work if both executors are ThreadPoolExecutor, because then the f2 object is never copied.)
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor
import time

def job1():
    try:
        ex2 = ThreadPoolExecutor()
        time.sleep(2)
        f2 = ex2.submit(job2)
    finally:
        ex2.shutdown(wait=False)
    return f2

def job2():
    time.sleep(2)
    return 'done'

try:
    ex1 = ProcessPoolExecutor()
    f1 = ex1.submit(job1)
finally:
    ex1.shutdown(wait=False)

print('f1 = {!r}'.format(f1))
f2 = f1.result()
print('f1 = {!r}'.format(f1))
print('f2 = {!r}'.format(f2))
My question is: is there any safe way to send a Future object across a multiprocessing Pipe and receive its value when it finishes? It seems like I might need to set up another executor-like construct that listens for results over another Pipe.
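For what it's worth, here is a sketch of the kind of construct I mean, under the assumption that a Pipe connection can be passed to a pool worker (the executor's internal queue uses multiprocessing's pickler, so this appears to work, at least with the fork start method — I'm not certain it holds on every platform). The names `job1`, `job2`, and `proxy_future` are mine: instead of returning the nested Future itself, the worker pushes its *value* through the pipe, and the main process wraps the receiving end in a fresh local Future completed by a listener thread.

```python
from concurrent.futures import Future, ProcessPoolExecutor, ThreadPoolExecutor
import multiprocessing as mp
import threading
import time

def job2():
    time.sleep(0.5)
    return 'done'

def job1(conn):
    # Runs in the worker process: do the nested submit as before, but send
    # the finished *value* through the pipe instead of returning the Future.
    with ThreadPoolExecutor() as ex2:
        f2 = ex2.submit(job2)
        conn.send(f2.result())
    conn.close()

def proxy_future(conn):
    # Main-process side: a stand-in Future, completed by a background
    # thread when the value arrives over the pipe.
    fut = Future()
    def listen():
        try:
            fut.set_result(conn.recv())
        except EOFError as exc:  # pipe closed without a value
            fut.set_exception(exc)
    threading.Thread(target=listen, daemon=True).start()
    return fut

if __name__ == '__main__':
    parent_conn, child_conn = mp.Pipe()
    with ProcessPoolExecutor() as ex1:
        ex1.submit(job1, child_conn)
        f2 = proxy_future(parent_conn)
        print(f2.result())  # prints 'done'
```

This sidesteps pickling the Future entirely, at the cost that the main process only ever sees a proxy, not the original f2.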