I am writing some Python code using multiprocessing & pathos. I have written a small test program to get used to multiprocessing; it runs fine on my local machine, but it refuses to run on a different cluster.
I am getting the following error:
Traceback (most recent call last):
  File "./multi.py", line 116, in <module>
    pool = pathos_multiprocessing.Pool(processes=pool_size,maxtasksperchild=1,)
  File "/usr/local/lib/python3.4/dist-packages/multiprocess/pool.py", line 150, in __init__
    self._setup_queues()
  File "/usr/local/lib/python3.4/dist-packages/multiprocess/pool.py", line 243, in _setup_queues
    self._inqueue = self._ctx.SimpleQueue()
  File "/usr/local/lib/python3.4/dist-packages/multiprocess/context.py", line 110, in SimpleQueue
    from .queues import SimpleQueue
  File "/usr/local/lib/python3.4/dist-packages/multiprocess/queues.py", line 22, in <module>
    import _multiprocess as _multiprocessing
ImportError: No module named '_multiprocess'
but when I run

pip3 list

both pathos and multiprocess are clearly there:
multiprocess (0.70.4)
nbconvert (4.2.0)
nbformat (4.0.1)
nose (1.3.1)
notebook (4.2.0)
numpy (1.10.4)
oauthlib (0.6.1)
pathos (0.2.0)
Any bright ideas why this might be happening would be welcome!
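In case it helps narrow things down, here is a minimal diagnostic I can run on both machines. It only checks whether the interpreter can locate a module by name, so it should show the cluster failing to find the `_multiprocess` C extension (which the traceback says is missing) while still finding the stdlib's own `_multiprocessing` extension for comparison:

```python
import importlib.util

def has_module(name):
    """Return True if the named module can be located on this interpreter."""
    return importlib.util.find_spec(name) is not None

# The C extension shipped by the multiprocess package;
# per the traceback, this is False on the cluster:
print(has_module('_multiprocess'))

# The stdlib C extension, for comparison (should be True everywhere):
print(has_module('_multiprocessing'))
```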
The small test code is:
#!/usr/bin/env python3
import pathos.multiprocessing as mp
import os
import random

class Pool_set:
    def pool_fun(directory_name):
        # make a directory named after the number, cd into it, and sleep
        cwd = os.getcwd()
        os.mkdir(str(directory_name))
        directory = os.path.join(cwd, str(directory_name))
        os.chdir(directory)
        os.system('sleep ' + str(directory_name))
        cwd2 = os.getcwd()
        print(cwd2)
        with open('test_file.out', 'w') as test_file:
            test_file.write(cwd2)
        print("Finished in " + directory)
        os.chdir(cwd)

if __name__ == '__main__':
    config = []
    pool_set = Pool_set
    for i in random.sample(range(1, 100), 3):
        config.append(i)
    pool_size = mp.cpu_count()
    pool = mp.Pool(processes=pool_size, maxtasksperchild=1)
    pool_outputs = pool.map(pool_set.pool_fun, config)
    pool.close()
    pool.join()
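For what it's worth, a stdlib-only version of the same pattern runs fine on the cluster, which makes me think the problem is in the multiprocess package's `_multiprocess` extension rather than the cluster's Python itself. This sketch swaps pathos for the standard library `multiprocessing` and a trivial worker, just to isolate the failure:

```python
#!/usr/bin/env python3
# Stdlib-only check: same Pool/map pattern, no pathos/multiprocess involved.
import multiprocessing as mp

def square(x):
    # trivial stand-in for the real worker
    return x * x

if __name__ == '__main__':
    with mp.Pool(processes=mp.cpu_count()) as pool:
        print(pool.map(square, [1, 2, 3]))
```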