Using Dask distributed, I am trying to submit a function that lives in another file named worker.py. On the workers I get the following error:

    No module named 'worker'

However, I'm unable to figure out what I'm doing wrong here.
Here is a sample of my code:
```python
import queue

import worker  # local module, defined in worker.py

def run(self):
    dask_queue = queue.Queue()
    remote_queue = self.executor.scatter(dask_queue)
    map_queue = self.executor.map(worker.run, remote_queue)
    result = self.executor.gather(map_queue)

    # Load data into the queue
    for option in self.input.get_next_option():
        remote_queue.put([self.server, self.arg, option])
```
Here is the complete traceback obtained on the worker side:
```
distributed.core - INFO - Failed to deserialize b'\x80\x04\x95\x19\x00\x00\x00\x00\x00\x00\x00\x8c\x06worker\x94\x8c\nrun\x94\x93\x94.'
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/distributed/core.py", line 74, in loads
    return pickle.loads(x)
ImportError: No module named 'worker'
distributed.worker - WARNING - Could not deserialize task
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/distributed/worker.py", line 496, in compute_one
    task)
  File "/usr/local/lib/python3.5/dist-packages/distributed/worker.py", line 284, in deserialize
    function = loads(function)
  File "/usr/local/lib/python3.5/dist-packages/distributed/core.py", line 74, in loads
    return pickle.loads(x)
ImportError: No module named 'worker'
```
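One thing I noticed while digging into this (my own understanding, so I may be off): the payload in the log, `b'...\x8c\x06worker\x94\x8c\nrun\x94\x93\x94.'`, contains only the strings `worker` and `run`. As far as I know, pickle serializes a top-level function by reference (module name plus function name), not by value, so the receiving worker process has to be able to import that module itself. A minimal sketch of what I mean, using the standard-library `json` module as a stand-in for my `worker` module:

```python
import pickle
import json

# pickle stores a reference to the function (its module and name),
# not the function's code itself.
payload = pickle.dumps(json.loads)

# The payload contains the module and attribute names as strings...
assert b"json" in payload
assert b"loads" in payload

# ...so unpickling in another process re-imports the module and looks
# the function up by name. If that process cannot import the module,
# pickle.loads raises exactly the ImportError seen in my traceback.
func = pickle.loads(payload)
print(func('{"a": 1}'))  # prints {'a': 1}
```

If that's right, it would explain the error: my scheduler/worker processes don't have worker.py on their Python path, so `pickle.loads` fails when it tries to re-import the module.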