I am using Dask and Cython in my project: I register the Cython code with the client via a worker callback, invoke it from my Dask Python code, and collect the result it returns. When I create the cluster with processes=True, everything works fine. But as soon as I set processes=False, I get the following error:
distributed.core - ERROR - 'tuple' object does not support item assignment
Traceback (most recent call last):
File "/home/user/anaconda3/envs/Dask/lib/python3.7/site-packages/distributed/core.py", line 555, in handle_stream
msgs = await comm.read()
File "/home/user/anaconda3/envs/Dask/lib/python3.7/site-packages/distributed/comm/inproc.py", line 199, in read
msg = nested_deserialize(msg)
File "/home/user/anaconda3/envs/Dask/lib/python3.7/site-packages/distributed/protocol/serialize.py", line 549, in nested_deserialize
return replace_inner(x)
File "/home/user/anaconda3/envs/Dask/lib/python3.7/site-packages/distributed/protocol/serialize.py", line 541, in replace_inner
x[k] = replace_inner(v)
File "/home/user/anaconda3/envs/Dask/lib/python3.7/site-packages/distributed/protocol/serialize.py", line 534, in replace_inner
x[k] = deserialize(v.header, v.frames)
File "/home/user/anaconda3/envs/Dask/lib/python3.7/site-packages/distributed/protocol/serialize.py", line 388, in deserialize
return loads(header, frames)
File "/home/user/anaconda3/envs/Dask/lib/python3.7/site-packages/distributed/protocol/serialize.py", line 83, in pickle_loads
buffers[i] = buf.cast(mv.format)
TypeError: 'tuple' object does not support item assignment
distributed.worker - ERROR - 'tuple' object does not support item assignment
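The failing line in the traceback, buffers[i] = buf.cast(mv.format), assigns into buffers, which apparently arrives as a tuple on the in-process (inproc) transport; tuples are immutable, so item assignment raises exactly this TypeError. A minimal stdlib-only sketch of the failure, and of the obvious fix of copying the frames into a list first, assuming that is indeed how the frames arrive:

```python
import struct

# Three doubles packed into raw bytes, standing in for a serialized frame.
raw = struct.pack("3d", 1.0, 2.0, 3.0)

# Frames arriving as a tuple, as the inproc path appears to deliver them.
frames = (memoryview(raw),)

try:
    frames[0] = frames[0].cast("d")   # what the failing line effectively does
except TypeError as exc:
    print(exc)                        # 'tuple' object does not support item assignment

# Copying into a list makes the in-place cast legal.
buffers = list(frames)
buffers[0] = buffers[0].cast("d")
```

If that is what is happening, the fix belongs inside distributed itself rather than in user code, so upgrading dask and distributed to matching recent versions is probably the practical workaround.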
This is the code snippet I am using:
import dask.dataframe as dd
from dask.distributed import Client, LocalCluster, wait

cluster = LocalCluster(processes=False)
client = Client(cluster)

# init_pyx is the callback by which each worker registers with the Cython code
client.register_worker_callbacks(init_pyx)

df = dd.read_csv(path_to_csv_file)

# handle_partition preprocesses each partition of the dataframe using the
# Cython code; meta is the metadata describing the preprocessed partitions
processed_df = df.map_partitions(handle_partition, meta=meta)

client.close()
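Conceptually, map_partitions just applies handle_partition to every partition of the dataframe; a stdlib-only sketch of that pattern (hypothetical names, with plain lists standing in for pandas partitions and a pure-Python transform standing in for the Cython one) is:

```python
# Hypothetical stand-in for the Cython-backed per-partition transform.
def handle_partition(part):
    return [row * 2 for row in part]

# Dask schedules this across workers; run serially, it is just a map
# over the partitions.
def map_partitions(func, partitions):
    return [func(part) for part in partitions]

partitions = [[1, 2], [3, 4, 5]]
processed = map_partitions(handle_partition, partitions)
```

Because each partition is handled independently, the error above is not about the transform itself but about how its inputs and outputs are shipped between scheduler and worker.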
Earlier I was also getting an error related to memoryviews, but after I reinstalled Dask that error vanished; now this new "TypeError: 'tuple' object does not support item assignment" appears instead.
I am new to Dask, so if anyone has an idea how to tackle this, please help me out. Thanks in advance.