
I am using Dask and Cython in my project: I register a callback with the client so that each worker can invoke the Cython code, and then collect the results from the Cython code back into my Dask/Python code. When I create a cluster with processes=True, it works fine. But as soon as I set processes=False, I get the following error:

distributed.core - ERROR - 'tuple' object does not support item assignment
Traceback (most recent call last):
  File "/home/user/anaconda3/envs/Dask/lib/python3.7/site-packages/distributed/core.py", line 555, in handle_stream
    msgs = await comm.read()
  File "/home/user/anaconda3/envs/Dask/lib/python3.7/site-packages/distributed/comm/inproc.py", line 199, in read
    msg = nested_deserialize(msg)
  File "/home/user/anaconda3/envs/Dask/lib/python3.7/site-packages/distributed/protocol/serialize.py", line 549, in nested_deserialize
    return replace_inner(x)
  File "/home/user/anaconda3/envs/Dask/lib/python3.7/site-packages/distributed/protocol/serialize.py", line 541, in replace_inner
    x[k] = replace_inner(v)
  File "/home/user/anaconda3/envs/Dask/lib/python3.7/site-packages/distributed/protocol/serialize.py", line 534, in replace_inner
    x[k] = deserialize(v.header, v.frames)
  File "/home/user/anaconda3/envs/Dask/lib/python3.7/site-packages/distributed/protocol/serialize.py", line 388, in deserialize
    return loads(header, frames)
  File "/home/user/anaconda3/envs/Dask/lib/python3.7/site-packages/distributed/protocol/serialize.py", line 83, in pickle_loads
    buffers[i] = buf.cast(mv.format)
TypeError: 'tuple' object does not support item assignment
distributed.worker - ERROR - 'tuple' object does not support item assignment

This is the code snippet I am using:

import dask.dataframe as dd
from dask.distributed import Client, LocalCluster, wait

cluster = LocalCluster(processes=False)
client = Client(cluster)
client.register_worker_callbacks(init_pyx)  # init_pyx registers the Cython code with each worker
df = dd.read_csv(path_to_csv_file)
# handle_partition uses the Cython code to preprocess each partition of the
# dataframe; meta is the metadata of the preprocessed partition
processed_df = df.map_partitions(lambda part: handle_partition(part), meta=meta)
processed_df = processed_df.persist()
wait(processed_df)  # the error appears once the results are gathered
client.close()
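The traceback shows the failure happens while deserializing the result, when distributed tries to reassemble pickled out-of-band buffers. One thing worth trying (a sketch only; `handle_partition` here is a hypothetical stand-in for the real Cython preprocessing, and the column name `x` is invented for illustration) is to copy any typed memoryview the Cython code hands back into a plain ndarray before returning, so the partition pickles without special buffer handling:

```python
import pickle

import numpy as np
import pandas as pd


def handle_partition(part: pd.DataFrame) -> pd.DataFrame:
    # Hypothetical stand-in for the Cython step: suppose the Cython code
    # returns a typed memoryview over the processed values.
    view = memoryview(part["x"].to_numpy())
    # Copy the view into an ordinary ndarray before returning, so the
    # result serializes as a regular pickled object.
    return part.assign(x=np.asarray(view).copy())


part = pd.DataFrame({"x": np.arange(4, dtype="float64")})
result = handle_partition(part)
# The processed partition now round-trips through pickle cleanly.
restored = pickle.loads(pickle.dumps(result))
```

This does not fix the underlying distributed bug, but it may sidestep the serialization path that raises the TypeError.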

Also, earlier I was getting an error related to memoryviews, but after I reinstalled Dask that error vanished. Now this new "TypeError: 'tuple' object does not support item assignment" error appears instead.

I am new to Dask, so if anyone has an idea how to tackle this, please help me in this regard. Thanks in advance.

Vivek kala
Rahul
    Please share your code, a minimal reproducible example. Do you, for example, have a memoryview with a shape or strides containing zeros? – mdurant Mar 30 '21 at 12:55
  • Dear mdurant, the problem was resolved when I reinstalled Dask. Thank you for your kind response. But kindly check the post again, which describes a weird issue. – Rahul Apr 01 '21 at 07:33
  • How are you installing and which exact version of dask and distributed do you have? – mdurant Apr 01 '21 at 12:57
  • I used conda to install Dask. I have the same version of dask and distributed, which is 2021.3.1. – Rahul Apr 01 '21 at 15:21

0 Answers