
I want to submit Dask tasks that take a large (gigabyte-scale) argument, and to run the same function many times with different (small) parameters. What is the best way to do this?

Example (bad)

This uses the concurrent.futures interface. We could use the dask.delayed interface just as easily.

import numpy as np

x = np.random.random(size=100000000)  # 800 MB array
params = list(range(100))             # 100 small parameters

def f(x, param):
    pass

from dask.distributed import Client
c = Client()

futures = [c.submit(f, x, param) for param in params]

But this is slower than I would expect, or it results in memory errors.

– MRocklin

1 Answer


OK, so what's wrong here is that each task contains the numpy array x, which is large. For each of the 100 tasks that we submit, we need to serialize x, send it up to the scheduler, send it over to a worker, and so on.
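As a rough back-of-the-envelope check (a sketch using numpy's nbytes, not part of the original example), this is how much data the naive submit loop ships in total:

# Sketch: how much data the naive approach moves, given the 800 MB array and 100 params above
print(x.nbytes / 1e6, "MB per task")               # ~800 MB for each copy of x
print(x.nbytes * len(params) / 1e9, "GB overall")  # ~80 GB across the 100 tasks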

Instead, we'll send the array up to the cluster once:

[future] = c.scatter([x])

Now future is a token that points to the array x living on the cluster, and we can submit tasks that refer to this remote future instead of the numpy array on our local client.

# futures = [c.submit(f, x, param) for param in params]  # sends x each time
futures = [c.submit(f, future, param) for param in params]  # refers to remote x already on cluster

This is now much faster, and lets Dask control data movement more effectively.
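When you eventually want the results back on the client, a minimal sketch (assuming f returns something small enough to pull locally) is to gather the futures:

results = c.gather(futures)  # block until all tasks finish and fetch their results to the client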

Scatter data to all workers

If you expect every worker to need the array x eventually, you may want to broadcast it to all workers up front:

[future] = c.scatter([x], broadcast=True)
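As a quick sanity check (a sketch, not required for correctness), you can ask the scheduler which workers hold a copy of the scattered array:

print(c.who_has([future]))  # maps the future's key to the addresses of workers holding a copy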

Use Dask Delayed

Futures work fine with dask.delayed as well. There is no performance benefit here, but some people prefer this interface:

# futures = [c.submit(f, future, param) for param in params]

from dask import delayed
lazy_values = [delayed(f)(future, param) for param in params]
futures = c.compute(lazy_values)
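
Either way you end up with futures, so the same waiting and gathering pattern applies (a sketch, assuming the results are small enough to bring back):

from dask.distributed import wait

wait(futures)                # block until the delayed computations finish on the cluster
results = c.gather(futures)  # then fetch the results, just as in the submit version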
– MRocklin

  • Thanks, very useful! The usage of the future returned by the scatter command (as an argument to a function in your example) is not explained in the documentation. – PierreE Apr 11 '18 at 22:46
  • Within a task (`f` above), is it possible to reference `future` (the broadcast, cluster-resident version of `x`) without passing it as an argument to `submit`? For example, in Spark you can broadcast a variable and then just reference it in your tasks through the global context. – eggie5 May 04 '20 at 03:46