
I am interested in using Dask Distributed as a task executor. In Celery it is possible to assign a task to a specific worker. How can this be done with Dask Distributed?

1 Answer


There are two options (a combined runnable sketch follows the list):

  1. Specify workers by name, host, or IP address (only positive declarations are supported: you can say where a task may run, not where it may not):

    dask-worker scheduler_address:8786 --name worker_1
    

    and then, on the client side, use one of the following:

    client.map(func, sequence, workers='worker_1')
    client.map(func, sequence, workers=['192.168.1.100', '192.168.1.100:8989', 'alice', 'alice:8989'])
    client.submit(f, x, workers='127.0.0.1')
    client.submit(f, x, workers='127.0.0.1:55852')
    client.submit(f, x, workers=['192.168.1.101', '192.168.1.100'])
    future = client.compute(z, workers={z: '127.0.0.1',
                                        x: '192.168.0.1:9999'})
    future = client.compute(z, workers={(x, y): ['192.168.1.100', '192.168.1.101:9999']})
    
  2. Use the resources concept. You can declare the resources available on a worker like this:

    dask-worker scheduler:8786 --resources "CAN_PROCESS_QUEUE_ALICE=2"
    

    and specify the required resources when submitting a task:

    client.submit(aggregate, processed, resources={'CAN_PROCESS_QUEUE_ALICE': 1})
    

    or

    z = some_dask_object.map_partitions(func)
    z.compute(resources={tuple(y.__dask_keys__()): {'CAN_PROCESS_QUEUE_ALICE': 1}})
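
Putting the two options together, the sketch below shows both routing mechanisms against a running cluster. The scheduler address, the worker name alice, and the process function are illustrative assumptions for this sketch, not from the original answer:

    from dask.distributed import Client

    # Connect to an already-running scheduler (address assumed for this sketch).
    client = Client('tcp://127.0.0.1:8786')

    def process(x):
        return x * 2

    # Option 1: pin the task to the worker started with `--name alice`;
    # by default it will not run anywhere else.
    future = client.submit(process, 10, workers='alice')
    print(future.result())  # 20

    # Option 2: run only on workers started with
    # --resources "CAN_PROCESS_QUEUE_ALICE=2"; each task holds 1 unit of the
    # resource while running, so at most 2 such tasks run concurrently per worker.
    future = client.submit(process, 20, resources={'CAN_PROCESS_QUEUE_ALICE': 1})
    print(future.result())  # 40

The practical difference: names and addresses pin work to one particular worker, while resources let the scheduler choose any worker that advertises the capability, which holds up better when several workers can serve the same queue.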
    