I am looking for a Python library or framework that manages task distribution (e.g. a task queue).
However, tasks will require specialized workers: worker A can only handle tasks of type a, workers B and C only tasks of type b, and so on.
Also, these workers will run on different computers and cannot share the same codebase (since, as on a fabrication line, each task involves controlling specific hardware that only one computer has access to).
I have looked at libraries like Python RQ and Celery, but if I understand correctly, they require the same codebase to run on the different workers and are meant for distributing computation. What I am looking for is basically just the management of an abstract task queue, plus a mechanism for workers to fetch tasks over the network. A task would then just be some data plus metadata about its progress, errors, results, etc. A bonus would be if tasks could depend on one another, so that a task can use the outcome of another task.
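To make the requirement concrete, here is a minimal sketch (not any particular library's API; all names are made up for illustration) of the behavior I have in mind: the broker keeps one queue per task type, so a specialized worker only ever fetches tasks of the type it can handle, and a task is just a dict carrying its payload and progress/result metadata.

```python
import json
import queue

class Broker:
    """Illustrative in-memory broker: one queue per task type."""

    def __init__(self):
        self._queues = {}  # task type -> queue of task dicts

    def submit(self, task_type, payload):
        # A task is just data plus metadata about its state.
        task = {"type": task_type, "payload": payload,
                "status": "pending", "result": None, "error": None}
        self._queues.setdefault(task_type, queue.Queue()).put(task)
        return task

    def fetch(self, task_type, timeout=1.0):
        # A worker for type "a" only ever asks for type-"a" tasks.
        return self._queues[task_type].get(timeout=timeout)

broker = Broker()
broker.submit("a", {"part_id": 42})

task = broker.fetch("a")   # worker A fetches a task it can handle
task["status"] = "done"    # worker reports progress/result back
task["result"] = "ok"
print(json.dumps(task))
```

In the real system the `fetch` call would of course go over the network (which is exactly the part I would like a library to handle), and the broker would persist the metadata so that other tasks can depend on a result.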
Is there a simple library that takes care of managing the queue, the network protocol, etc., and provides what I'm looking for?