
Say I have a celery worker that depends on a large module, X.

Since task definitions require a reference to the worker's app definition (e.g., @app.task), this implies that my "client" (the code scheduling the task) also needs to depend on this module.
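Roughly, the coupling I'm describing looks like this (a minimal sketch; worker.py, tasks.py, and the large module X stand in for my real code):

# worker.py -- defines the Celery app with the full worker configuration
from celery import Celery
app = Celery('worker', broker='redis://localhost')

# tasks.py -- imports both the app and the large module X
import X
from worker import app

@app.task
def process(data):
    return X.heavy_work(data)

# client.py -- importing the task in order to call it drags in X and the app
from tasks import process
process.delay('some payload')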

This doesn't make sense to me -- have I got this wrong?

A). I don't want my task caller to have these dependencies (e.g., they might be in different docker containers).

B). For security reasons I don't want my task caller to have access to this code.

Is there a way around it?

Thanks,

RB

rsb

2 Answers


Your client code can start tasks remotely without having to import the implementation of the tasks. You obviously must configure the client to connect to the same broker as the workers, but once that is done, you can use signatures to invoke the tasks:

import celery

# Invoke the task by name; the task's implementation is never imported here.
result = celery.signature("app.tasks.foo", args=(1,)).delay().get()

The first parameter to celery.signature is the name of the task. It is typically the fully qualified name of the module that contains the task (e.g. app.tasks in the code above) followed by the name of the task function (foo).
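Putting it together, a minimal client-side sketch (assuming a Redis broker/backend and a worker-side module app/tasks.py that defines foo; adjust the URLs and names to your setup):

from celery import Celery, signature

# The client only needs the transport configuration, not the task code.
client_app = Celery(broker='redis://localhost', backend='redis://localhost')

# Invoke by name; "app.tasks.foo" is resolved by the worker, not here.
sig = signature('app.tasks.foo', args=(1,), app=client_app)
result = sig.delay().get()  # .get() needs the result backend configured above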

Louis
  • Thanks Louis. Here's the problem though: in my module worker.py, I define "app", which holds the entire Celery configuration. But the "client" of the Celery worker only needs the transport config (e.g., BROKER_URL). Yet the "client" (caller) also has to reference tasks.py, and that module needs to refer to a Celery app instance. Why should the actual worker and the clients of the worker depend on the same app instance? I understand they both need the same transport config, but there is nothing else in common. – rsb Sep 24 '16 at 17:34
  • 1
  • I guess my issue is which config properties are required for the caller (given the docs assume the same instance is used for both); in my case the caller and worker are in different docker containers. – rsb Sep 24 '16 at 20:01

One way to achieve this is to use Celery's app.send_task method. You can configure the client to connect to the same broker as the workers with the code below.

from celery import Celery

# Either load the full configuration from a module:
# app = Celery()
# app.config_from_object('celeryconfig')
# ...or configure the broker and backend directly:
app = Celery('tasks', broker='redis://localhost', backend='redis://localhost')

Then you can send a task to the broker (queue) with the line below, without importing any worker code module:

app.send_task('tasks.add', (2, 2))

Any worker connected to the broker will pick up the task and execute it. You may need to add a name argument to the decorator in the worker's code module, e.g. @app.task(name='tasks.add'), so that the registered name matches the one passed to send_task. Check out this Celery thread for more information.
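For reference, the worker-side module could look like this (a sketch reusing the hypothetical Redis URLs from above; the explicit name is what lets the client's send_task string resolve):

# tasks.py on the worker side
from celery import Celery

app = Celery('tasks', broker='redis://localhost', backend='redis://localhost')

@app.task(name='tasks.add')  # must match the name used in send_task
def add(x, y):
    return x + y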

Alok Nayak