
I am working with Celery and a RabbitMQ server. I created a Django project on a server (where the message queue and database exist) and it is working fine; I have also created multiple workers.

from kombu import Exchange, Queue
CELERY_CONCURRENCY = 8

CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']

CELERY_RESULT_BACKEND = 'amqp'
CELERYD_HIJACK_ROOT_LOGGER = True
CELERY_HIJACK_ROOT_LOGGER = True
BROKER_URL = 'amqp://guest:guest@localhost:5672//'

CELERY_QUEUES = (
  Queue('default', Exchange('default'), routing_key='default'),
  Queue('q1', Exchange('A'), routing_key='routingKey1'),
  Queue('q2', Exchange('B'), routing_key='routingKey2'),
)
CELERY_ROUTES = {
 'my_taskA': {'queue': 'q1', 'routing_key': 'routingKey1'},
 'my_taskB': {'queue': 'q2', 'routing_key': 'routingKey2'},
}


AMQP_SERVER = "127.0.0.1"
AMQP_PORT = 5672
AMQP_USER = "guest"
AMQP_PASSWORD = "guest"
AMQP_VHOST = "/"


CELERY_INCLUDE = ('functions',)


But I want to run workers from another server, so I need some information on how to run a worker on another system. A few sites I referred to say that we need to run the Django project on the remote system as well. Is that necessary?

krishna

4 Answers


Here is the gist of the idea:

On Machine A:

  1. Install Celery & RabbitMQ.
  2. Configure RabbitMQ so that Machine B can connect to it.
  3. Create my_tasks.py with some tasks and put some tasks in the queue.
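For step 2, note that since RabbitMQ 3.3 the default guest user can only connect from localhost, so Machine B usually needs a dedicated user. A hedged sketch of the broker-side configuration commands (the user name, password, and vhost are placeholder values):

```shell
# Run on Machine A. Creates a user that Machine B can use to connect
# remotely; 'myuser', 'mypassword', and the vhost '/' are example values.
rabbitmqctl add_user myuser mypassword
rabbitmqctl set_permissions -p / myuser ".*" ".*" ".*"
# Also make sure port 5672 is open to Machine B (firewall / security group).
```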

On Machine B:

  1. Install Celery.
  2. Copy my_tasks.py file from machine A to this machine.
  3. Run a worker to consume the tasks

I had the same requirement and experimented with Celery. It is a lot easier than it sounds. I wrote a detailed blog post on it a few days back. Check out how to send tasks to remote machine?

Chillar Anand
  • i have followed the documentation created the file as shown but while running the celery worker it is giving the following error------------- consumer: Cannot connect to amqp://krish:**@123.456.78.9:5672/321.654.5.111: [Errno 113] No route to host. Trying again in 6.00 seconds... – krishna Nov 18 '14 at 11:38
  • 1
    looks like there is a connection problem with rabbitmq. try running the worker on other machine with same config and see if it works. – Chillar Anand Nov 18 '14 at 11:44
  • i have tried same on another system but it is also raising same issue – krishna Nov 18 '14 at 12:04
  • another error has occurred when tried with a new system----Cannot connect to amqp://krish:**@123.456.78.9:5672/321.654.5.111:timed out – krishna Nov 19 '14 at 11:33
  • I could get Celery to connect to rabbitmq, but I couldn't get rabbitmq to send tasks to celery. See question here: http://stackoverflow.com/questions/40453253/amqp-connection-reset-by-peer-but-celery-connected – ABM Nov 07 '16 at 14:51
  • I install RabbitMQ on machine B instead of A. On machine B the tasks can be executed successfully, but machine A cannot get the result. – LeonF Dec 19 '16 at 18:35
  • @ChillarAnand, is there a way to callback to client after the remote celery worker has completed its task? – sattva_venu Aug 14 '20 at 12:27
  • 1
    @sattva_venu celery signals can be used for that. – Chillar Anand Aug 15 '20 at 11:47
  • @ChillarAnand thanks, will give a try on celery signals. – sattva_venu Aug 15 '20 at 17:34
  • @ChillarAnand, when celery worker is configured to run on remote host, the task.delay() runs as a synchronous call. Is this expected behavior when celery worker runs on remote host? – sattva_venu Aug 22 '20 at 07:00
  • @ChillarAnand I tried the implementation as per the link provided in the solution, it works fine. Only issue is I see the add.delay() waits for celery worker to complete its task, so it is working as synchronous call instead of asynchronous call. Did you observe the same behavior? – sattva_venu Aug 23 '20 at 04:20
  • No. add.delay just queues up the task in the broker. It won't wait for the worker to complete the task. – Chillar Anand Aug 24 '20 at 05:31
  • Do you need to replicate the Django environment and database in the remote host where the workers are ? – Florent Apr 22 '22 at 09:03
  • 1
    Yes, you need to replicate django env. You can use the same db there too. @Florent – Chillar Anand Apr 22 '22 at 09:15
  • Hi @PandikuntaAnandReddy, thanks for your answer. Do you think it's possible that workers communicate with a remote database through API ? The workers could then GET (PUT) data from (to) the machine with the Celery client. This is to avoid having multiple database for the same application. – Florent Apr 22 '22 at 11:15
  • 1
    Yes. Celery workers can communicate with remote db. Ensure remote connections are allowed in db settings. @Florent – Chillar Anand Apr 24 '22 at 12:39

You can make use of app.send_task() with something like the following in your Django project:

from celery import Celery
import my_client_config_module

app = Celery()
app.config_from_object(my_client_config_module)

app.send_task('dotted.path.to.function.on.remote.server.relative.to.worker',
              args=(1, 2))
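A minimal `my_client_config_module` for the snippet above might look like this; the broker host and credentials are assumptions for illustration, not values from the question:

```python
# my_client_config_module.py -- client-side Celery settings only.
# The client never imports the task code; it only needs the broker URL
# (and a result backend if you want to fetch return values).
broker_url = 'amqp://myuser:mypassword@remote-broker-host:5672//'
result_backend = 'rpc://'
task_serializer = 'json'
```

send_task() returns an AsyncResult, so `app.send_task(...).get(timeout=10)` can fetch the return value once a result backend is configured.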
lajarre
  • How do I use `send_task` in case of `subtasks`? `app.send_task('myapp.send_push_notification', (json.dumps(payload1), ), link=app.send_task('differentapp.save_pn_response', (json.dumps(payload2), )))` – Hussain Feb 24 '16 at 09:08
  • Hussain, you can't transmit code remotely. The code you want to run must be set up on the Celery instance you're remotely talking to. You're just sending the name of the code to run and the arguments. It's up to you to figure out how the data gets across (if it's more than some simple arguments). I recommend building a REST endpoint your remote server can query, or maybe set up some Lambdas on AWS or something. Ain't no shortcuts when it comes to clustering! – Shayne Mar 12 '19 at 11:11

First, think about how Celery really works.

The Celery producer adds a task to the queue with its name and other important headers that identify where your task lives.

Celery does not add a complete executable function to the MQ.

So, when you look at the worker (consumer) side:

Celery gets the task details from the MQ and tries to run them. To run a task, the module/files/environment/codebase needed to execute it must be available on the worker.

Now let's come to your question...

You are trying to set up a worker on a separate machine, so logically, to execute the function a task points to, you need the complete code environment of the tasks on that machine, and you must connect to the MQ where the tasks live (otherwise, how are you going to get tasks from the MQ?).
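To make this concrete, here is a simplified, illustrative sketch of what actually travels over the MQ: only the task's registered name plus serialized arguments and headers, never the function's code. The dict layout is an approximation, not Celery's exact wire format:

```python
import json

# Roughly what the producer publishes (simplified; real Celery messages
# also carry ids, retry counts, eta, and other headers).
message = {
    'headers': {'task': 'my_tasks.add'},  # dotted name, looked up on the worker
    'body': json.dumps([[2, 3], {}]),     # args and kwargs, serialized
}

# The worker can only run this if 'my_tasks.add' is importable locally.
# This dict stands in for the worker's task registry:
registry = {'my_tasks.add': lambda x, y: x + y}

args, kwargs = json.loads(message['body'])
result = registry[message['headers']['task']](*args, **kwargs)
print(result)  # -> 5
```

This is why the worker machine needs the task code: the name in the message is useless unless it resolves to a function in the worker's own environment.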

GrvTyagi

Basically, I will second ChillarAnand's answer. I would like to comment on his answer, but I can't because I don't have 50 reputation.

so...

the answer to your question...

First, you should read "how to send tasks to remote machine?", as ChillarAnand mentioned.

That is a really good article, with one small flaw: it does not have '@app.task' on the function def add() in remote.py, which caused a problem and confused me as a newbie to Celery.

And for the "[Errno 113] No route to host" part:

I guess you have a firewall running on your RabbitMQ server; you might want to check. Most of the time it is iptables, but it could be something else. Switch it off, or change the rules. Then you can give it another try.
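When debugging "[Errno 113] No route to host" or connection timeouts, a quick reachability check run from the worker machine helps separate network/firewall problems from Celery misconfiguration. A minimal sketch (host and port are whatever your broker actually uses):

```python
import socket

def broker_reachable(host, port=5672, timeout=5):
    """Return True if a TCP connection to the broker can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError as exc:  # covers 'No route to host', refused, timed out
        print(f"Cannot reach {host}:{port}: {exc}")
        return False
```

If this returns False, fix the network or firewall first; Celery cannot work around a closed port 5672.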

bluebird_lboro