
Now, I want to publish a register event to some dedicated exchange, so that I can use Celery to retrieve and process it remotely.

Actually, I have used the send_task function to achieve this, but it requires passing the task name to indicate which task should execute and consume the message, so it does not fit my goal very well.
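For reference, this is roughly how I call it now (a minimal sketch; the broker URL and the task name tasks.send_email are just placeholders):

from celery import Celery

# placeholder app and broker URL, for illustration only
app = Celery('my_project', broker='amqp://guest@localhost//')

# send_task needs the exact registered task name, which ties the
# publisher to one specific consumer task
app.send_task('tasks.send_email', args=('new_user@example.com',))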

What I want is something like this:

  1. Publish a register message to a certain exchange;
  2. Remote machine 1 subscribes to this topic or routing key, catches the message, and uses it to execute a task;
  3. Remote machine 2 does the same as machine 1 but executes another task after receiving the message (and may need to reply to a certain queue); a rough sketch of this pattern follows the example below.

For example, just like this workflow:

register:

  • send_email
  • generate_info

    ......
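In other words, something like the following sketch with the lower-level kombu API (the exchange, queue names, and payload are made up for illustration):

from kombu import Connection, Exchange, Queue

# hypothetical exchange and queues, one queue per kind of work
register_exchange = Exchange('register', type='fanout')
email_queue = Queue('register.send_email', exchange=register_exchange)
info_queue = Queue('register.generate_info', exchange=register_exchange)

with Connection('amqp://guest@localhost//') as conn:
    producer = conn.Producer(serializer='json')
    # one publish; every queue bound to the fanout exchange
    # (and therefore every machine) gets its own copy
    producer.publish({'user_id': 42}, exchange=register_exchange,
                     declare=[email_queue, info_queue])

Each remote machine would then consume only from its own queue.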

Derek

1 Answer


This is what I do if I need a non-standard exchange.

In my celeryconfig I specify that exchange and assign a queue to it like this (in my case I need a fanout exchange):

from kombu.common import Broadcast
from kombu import Exchange

# declare a queue bound to a custom fanout exchange
CELERY_QUEUES = (
    Broadcast(name='queue_name', exchange=Exchange('queue_name', type='fanout')),
)

I then spawn a worker with celery multi and assign it to my specific queue like this:

celery multi start 1 -A my_project -Q:1 queue_name -c:1 1 (other options go here)

And then I can send my task to that queue like this:

from my_project import my_fancy_task

# route the task to the queue declared above
my_fancy_task.apply_async(args=(x, y, z), queue='queue_name')

I do not quite understand your specific use case. If you need a worker on one host to consume tasks from one queue and a worker on another host to consume tasks from another queue, then just split your tasks into two queues and configure each host to start workers assigned to whichever queue makes sense for you. Maybe this will help: Topic exchange ambiguity with RabbitMQ
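For example, something along these lines (the queue and task names here are just placeholders):

from kombu import Exchange, Queue

# celeryconfig.py -- one queue per kind of work
CELERY_QUEUES = (
    Queue('email_tasks', Exchange('email_tasks'), routing_key='email_tasks'),
    Queue('info_tasks', Exchange('info_tasks'), routing_key='info_tasks'),
)

# route each task to its own queue
CELERY_ROUTES = {
    'tasks.send_email': {'queue': 'email_tasks'},
    'tasks.generate_info': {'queue': 'info_tasks'},
}

Then each host starts a worker for just one of the queues, e.g. celery -A my_project worker -Q email_tasks on the first host and celery -A my_project worker -Q info_tasks on the second.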

Greg0ry
  • Thanks for your generous help. I indeed want to keep the producer and consumer separated, so I can publish one message and have it consumed by different hosts. In your solution, I think I have to call one certain task, which is contrary to my goal of calling multiple tasks at once. Yesterday I found a solution using the lower-level API of Celery, and it worked for me partly, even though it doesn't seem beautiful. – Derek Jul 25 '18 at 08:07
  • In that case you should not need to worry about the exchange; just start your worker on the second host and make sure that worker connects to the same `rabbitmq` queue as your producer. I personally run clustered `rabbitmq` on my hosts but you do not have to do that - you can simply run `rabbitmq` on one host and connect consumers and publishers to it. When you run your consumers, make sure you specify your specific queues using the `-Q` option. I still have trouble understanding this: `my goal of calling multiple tasks at once` - what do you mean by that? – Greg0ry Jul 25 '18 at 18:20