I managed to get Django, RabbitMQ, and Celery working on a single machine; I followed the instructions from here. Now I want to make them work together when they are on different servers. I do not want Django to know anything about Celery, nor Celery about Django.
So basically, I just want Django to send a message to a RabbitMQ queue (probably an id, the type of the task, and maybe some other info), and then I want RabbitMQ to deliver that message (when possible) to Celery on another server. Celery and Django should not know about each other; basically, I want an architecture where it is easy to replace any one of them.
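For illustration, here is roughly what I imagine the Django side doing, sketched with the pika client (the host, queue name, and message fields are made up):

    import json

    import pika

    # Connect to RabbitMQ and declare a durable queue (names are placeholders).
    connection = pika.BlockingConnection(pika.ConnectionParameters(host='rabbitmq-host'))
    channel = connection.channel()
    channel.queue_declare(queue='tasks', durable=True)

    # The message carries only data: an id, the task type, and extra info.
    message = {'id': 42, 'task': 'create_project', 'countdown': 10}
    channel.basic_publish(
        exchange='',
        routing_key='tasks',
        body=json.dumps(message),
        properties=pika.BasicProperties(delivery_mode=2),  # persist the message
    )
    connection.close()

This way Django only depends on a message format, not on Celery.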
Right now I have several calls in my Django code like

    create_project.apply_async(args, countdown=10)

I want to replace them with similar calls directly to RabbitMQ (as I said, Django should not depend on Celery). Then RabbitMQ should notify Celery (when possible) and Celery will do its job (probably interacting with Django, but through a REST interface).
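On the worker side I imagine a plain consumer instead of a Celery worker, something like this (again a sketch; the Django REST endpoint is made up):

    import json

    import pika
    import requests

    def handle(channel, method, properties, body):
        task = json.loads(body)
        # ... do the actual work for task['task'] here ...
        # Report the result back to Django over REST (URL is hypothetical).
        requests.post('http://django-host/api/tasks/%s/done' % task['id'])
        channel.basic_ack(delivery_tag=method.delivery_tag)

    connection = pika.BlockingConnection(pika.ConnectionParameters(host='rabbitmq-host'))
    channel = connection.channel()
    channel.queue_declare(queue='tasks', durable=True)
    channel.basic_consume(queue='tasks', on_message_callback=handle)
    channel.start_consuming()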
Also, I need to have Celery workers on two or more servers, and I want RabbitMQ to notify only one of them, depending on some field in the message. If this is too complicated, I could just check in every task (on the different machines) whether this is something that worker should do (e.g. by checking an IP address field in the message) and, if it is not, just stop executing the task.
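From what I understand, a direct exchange with per-worker routing keys would cover the "notify only one of them" part; a sketch (exchange, queue, and key names are made up):

    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters(host='rabbitmq-host'))
    channel = connection.channel()
    channel.exchange_declare(exchange='tasks', exchange_type='direct')

    # Each worker declares its own queue and binds it with its own key, so a
    # message published with routing_key='worker-1' reaches only that worker.
    channel.queue_declare(queue='tasks.worker-1', durable=True)
    channel.queue_bind(queue='tasks.worker-1', exchange='tasks', routing_key='worker-1')

    # Publisher side: derive the routing key from the field in the message.
    channel.basic_publish(exchange='tasks', routing_key='worker-1', body='{"id": 42}')

That would avoid the fallback of having every worker inspect each message and drop the tasks that are not meant for it.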
How can I achieve this? If possible, I would prefer code + configuration examples, not just a theoretical explanation.
Edit:
I think that for my use case Celery is total overhead. Simple RabbitMQ routing with custom clients will do the job. I already tried a simple use case (one server) and it works perfectly; it should be easy to make the communication multi-server ready. I do not like Celery: it is "magical", it hides too many details, and it is not easy to configure. But I will leave this question open, because I am interested in others' opinions.