I am using django-celery with RabbitMQ as my broker (the guest RabbitMQ user has full access on the local machine). I have a bunch of projects, each in its own virtualenv, but recently needed Celery on two of them. I have a single instance of RabbitMQ running.
(project1_env)python manage.py celery worker
...the normal Celery startup output, then:
[Configuration]
broker: amqp://guest@localhost:5672//
app: default:0x101bd2250 (djcelery.loaders.DjangoLoader)
[Queues]
push_queue: exchange:push_queue(direct) binding:push_queue
In my other project:
(project2_env)python manage.py celery worker
...the normal Celery startup output, then:
[Configuration]
broker: amqp://guest@localhost:5672//
app: default:0x101dbf450 (djcelery.loaders.DjangoLoader)
[Queues]
job_queue: exchange:job_queue(direct) binding:job_queue
When I fire a task from project1's code, it goes to the project1 worker on push_queue just fine. The problem is that when I am working in project2, any task I fire tries to go to the project1 worker instead, even when Celery isn't running on project1.
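For context, this is roughly how a task gets fired from project2; the task name matches the error below and the queue name matches the worker output above, but the dispatch call itself is just an illustration of how I understand it:

    # Illustrative dispatch from project2 -- 'update-jobs' and 'job_queue'
    # are the real names from my output; the call site is a sketch.
    from celery import current_app

    current_app.send_task('update-jobs', queue='job_queue')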
If I fire project1_env back up and start Celery, I get:
Received unregistered task of type 'update-jobs'.
If I run list_queues in RabbitMQ, it shows all the queues:
...
push_queue 0
job_queue 0
...
My env settings CELERYD_CHDIR and CELERY_CONFIG_MODULE are both blank.
Some things I have tried:
- purging Celery
- running force_reset on RabbitMQ
- RabbitMQ virtual hosts, as outlined in this answer: Multi Celery projects with same RabbitMQ broker backend process
- moving the django-celery settings out and pointing CELERY_CONFIG_MODULE at the proper settings module
- setting CELERYD_CHDIR in both projects to the proper directory
None of these things have stopped project2 tasks from ending up in the project1 Celery worker.
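For reference, this is roughly how I believe each project's queue is meant to be declared in its settings; the queue and exchange names come from the worker output above, and the setting names are my assumption about the standard Celery 3.x / djcelery-era configuration:

    # project2 settings.py -- a sketch; 'job_queue' matches the worker
    # output above, the rest mirrors the standard Celery 3.x settings.
    from kombu import Exchange, Queue

    CELERY_DEFAULT_QUEUE = 'job_queue'
    CELERY_QUEUES = (
        Queue('job_queue', Exchange('job_queue', type='direct'),
              routing_key='job_queue'),
    )
    CELERY_DEFAULT_EXCHANGE = 'job_queue'
    CELERY_DEFAULT_ROUTING_KEY = 'job_queue'

Project1 is set up the same way with push_queue.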
I am on a Mac, if that makes a difference or helps.
UPDATE
Setting up separate virtual hosts made it all work. I just had them configured wrong the first time.
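For anyone hitting the same problem, this is the shape of the setup that worked for me; the vhost names are just what I picked, and the permission regexes grant the guest user full access:

    # Create one vhost per project and grant guest full
    # configure/write/read permissions on each.
    rabbitmqctl add_vhost project1
    rabbitmqctl set_permissions -p project1 guest ".*" ".*" ".*"
    rabbitmqctl add_vhost project2
    rabbitmqctl set_permissions -p project2 guest ".*" ".*" ".*"

Then each project points at its own vhost in its settings:

    # project1 settings.py
    BROKER_URL = 'amqp://guest:guest@localhost:5672/project1'

    # project2 settings.py
    BROKER_URL = 'amqp://guest:guest@localhost:5672/project2'

With the brokers isolated this way, project1's worker can no longer see (or steal) project2's messages.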