
I have a Django 1.6.2 application running on Debian 7.8, with Nginx 1.2.1 as my proxy server and Gunicorn 19.1.1 as my application server. I've installed Celery 3.1.7 and RabbitMQ 2.8.4 to handle asynchronous tasks. I can start a Celery worker as a daemon, but whenever I try to run the test "add" task as shown in the Celery docs, I get the following error:

Received unregistered task of type u'apps.photos.tasks.add'.
The message has been ignored and discarded.

Traceback (most recent call last):
  File "/home/swing/venv/swing/local/lib/python2.7/site-packages/celery/worker/consumer.py", line 455, in on_task_received
    strategies[name](message, body,
KeyError: u'apps.photos.tasks.add'

All of my configuration files are kept in a "conf" directory that sits just below my "myproj" project directory. The "add" task is in apps/photos/tasks.py.

myproj
├── apps
│   └── photos
│       ├── __init__.py
│       └── tasks.py
└── conf
    ├── celeryconfig.py
    ├── celeryconfig.pyc
    ├── celery.py
    ├── __init__.py
    ├── middleware.py
    ├── settings
    │   ├── base.py
    │   ├── dev.py
    │   ├── __init__.py
    │   └── prod.py
    ├── urls.py
    └── wsgi.py

Here is the tasks file:

# apps/photos/tasks.py
from __future__ import absolute_import
from conf.celery import app

@app.task
def add(x, y):
    return x + y

Here are my Celery application and configuration files:

# conf/celery.py
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
from conf import celeryconfig

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'conf.settings')
app = Celery('conf')
app.config_from_object(celeryconfig)
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

# conf/celeryconfig.py
BROKER_URL = 'amqp://guest@localhost:5672//'
CELERY_RESULT_BACKEND = 'amqp'
CELERY_ACCEPT_CONTENT = ['json', ]
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
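
If autodiscovery isn't picking the task up (for example because the app isn't listed in `INSTALLED_APPS` under the same dotted path the client imports), one workaround is to register the task module explicitly in the config. A minimal sketch, assuming the module really is importable as `apps.photos.tasks` from the worker's working directory:

```python
# conf/celeryconfig.py (addition) -- hedged sketch: force-import the task
# module so the worker registers it even if autodiscovery misses it.
CELERY_IMPORTS = ('apps.photos.tasks',)
```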

This is my Celery daemon config file. I commented out CELERY_APP because I've found that the Celery daemon won't even start if I uncomment it. I also found that I need to add the "--config" argument to CELERYD_OPTS in order for the daemon to start. I created a non-privileged "celery" user who can write to the log and pid files.

# /etc/default/celeryd
CELERYD_NODES="worker1"
CELERYD_LOG_LEVEL="DEBUG"
CELERY_BIN="/home/myproj/venv/myproj/bin/celery"
#CELERY_APP="conf"
CELERYD_CHDIR="/www/myproj/"
CELERYD_OPTS="--time-limit=300 --concurrency=8 --config=celeryconfig"
CELERYD_LOG_FILE="/var/log/celery/%N.log"
CELERYD_PID_FILE="/var/run/celery/%N.pid"
CELERYD_USER="celery"
CELERYD_GROUP="celery"
CELERY_CREATE_DIRS=1

I can see from the log file that when I run the command, "sudo service celeryd start", Celery starts without any errors. However, if I open the Python shell and run the following commands, I'll see the error I described at the beginning.

$ python manage.py shell
In [1]: from apps.photos.tasks import add
In [2]: result = add.delay(2, 2)

What's interesting is that if I examine Celery's registered tasks object, the task is listed:

In [1]: import celery
In [2]: celery.registry.tasks
Out[2]: {'celery.chain': ..., 'apps.photos.tasks.add': <@task: apps.photos.tasks.add of conf:0x16454d0> ...}

Other similar questions here have discussed having a PYTHONPATH environment variable and I don't have such a variable. I've never understood how to set PYTHONPATH and this project has been running just fine for over a year without it.
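
Whether `apps.photos.tasks` is importable at all (with or without a PYTHONPATH variable) comes down to `sys.path`: the name only resolves if the directory containing `apps` is on that list for the process doing the import. A quick check you can run in the same environment the worker uses:

```python
# Print every directory Python searches for imports; for the task name
# 'apps.photos.tasks.add' to resolve, the parent directory of 'apps'
# must appear somewhere in this list.
import sys

for path in sys.path:
    print(path)
```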

I should also add that my production settings file is conf/settings/prod.py. It imports all of my base (tier-independent) settings from base.py and adds some extra production-dependent settings.

Can anyone tell me what I'm doing wrong? I've been struggling with this problem for three days now.

Thanks!


2 Answers

Looks like this is happening because of a relative import error.

>>> from project.myapp.tasks import mytask
>>> mytask.name
'project.myapp.tasks.mytask'

>>> from myapp.tasks import mytask
>>> mytask.name
'myapp.tasks.mytask'

If you’re using relative imports you should set the name explicitly.

@task(name='proj.tasks.add')
def add(x, y):
    return x + y

Check out: http://celery.readthedocs.org/en/latest/userguide/tasks.html#automatic-naming-and-relative-imports
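
The failure mode can be reproduced without Celery at all: the worker registers tasks under the dotted path *it* imported, while the client looks up the dotted path *it* imported. A toy sketch of that mismatch (plain dict, no Celery required):

```python
# Toy registry illustrating the name mismatch behind the KeyError.
registry = {}

# Worker side: it imported the module as 'photos.tasks', so it registers:
registry['photos.tasks.add'] = lambda x, y: x + y

# Client side: it imported the module as 'apps.photos.tasks', so it asks for:
requested = 'apps.photos.tasks.add'

try:
    registry[requested](2, 2)
except KeyError:
    # mirrors "Received unregistered task of type ..." on the worker
    print('unregistered task: {0}'.format(requested))
```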

  • My tasks.py file is in apps.photos.tasks, so I changed the decorator on the add method to "@app.task(name='apps.photos.tasks.add')" but I still get the KeyError. – Jim Apr 10 '15 at 22:19
  • 1
    if you start your worker with the `-l info` option, is your task name shown in the startup banner? – Chillar Anand Apr 11 '15 at 05:29
  • 1
    No, I tried that too and noticed that no tasks were showing. I've started building a test Django application and am reading the Celery docs a page at a time, trying different things to understand where the problem is in my configuration. It's been five days now. This is the longest I've ever worked on a problem without finding a solution. – Jim Apr 11 '15 at 17:47
  • The question I'm trying to figure out right now is how the init.d celeryd script knows which file in your project contains the line "app = Celery('tasks')". In my test Django project, this line is in the file myproj/conf/tasks.py (where myproj is the root directory for the project, and my other settings files like settings.py, urls.py, and wsgi.py are in the "conf" subdirectory). Celery starts fine if I don't have CELERY_APP in my celeryd file. How does celeryd know where the Celery app is defined? – Jim Apr 11 '15 at 18:31
  • I *finally* figured it out. – Jim Apr 12 '15 at 05:11
  • 1
    @Robert How did you "figure" it out? What was the answer? I know this is old, but help would be appreciated. – Adam Hopkins Sep 12 '16 at 10:22
  • @TheBrewmaster to be honest, it's been so long ago that I don't remember. In fact, I switched from Celery to RQ because Celery is so difficult to work with and so bad at telling you why it won't start. I just got fed up with Celery and switched to something else. Sorry I can't be of more help. – Jim Sep 12 '16 at 22:43
  • @Robert I understand. It was a long shot that you would remember. I am going to stick with Celery for now, and maybe use supervisor instead to run it. – Adam Hopkins Sep 13 '16 at 05:44
  • @Ray next time, please take the time to leave the answer that worked for you. You had already written that you'd figured it out, so it would have been only a little more text... – Sebastialonso Apr 08 '20 at 17:47

I'm using Celery 4.0.2 with Django, and I created a celery user and group for use with celeryd, and had this same problem. The command-line version worked fine, but celeryd was not registering the tasks. It was NOT a relative naming problem.

The solution was to add the celery user to the group that can access the Django project. In my case, that group is www-data, with read and execute but no write permissions.
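
A hedged sketch of that fix, assuming the project lives at /www/myproj and the web group is www-data (the group name comes from this answer and the path from the question's CELERYD_CHDIR; adjust both to your setup):

```shell
# Add the celery user to the group that owns the Django project files,
# then make sure the group can read and traverse (but not write) them.
sudo usermod -a -G www-data celery
sudo chmod -R g+rX /www/myproj

# Verify that the celery user can actually read the task module:
sudo -u celery test -r /www/myproj/apps/photos/tasks.py && echo OK
```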

vt100rlz