
I have been trying to set up django + celery + redis + celery-beat, but it is giving me trouble. The documentation is quite straightforward, but when I run the django server, redis, celery and celery beat, nothing gets printed or logged (all my test task does is log something).

This is my folder structure:

- aenima 
 - aenima
   - __init__.py
   - celery.py

 - criptoball
   - tasks.py

celery.py looks like this:

from __future__ import absolute_import, unicode_literals
import os
from django.conf import settings
from celery import Celery


# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'aenima.settings')

app = Celery("criptoball")
app.conf.broker_url = 'redis://localhost:6379/0'

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
app.conf.timezone = 'UTC'

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

app.conf.beat_schedule = {
    'test-every-30-seconds': {
        'task': 'tasks.test_celery',
        'schedule': 30.0,
        'args': (16, 16)
    }, }

and tasks.py looks like this:

from __future__ import absolute_import, unicode_literals
from datetime import datetime, timedelta
from celery import shared_task
import logging

from django_celery_beat.models import PeriodicTask, IntervalSchedule

cada_10_seg = IntervalSchedule.objects.create(every=10, period=IntervalSchedule.SECONDS)

test_celery_periodic = PeriodicTask.objects.create(
    interval=cada_10_seg,
    name='test_celery',
    task='criptoball.tasks.test_celery',
    expires=datetime.utcnow() + timedelta(seconds=30),
)

@shared_task
def test_celery(x, y):
    logger = logging.getLogger("AENIMA")
    print("EUREKA")
    logger.debug("EUREKA")

This is the django_celery_beat docs

Not sure what I'm missing. When I run

celery -A aenima beat -l debug --scheduler django_celery_beat.schedulers:DatabaseScheduler

celery -A aenima worker -l debug

redis-cli ping   # replies PONG

along with django runserver and the redis server, I get nothing printed.

settings.py

CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = TIME_ZONE
CELERY_IMPORTS = ('criptoball.tasks',)
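
These `CELERY_`-prefixed keys are picked up by the `config_from_object('django.conf:settings', namespace='CELERY')` call in celery.py. A simplified sketch (an assumption for illustration, not Celery's actual code) of how that namespace mapping works:

```python
# Simplified sketch (NOT Celery's actual implementation) of how
# namespace='CELERY' maps Django settings onto Celery config keys:
# the prefix is stripped and the remainder is lowercased.
django_settings = {
    "CELERY_BROKER_URL": "redis://localhost:6379",
    "CELERY_TASK_SERIALIZER": "json",
    "DEBUG": True,  # keys without the prefix are ignored
}

celery_config = {
    key[len("CELERY_"):].lower(): value
    for key, value in django_settings.items()
    if key.startswith("CELERY_")
}

print(celery_config)
```

This is why `CELERY_BROKER_URL` in settings.py ends up as `broker_url` on the app's configuration.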

Haven't found any authoritative answer to this topic on SO so far.

I would like to solve it all, this error may be just one of many. Thanks a lot for your help!

Edit:

Added settings for redis, declared the task differently and increased the debug level. Now the error is:

Received unregistered task of type u'tasks.test_celery'. The message has been ignored and discarded.

Did you remember to import the module containing this task? Or maybe you're using relative imports? KeyError: u'aenima.criptoball.tasks.test_celery'

I believe Celery's documentation is poor.

EDIT 2: After trying everything, it worked when I put all the tasks inside the same celery.py file. The @shared_task decorator doesn't work; I had to use @app.task.

Alejandro Veintimilla
  • Do you have a celery worker running? Eg. from the command line, `celery worker -A `. Stuff should print in the terminal where you started that. – Chris Feb 03 '18 at 22:10
  • @Chris updated the answer, added a bounty. – Alejandro Veintimilla Mar 24 '18 at 11:55
  • Looks like `celery -A aenima beat celery -A aenima worker -l info` may be a transcription/formatting error. Did you mean to put these on two separate lines? – sytech Mar 28 '18 at 13:18
  • @sytech yes. Thanks for the comment, corrected it. – Alejandro Veintimilla Mar 28 '18 at 13:41
  • Consider raising the log level to debug for both beat and the workers and see if that produces any relevant output. Otherwise I believe the standard behavior is to suppress stdout, so your `print` and `.debug` may not show up even if the task is in fact running. In my experience, I've also had to specify the scheduler as `DatabaseScheduler` to resolve a similar issue with celery_results, but not sure if that's applicable here. – sytech Mar 28 '18 at 13:48
  • @sytech just did that. Got a new error message. Any ideas? – Alejandro Veintimilla Mar 28 '18 at 14:27
  • When starting the celery worker in debug mode, it should tell you what tasks are registered. If a task is not registered with the worker, the worker will not pick it up from the queue. If your tasks are not being registered, try changing `Celery('criptoball')` to `Celery('aenima')`, as the docs say to use the *project* name. My hunch is this throws off how `autodiscover_tasks` finds your tasks. Also I'm not sure why you have that `lambda` in autodiscover tasks, could be suspect. – sytech Mar 28 '18 at 14:38
  • If you *do* see tasks registered with the worker... Then the problem is the name you are using when setting up the schedule. Try using the django admin interface instead or look to make sure the names you're using match the names the worker shows. See [automatic naming](http://docs.celeryproject.org/en/latest/userguide/tasks.html#automatic-naming-and-relative-imports) for more details. – sytech Mar 28 '18 at 15:05
  • Is `criptoball` in your list of installed apps in django? – 2ps Mar 28 '18 at 21:54
  • I suggest you use django-q. It's not an answer to your problem, but I can feel your pain. I used celery for a while, then I gave up. Too many problems; when you solve one, you have another one to debug. With django-q it's much, much easier. Good luck – Karim N Gorjux Mar 29 '18 at 06:29

4 Answers


I had those issues before. It's not your code. It's usually a problem with the environment. You should run everything under virtualenv, adding a requirements.txt file with the specific package versions.

There is a known issue regarding celery 4.x and django 1.x, so you should consider the packages you are using.

This tutorial will explain in detail how to build virtualenv with celery.

If you can tell me your packages versions I might try and help in a different way.

Edit:

I think it's something to do with the way you run celery. If we've fixed the first problem, try playing with this:

celery -A aenima.celery:app beat -l debug --scheduler django_celery_beat.schedulers:DatabaseScheduler

or

celery -A aenima.aenima.celery:app beat -l debug --scheduler django_celery_beat.schedulers:DatabaseScheduler

The latest error you are getting is something to do with your module discovery. Try it first.
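
To see why those two command variants can point at the same app, here is a rough sketch of how the `-A` argument resolves (hypothetical and simplified; the real resolver in Celery does an attribute search and more):

```python
# Hypothetical, simplified sketch of how `celery -A <target>` locates an app.
# Real Celery does more work (searches module attributes, handles errors);
# this only illustrates the "module:attribute" convention.
def resolve_app_target(target):
    if ":" in target:
        # Explicit "module:attribute" form, e.g. aenima.celery:app
        module, attr = target.split(":", 1)
        return module, attr
    # Bare package name: the app is conventionally exposed
    # by <package>/celery.py as a module-level `app` object
    return target + ".celery", "app"

print(resolve_app_target("aenima"))
print(resolve_app_target("aenima.celery:app"))
```

If the bare `-A aenima` form fails, the explicit `aenima.celery:app` form removes any ambiguity about which module and attribute are being loaded.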

Gal Silberman
  • I am using a virtualenv. I have django 1.10 and celery 4.1 . Should I upgrade django? – Alejandro Veintimilla Mar 26 '18 at 09:39
  • You should downgrade celery to 3.x. I did that in my installation and it worked perfectly. – Gal Silberman Mar 26 '18 at 09:46
  • Can I add a note that in the docs it's written as `Celery 4.0 supports Django 1.8 and newer versions. Please use Celery 3.1 for versions older than Django 1.8.`? So 4.x should be able to work with django 1.10, if anything I would suggest upgrading django to a higher version :) – King Reload Mar 26 '18 at 09:51
  • You're right, but the fact is that it's still problematic. I think we both agree that the problem is with the versions. If it were me, I would rather downgrade celery than upgrade django. If your project is big, upgrading django could be a headache... – Gal Silberman Mar 26 '18 at 09:54
  • Yes, Django will give a lot of trouble when you try to upgrade it, but Django's versions have been growing rather quickly lately, so eventually an upgrade will be necessary either way, but yes. I agree on that it could also be the difference between versions. :) – King Reload Mar 26 '18 at 09:58
  • Thanks for your answer, edited my code trying to keep it as simple as possible. Now I don't get "no module named celery" (maybe it was solved by changing the location of celery.py). However I still don't get anything printed. – Alejandro Veintimilla Mar 27 '18 at 06:04
  • And you can't see it in your log.txt file of the daemon? – Gal Silberman Mar 27 '18 at 13:39
  • Increased the log level and I got this a clear error message. Added it as an Edit. – Alejandro Veintimilla Mar 28 '18 at 14:26

Using virtualenv for this would be handy.

First, like @Gal said, you need to make sure you have celery 4.x.

You can install it through pip:

pip install celery

Of course you can also install the 4.x version by adding it to your requirements.txt like so:

celery==4.1.0

Or higher versions if available in the future.

Then you could reinstall all your packages using:

pip install -r requirements.txt

Which will make sure you have that certain celery package installed.

Now for the Celery part: your code might not be wrong, but I will write out the way I got my Celery app to work.

__init__.py:

from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery_conf import app as celery_app

__all__ = ['celery_app']

celery_conf.py:

from __future__ import absolute_import, unicode_literals

import os

from celery import Celery
from datetime import timedelta

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', '<PATH.TO.YOUR.SETTINGS>')

app = Celery('tasks')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

# Set a beat schedule to update every hour.
app.conf.beat_schedule = {
    'update-every-hour': {
        'task': 'tasks.update',
        'schedule': timedelta(minutes=60),
        'args': (16, 16),
    },
}

# The default task that Celery runs.
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

tasks.py:

# -*- coding: utf-8 -*-
from __future__ import unicode_literals

import requests

from django.conf import settings
from django.http import HttpResponse

from celery.task import Task
from celery.five import python_2_unicode_compatible
from celery import Celery
app = Celery()


@python_2_unicode_compatible
class Update(Task):
    name = 'tasks.update'

    def run(self, *args, **kwargs):
        # Run the task you want to do.
        pass

""" For me the regular TaskRegistry didn't work to register classes, 
so I found this handy TaskRegistry demo and made use of it to 
register tasks as classes."""
class TaskRegistry(Task):

    def NotRegistered_str(self):
        self.assertTrue(repr(TaskRegistry.NotRegistered('tasks.add')))

    def assertRegisterUnregisterCls(self, r, task):
        with self.assertRaises(r.NotRegistered):
            r.unregister(task)
        r.register(task)
        self.assertIn(task.name, r)

    def assertRegisterUnregisterFunc(self, r, task, task_name):
        with self.assertRaises(r.NotRegistered):
            r.unregister(task_name)
        r.register(task, task_name)
        self.assertIn(task_name, r)

    def task_registry(self):
        r = TaskRegistry()
        self.assertIsInstance(r, dict, 'TaskRegistry is mapping')

        self.assertRegisterUnregisterCls(r, Update)

        r.register(Update)
        r.unregister(Update.name)
        self.assertNotIn(Update, r)
        r.register(Update)

        tasks = dict(r)
        self.assertIsInstance(
            tasks.get(Update.name), Update)

        self.assertIsInstance(
            r[Update.name], Update)

        r.unregister(Update)
        self.assertNotIn(Update.name, r)

        self.assertTrue(Update().run())

    def compat(self):
        r = TaskRegistry()
        r.regular()
        r.periodic()

As I explained in the code as well, the regular TaskRegistry that's built into Celery 4.x did not work for me, so I made use of the demo TaskRegistry. You can of course also make tasks without classes, but I preferred to use a class.

settings.py:

# Broker settings for redis
CELERY_BROKER_HOST = '<YOUR_HOST>'
CELERY_BROKER_PORT = 6379
CELERY_BROKER_URL = 'redis://'
CELERY_DEFAULT_QUEUE = 'default'

# Celery routes
CELERY_IMPORTS = (
    'PATH.TO.tasks' # The path to your tasks.py
)

CELERY_DATABASE_URL = {
    'default': '<CELERY_DATABASE>', # You can also use your already being used database here
}

INSTALLED_APPS = [
    ...
    'PATH.TO.TASKS' # But exclude the tasks.py from this path
]

LOGGING = {
    ...
    'loggers': {
        'celery': {
            'level': 'DEBUG',
            'handlers': ['console'],
            'propagate': True,
        },
    }
}

I start my worker with the following commands:

redis-server --daemonize yes

celery multi start worker -A PATH.TO.TASKS -l info --beat # But exclude tasks.py from the path

I hope this information may help you or anyone out that's struggling with Celery.

EDIT:

Note that I start the worker as a daemon, so you won't actually be able to see the logs in the console. For me, they're logged to a .txt file.

Also note the paths you use: for some, you need to include the path to your app like so:

project.apps.app

And in other cases you need to include the tasks file without the .py extension as well; I wrote down when to exclude this file and when not to.

EDIT 2:

The @shared_task decorator returns a proxy that always uses the task instance in the current_app. This makes the @shared_task decorator useful for libraries and reusable apps, since they will not have access to the app of the user.

Notice that @shared_task does not have access to the app of the user, so the task you're trying to register never gets bound to your app. The method you actually want to use to register a task is:

import logging

from celery import Celery
app = Celery()

@app.task
def test_celery(x, y):
    logger = logging.getLogger("AENIMA")
    print("EUREKA")
    logger.debug("EUREKA")
King Reload
  • Thanks for your answer ... changed my code, do you think it has to print something like that? How should I know if the test is being executed? – Alejandro Veintimilla Mar 27 '18 at 06:05
  • @alejoss if you want to see something in the console, then you first need to run redis as daemon `redis-server --daemonize yes` and then the celery worker not as daemon `celery -A aenima -l info --beat`, you also have to enable the celery logger in your django settings, you'll have to look up how to set up the celery logger. – King Reload Mar 27 '18 at 07:00
  • Added an edit and I think I got closer to make it work. A new error came out. Please take a look at it if you can. – Alejandro Veintimilla Mar 28 '18 at 14:26
  • @alejoss I gave you an entire setup of how to register a task, your task is just not registered which should be easily solved googling? – King Reload Mar 28 '18 at 14:27
  • Please do not recommend Celery 4.1--it has bugs that prevent period tasks from running properly. – 2ps Mar 28 '18 at 21:56
  • @2ps the way I wrote it makes a periodic task run. – King Reload Mar 29 '18 at 00:01
  • It is a bug in celery 4.1--it won't run consistently at the right time for OP. – 2ps Mar 29 '18 at 00:44
  • I'm accepting this answer because it was the one that helped me the most. Although I'd like to divide the bounty ... SO should let us do that! – Alejandro Veintimilla Mar 30 '18 at 19:01
  • @alejoss I'm glad my answer was able to help out, but your problem hasn't been solved yet, has it? I'm interested in solving the problem you currently have with Celery if possible :) – King Reload Mar 30 '18 at 21:44
  • @KingReload , no, after a while it came back to the same error. `KeyError: u'aenima.criptoball.tasks.test_celery'` – Alejandro Veintimilla Apr 04 '18 at 10:20
  • @alejoss maybe it would be better to put the `tasks.py` in the same folder as the `celery.py`? because my celery structure is also like that. Also instead of `@shared_task` you might want to use `@app.task`. On which then you will need `from celery import Celery` & `app = Celery()` for the `app`. – King Reload Apr 04 '18 at 11:40

Received unregistered task of type u'tasks.test_celery'. The message has been ignored and discarded.

Did you remember to import the module containing this task? Or maybe you're using relative imports?

Maybe your task path is incorrect; it should be:

app.conf.beat_schedule = {
    'test-every-30-seconds': {
        'task': 'criptoball.tasks.test_celery',
        'schedule': 30.0,
        'args': (16, 16)
    }, 
}

tasks.test_celery should be the full path: criptoball.tasks.test_celery
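
This matches Celery's automatic naming: by default, a task's name is the module path plus the function name. A tiny sketch of the convention (a simplified illustration, not Celery's actual code):

```python
# Simplified sketch of Celery's default task naming convention:
# "<module the function lives in>.<function name>"
def default_task_name(module_path, func_name):
    return f"{module_path}.{func_name}"

# test_celery defined in criptoball/tasks.py gets this name:
print(default_task_name("criptoball.tasks", "test_celery"))
```

The beat schedule must use exactly this name, or the worker rejects the message as an unregistered task, which is the error in the question.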

gushitong

There is one thing you should fix: use

app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

to tell Celery which apps' tasks you want it to discover, if you're using Celery 3.x.
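
Under the hood, autodiscovery boils down to trying to import a tasks module from each installed app. A simplified sketch of that convention (an assumption for illustration, not Celery's actual code):

```python
# Simplified sketch (NOT Celery's actual code) of what autodiscover_tasks
# does with the app list: for each app, it attempts to import "<app>.tasks".
def candidate_task_modules(installed_apps, related_name="tasks"):
    return [f"{app}.{related_name}" for app in installed_apps]

print(candidate_task_modules(["criptoball", "django.contrib.admin"]))
```

So as long as criptoball is in INSTALLED_APPS and its tasks live in criptoball/tasks.py, autodiscovery should find them.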

unixia