4

My Celery tasks don't send an email to application admins each time I call logger.critical.

I'm building a Django application. The current configuration of my project allows the admins of the application to receive an email each time a logger.critical message is created. This was pretty straightforward to set up; I just followed the documentation for both projects (Celery and Django). For some reason I can't figure out, the code that runs inside a Celery task does not behave the same way: it doesn't send an email to the application admins each time a logger.critical message is created.

Does Celery actually allow this to be done? Am I missing some configuration? Has anyone had this problem and been able to solve it?
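For context, the kind of Django LOGGING configuration described above would look roughly like this. The question doesn't include the actual settings, so the handler name and the use of the root logger here are assumptions, not the asker's real code:

```python
# settings.py -- a sketch of the described setup, not the asker's actual config.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        # Django's built-in handler that emails everyone in ADMINS.
        'mail_admins': {
            'level': 'CRITICAL',
            'class': 'django.utils.log.AdminEmailHandler',
        },
    },
    'root': {
        'handlers': ['mail_admins'],
        'level': 'CRITICAL',
    },
}

# The addresses that AdminEmailHandler sends to.
ADMINS = [('Admin', 'admin@example.com')]
```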

Using:

  • Django 1.11
  • Celery 4.3

Thanks for any help.

luistm
  • I have a checklist I check every time. Does the way I import the celery app also import my whole correct Django settings? You can check your logging settings by printing them inside your tasks to check it out. Is the celery task code path in one of my installed apps? Are the `__init__.py` files correct? Am I correctly addressing the task in scheduler or have I imported the task correctly where I am using it? – Ali Asgari Oct 27 '19 at 05:52
  • What is your code? It's harder to get help without showing the code that doesn't work as expected :) At minimal, someone would try first to reproduce the issue that you have, then suggest a working solution. – babis21 Nov 13 '19 at 11:24
  • @babis21 you can reproduce by making a logger critical inside a celery task, on any project with Django logger configured to send admin emails. Thanks – luistm Nov 13 '19 at 12:08
  • For this feature it's better to use Sentry rather than implementing it yourself. Sentry is easy to use and sends very good info about errors that occur in a Django web application. – Erfan Nov 14 '19 at 06:54
  • Can we see how you are calling the task? – Yugandhar Chaudhari Nov 14 '19 at 08:22
  • @Erfan isn't sentry a paid tool? – luistm Nov 14 '19 at 10:42
  • could you share code? – Alex Zaitsev Nov 14 '19 at 15:45

2 Answers

8

As stated in the documentation, Celery overrides the current logging configuration to apply its own. The documentation also says that you can set CELERYD_HIJACK_ROOT_LOGGER to False in your Django settings to prevent this behavior; what is not well documented is that this setting does not currently work.

In my opinion you have 2 options:

1. Prevent Celery from overriding your configuration (for real) using the setup_logging signal

Open your celery.py file and add the following:

from celery.signals import setup_logging

@setup_logging.connect
def config_loggers(*args, **kwargs):
    # Connecting any handler (even an empty one) to setup_logging
    # tells Celery not to configure logging itself, so your Django
    # LOGGING settings stay in effect.
    pass

After that your file should look more or less like this:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from celery.signals import setup_logging

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@setup_logging.connect
def config_loggers(*args, **kwargs):
    pass

However, I would avoid this option unless you have a really good reason, because this way you will lose the default task logging handled by Celery, which is quite good to have.

2. Use a specific logger

You can define a custom logger in your Django LOGGING configuration and use it in your tasks, e.g.:

Django settings:

LOGGING = {
    # ... other configs ...
    'handlers': {
        'my_email_handler': {
            # ... handler configuration ...
        },
    },
    'loggers': {
        # ... other loggers ...
        'my_custom_logger': {
            'handlers': ['my_email_handler'],
            'level': 'CRITICAL',
            'propagate': True,
        },
    },
}

Tasks:

import logging

from celery import shared_task

logger = logging.getLogger('my_custom_logger')

@shared_task
def log():
    logger.critical('Something bad happened!')

I believe this is the best approach for you because, as far as I understand, you need to manually log messages, and this allows you to keep using the Celery logging system.
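The mechanics behind this option can be demonstrated with the standard library alone. The `CaptureHandler` below is a stand-in for the email handler, not part of Django or Celery: the point is that a named logger with its own handler keeps receiving CRITICAL records even when the root logger's handlers have been replaced, which is what the Celery worker does by default.

```python
import logging

captured = []

class CaptureHandler(logging.Handler):
    """Stand-in for the email handler: records every message it receives."""
    def emit(self, record):
        captured.append(record.getMessage())

# A dedicated logger with its own handler, mirroring 'my_custom_logger'
# from the LOGGING dict above.
logger = logging.getLogger('my_custom_logger')
logger.setLevel(logging.CRITICAL)
logger.addHandler(CaptureHandler())

logger.critical('Something bad happened!')  # reaches the handler
logger.error('below CRITICAL, dropped')     # filtered out by the level
```

Because the handler is attached directly to `my_custom_logger`, it does not depend on whatever the root logger's handlers happen to be at the time.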

bug
-3

From Celery docs:

By default any previously configured handlers on the root logger will be removed. If you want to customize your own logging handlers, then you can disable this behavior by setting worker_hijack_root_logger = False.

Celery installs its own logger, which you can obtain with a get_task_logger() call. I assume you wrote your own logger that implements the logic you described in the original question. Read more about Celery logging to find out how to disable this behaviour and tweak Celery to your needs.

DejanLekic
  • Hi, thanks for your answer. I'm using the default Django logger, either in the web app or in celery tasks. – luistm Nov 08 '19 at 15:42