
I am using Django signals to trigger a task (sending mass emails to subscribers via Celery) when a blog post is created from the Django admin. The signal is triggered, but the task function in the tasks file is never called: a print statement I put inside the task function never prints.

My signals.py file:

from apps.blogs.celery_files.tasks import send_mails
from apps.blogs.models import BlogPost, Subscribers
from django.db.models.signals import post_save
from django.dispatch import receiver


def email_task(sender, instance, created, **kwargs):
    if created:
        print("@signals.py")
        send_mails.delay(5)


post_save.connect(email_task, sender=BlogPost, dispatch_uid="email_task")
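
For completeness, the signal handlers are registered when the blogs app loads. My apps.py looks roughly like this (a simplified sketch; the config class name here is a placeholder):

from django.apps import AppConfig


class BlogsConfig(AppConfig):
    name = 'apps.blogs'

    def ready(self):
        # Importing the module runs post_save.connect(...) at startup
        from apps.blogs import signals  # noqa: F401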

My tasks.py file:

from __future__ import absolute_import, unicode_literals
from celery import shared_task
# from celery.decorators import task
from apps.blogs.models import BlogPost, Subscribers
from django.core.mail import send_mail
from travel_crm.settings import EMAIL_HOST_USER
from time import sleep


@shared_task
def send_mails(duration, *args, **kwargs):
    print("@send_mails.py")
    subscribers = Subscribers.objects.all()
    blog = BlogPost.objects.latest('date_created')
    for subscriber in subscribers:
        sleep(duration)
        print("i am inside loop")
        send_mail('New Blog Post', f"Checkout our new blog with title {blog.title}",
                  EMAIL_HOST_USER, [subscriber.email],
                  fail_silently=False)
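
For context, the Celery app itself follows the standard Django setup. This is a simplified sketch of my travel_crm/celery.py (the module names match the worker output below; the exact contents may differ slightly):

import os

from celery import Celery

# Point Celery at the Django settings and let it autodiscover the
# tasks.py modules of the installed apps.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'travel_crm.settings')

app = Celery('travel_crm')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()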

Here, print("@send_mails.py") is never executed, but print("@signals.py") in signals.py is. So the signal is received after the BlogPost object is created, but the send_mails function inside tasks.py is not executed.

I have installed both Celery and the Redis server, and both are working fine.

The main thing is: if I remove .delay(5) in the signal file and instead just call send_mails() inside email_task, it works perfectly and I get the emails. But as soon as I add the delay() call, the function in the tasks file is not called. What is the issue?
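
To be explicit, the only line that changes between the working and failing versions is the call inside email_task:

# works for me: the task runs synchronously inside the Django process
# that handles the admin request
send_mails(5)

# fails for me: the call is serialized and pushed to Redis, and a Celery
# worker process is supposed to pick it up and execute it
send_mails.delay(5)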

The output (including the traceback) when I run the Celery worker:

-------------- celery@DESKTOP-AQPSFR9 v5.1.2 (sun-harmonics)
--- ***** -----
-- ******* ---- Windows-10-10.0.18362-SP0 2021-07-18 11:06:10
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         travel_crm:0x15c262afcd0
- ** ---------- .> transport:   redis://localhost:6379//
- ** ---------- .> results:     redis://localhost:6379/
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . apps.blogs.celery_files.celery.debug_task
  . apps.blogs.celery_files.tasks.send_mails
  . travel_crm.celery.debug_task

[2021-07-18 11:06:11,465: INFO/SpawnPoolWorker-1] child process 9276 calling self.run()
[2021-07-18 11:06:11,475: INFO/SpawnPoolWorker-2] child process 8792 calling self.run()
[2021-07-18 11:06:11,496: INFO/SpawnPoolWorker-4] child process 1108 calling self.run()
[2021-07-18 11:06:11,506: INFO/SpawnPoolWorker-3] child process 7804 calling self.run()
[2021-07-18 11:06:13,145: INFO/MainProcess] Connected to redis://localhost:6379//
[2021-07-18 11:06:17,206: INFO/MainProcess] mingle: searching for neighbors
[2021-07-18 11:06:24,287: INFO/MainProcess] mingle: all alone
[2021-07-18 11:06:32,396: WARNING/MainProcess] c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\celery\fixups\django.py:203: UserWarning: Using settings.DEBUG leads to a memory
            leak, never use this setting in production environments!
  warnings.warn('''Using settings.DEBUG leads to a memory

[2021-07-18 11:06:32,396: INFO/MainProcess] celery@DESKTOP-AQPSFR9 ready.
[2021-07-18 11:06:32,596: INFO/MainProcess] Task apps.blogs.celery_files.tasks.send_mails[6bbac0ae-8146-4fb0-b64b-a07755123e1d] received
[2021-07-18 11:06:32,612: INFO/MainProcess] Task apps.blogs.celery_files.tasks.send_mails[25d3b32a-f223-4ae4-812b-fa1cfaedaddd] received
[2021-07-18 11:06:34,633: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
Traceback (most recent call last):
  File "c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\billiard\pool.py", line 362, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\celery\app\trace.py", line 635, in fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
[2021-07-18 11:06:34,633: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
Traceback (most recent call last):
  File "c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\billiard\pool.py", line 362, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\celery\app\trace.py", line 635, in fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
Reactoo
  • Are you able to see the task in the queue? What status is it? What additional logging messages do you have? Maybe this answer will help too: https://stackoverflow.com/questions/48910214/celery-calling-delay-with-countdown – Daniel Butler Jul 18 '21 at 05:16
  • I have added the traceback. Can you please check? @DanielButler – Reactoo Jul 18 '21 at 05:25
  • When I import the send_mails function in the Python shell, the function works, but when I save a BlogPost from the Django admin it doesn't work (the issue above). @DanielButler – Reactoo Jul 18 '21 at 08:31
  • You should add the paths to those files to the description. – Andrew Holovko Aug 21 '21 at 16:41

1 Answer


The stack trace helps narrow the issue down to how the Celery task is being called:

ValueError: not enough values to unpack (expected 3, got 0)

The error shows up when the worker tries to handle this call:

send_mails.delay(5)

Try calling the function using apply_async instead.

send_mails.apply_async(args=(5, ))

If that doesn't work, remove *args and **kwargs so the signature is just def send_mails(duration):. I don't see why those parameters are necessary.
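
That is, the simplified task would look like this (a minimal sketch; the body stays the same):

from celery import shared_task


@shared_task
def send_mails(duration):
    # same body as before; only the unused *args and **kwargs are dropped
    ...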

More information can be found in this answer: https://stackoverflow.com/a/48910727/7838574

Or in the Docs here: https://docs.celeryproject.org/en/latest/userguide/calling.html#basics

Daniel Butler
  • Hello @Daniel, I tried removing *args and **kwargs and also used apply_async as you said, but there is no change in the result. It is still not working. – Reactoo Jul 18 '21 at 14:20
  • No, there isn't any error @Daniel, but the function in the tasks file is still not called; same problem. – Reactoo Jul 18 '21 at 16:47