I am using Django signals to trigger a Celery task (sending mass emails to subscribers) when a blog post is created from the Django admin. The signal is triggered, but the task function in the tasks file is never called; I know this because a print statement I put inside the task function never prints.
My signals.py file:
from apps.blogs.celery_files.tasks import send_mails
from apps.blogs.models import BlogPost, Subscribers
from django.db.models.signals import post_save
from django.dispatch import receiver


def email_task(sender, instance, created, **kwargs):
    if created:
        print("@signals.py")
        send_mails.delay(5)


post_save.connect(email_task, sender=BlogPost, dispatch_uid="email_task")
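(For completeness: the handler above is definitely connected, since the "@signals.py" print runs. In my project signals.py gets imported from the blogs app config, roughly like the sketch below; the exact class and module names are illustrative.)

# apps/blogs/apps.py (sketch; names are illustrative)
from django.apps import AppConfig


class BlogsConfig(AppConfig):
    name = 'apps.blogs'

    def ready(self):
        # import the signals module so post_save.connect() runs at startup
        from apps.blogs import signals  # noqa: F401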
My tasks.py file:
from __future__ import absolute_import, unicode_literals

from celery import shared_task
# from celery.decorators import task
from apps.blogs.models import BlogPost, Subscribers
from django.core.mail import send_mail
from travel_crm.settings import EMAIL_HOST_USER
from time import sleep


@shared_task
def send_mails(duration, *args, **kwargs):
    print("@send_mails.py")
    subscribers = Subscribers.objects.all()
    blog = BlogPost.objects.latest('date_created')
    for abc in subscribers:
        sleep(duration)
        print("i am inside loop")
        emailad = abc.email
        send_mail('New Blog Post ', f" Checkout our new blog with title {blog.title} ",
                  EMAIL_HOST_USER, [emailad],
                  fail_silently=False)
Here, the print("@send_mails.py") is not executed, but print("@signals.py") in signals.py is. So the signal is received after the BlogPost object is created, yet the send_mails function in tasks.py never runs.
I have installed both Celery and the Redis server, and both are working fine.
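For reference, the Celery app itself is configured roughly like this (a minimal sketch reconstructed from the [config] section of the worker output below; the exact settings in my project may differ):

# travel_crm/celery.py (sketch; app name and broker match the worker output)
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'travel_crm.settings')

app = Celery(
    'travel_crm',
    broker='redis://localhost:6379',
    backend='redis://localhost:6379',
)
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')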
The main thing is: if I remove .delay(5) in the signals file and instead call send_mails(5) directly inside email_task, it works perfectly and I get the emails. But as soon as I call the task with delay(), the function in the tasks file is not called. What is the issue?
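To make the difference concrete, this is the only change between the working and non-working case:

# inside email_task in signals.py
send_mails(5)        # works: runs synchronously in the Django process, emails are sent
send_mails.delay(5)  # does not work: the worker receives the task (see log below) but it never runs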
The output and traceback when I run the Celery worker:
-------------- celery@DESKTOP-AQPSFR9 v5.1.2 (sun-harmonics)
--- ***** -----
-- ******* ---- Windows-10-10.0.18362-SP0 2021-07-18 11:06:10
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app: travel_crm:0x15c262afcd0
- ** ---------- .> transport: redis://localhost:6379//
- ** ---------- .> results: redis://localhost:6379/
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. apps.blogs.celery_files.celery.debug_task
. apps.blogs.celery_files.tasks.send_mails
. travel_crm.celery.debug_task
[2021-07-18 11:06:11,465: INFO/SpawnPoolWorker-1] child process 9276 calling self.run()
[2021-07-18 11:06:11,475: INFO/SpawnPoolWorker-2] child process 8792 calling self.run()
[2021-07-18 11:06:11,496: INFO/SpawnPoolWorker-4] child process 1108 calling self.run()
[2021-07-18 11:06:11,506: INFO/SpawnPoolWorker-3] child process 7804 calling self.run()
[2021-07-18 11:06:13,145: INFO/MainProcess] Connected to redis://localhost:6379//
[2021-07-18 11:06:17,206: INFO/MainProcess] mingle: searching for neighbors
[2021-07-18 11:06:24,287: INFO/MainProcess] mingle: all alone
[2021-07-18 11:06:32,396: WARNING/MainProcess] c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\celery\fixups\django.py:203: UserWarning: Using settings.DEBUG leads to a memory
leak, never use this setting in production environments!
warnings.warn('''Using settings.DEBUG leads to a memory
[2021-07-18 11:06:32,396: INFO/MainProcess] celery@DESKTOP-AQPSFR9 ready.
[2021-07-18 11:06:32,596: INFO/MainProcess] Task apps.blogs.celery_files.tasks.send_mails[6bbac0ae-8146-4fb0-b64b-a07755123e1d] received
[2021-07-18 11:06:32,612: INFO/MainProcess] Task apps.blogs.celery_files.tasks.send_mails[25d3b32a-f223-4ae4-812b-fa1cfaedaddd] received
[2021-07-18 11:06:34,633: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
Traceback (most recent call last):
File "c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\billiard\pool.py", line 362, in workloop
result = (True, prepare_result(fun(*args, **kwargs)))
File "c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\celery\app\trace.py", line 635, in fast_trace_task
tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
[2021-07-18 11:06:34,633: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
Traceback (most recent call last):
File "c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\billiard\pool.py", line 362, in workloop
result = (True, prepare_result(fun(*args, **kwargs)))
File "c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\celery\app\trace.py", line 635, in fast_trace_task
tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
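For reference, the worker above is started with a command along these lines (the -A target matches the app name shown in the [config] section; the pool is the default prefork with concurrency 4):

celery -A travel_crm worker -l info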