
I have a kombu/celery setup that I've recently deployed. The tasks execute fine, but they appear to run almost continuously rather than respecting the run_every attribute. Perhaps more oddly, this behavior only appears in production; everything works fine locally.

My tasks.py looks like this:

from celery.task import PeriodicTask
from datetime import timedelta

class FirstTask(PeriodicTask):
    run_every = timedelta(seconds=30)

    def run(self, **kwargs):
        # Do Stuff
        pass

My settings.py includes

BROKER_URL = "django://"

import djcelery  
djcelery.setup_loader() 

plus 'djcelery' and 'kombu.transport.django' in INSTALLED_APPS.
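For reference, the relevant INSTALLED_APPS entries look like this (a minimal sketch; the rest of the tuple is whatever the project already contains):

INSTALLED_APPS = (
    # ... other apps ...
    'djcelery',                # django-celery integration
    'kombu.transport.django',  # lets kombu use the Django database as the broker
)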

And in production I run `python manage.py celeryd -v 2 -B -s celery -E -l INFO` to start the worker and scheduler. I had been following Chase Seibert's tutorial, if that clears anything else up.
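For what it's worth, here is how I read those flags on the celery 2.x / djcelery command line (the breakdown is my own understanding, not from the tutorial):

# -v 2       Django management command verbosity
# -B         also run the celerybeat scheduler embedded in the worker
# -s celery  filename beat uses to store its schedule database
# -E         send task events for monitoring
# -l INFO    logging level
python manage.py celeryd -v 2 -B -s celery -E -l INFO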

Chris

2 Answers


How long does the task take to finish? And have you tried clearing your queue before running it again? (Celery might not respect the run_every setting if you already have tasks sitting in your queue...)
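If you want to try that, djcelery ships a purge command (a sketch assuming a celery 2.x-era setup like the one in the question; the exact command name may differ in other versions):

python manage.py celeryctl purge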

Bernhard Vallant
  • Ha ha: `Purged 5625 messages from 1 known task queue.` That probably has something to do with it. How do they pile up like that? – Chris Aug 01 '12 at 08:46
  • Did you try it with different intervals or different code before? Or did the task take too long to execute? Also, if you have a lot of other celery tasks, they could delay the execution of your periodic task (don't take the `run_every` interval for granted!) – Bernhard Vallant Aug 01 '12 at 08:54

Why do you include 'kombu.transport.django' in your INSTALLED_APPS? I thought celery and kombu provide the same functionality.

Jeff Sheffield