In my Django project I have the following dependencies:
- django==1.5.4
- django-celery==3.1.9
- amqp==1.4.3
- kombu==3.0.14
- librabbitmq==1.0.3 (as suggested by https://stackoverflow.com/a/17541942/1452356)
In dev_settings.py:
DEBUG = False
BROKER_URL = "django://"
import djcelery
djcelery.setup_loader()
CELERYBEAT_SCHEDULER = "djcelery.schedulers.DatabaseScheduler"
CELERYD_CONCURRENCY = 2
# CELERYD_TASK_TIME_LIMIT = 10
CELERYD_TASK_TIME_LIMIT is commented out as suggested in https://stackoverflow.com/a/17561747/1452356, along with debug_toolbar as suggested in https://stackoverflow.com/a/19931261/1452356.
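For reference, one setting I have not touched is CELERYD_MAX_TASKS_PER_CHILD; as far as I understand it, it only masks a leak by recycling each worker child process after a fixed number of tasks (the value below is arbitrary):
# not in my dev_settings.py, shown for reference only:
# recycle each worker child after it has executed 10 tasks
CELERYD_MAX_TASKS_PER_CHILD = 10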
I start my worker in a shell with:
./manage.py celeryd --settings=dev_settings
Then I send a task:
from celery import Task

class ExempleTask(Task):
    def run(self, piProjectId):
        # build a large list to make the worker's memory usage visible
        table = []
        for i in range(50000000):
            table.append(1)
        return None
Using a Django command:
from django.core.management.base import BaseCommand
# ExempleTask is imported from the module where it is defined (path omitted)

class Command(BaseCommand):
    def handle(self, *plArgs, **pdKwargs):
        # send the task to the worker and block until it completes
        loResult = ExempleTask.delay(1)
        loResult.get()
        return None
With:
./manage.py purge_and_delete_test --settings=dev_settings
I monitor the memory usage with:
watch -n 1 'ps ax -o rss,user,command | sort -nr | grep celery |head -n 5'
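Equivalently, a small polling sketch in Python (assuming a recent psutil is installed; the file name is arbitrary):
# watch_rss.py -- minimal monitoring sketch, requires psutil
import time
import psutil

while True:
    for proc in psutil.process_iter():
        try:
            if 'celery' in ' '.join(proc.cmdline()):
                # memory_info().rss is in bytes; display it in MiB
                print("%6d %8.1f MiB" % (proc.pid, proc.memory_info().rss / 1048576.0))
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass
    time.sleep(1)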
Every time I call the task, the memory consumption of the celeryd worker process increases, proportionally to the amount of data allocated in the task...
It seems to be a common issue (cf. the other Stack Overflow links above); however, I couldn't fix it, even with the latest dependencies.
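To check whether the growth is Celery-specific at all, the same allocation pattern can be run standalone; here is a minimal sketch (Linux-only, it reads /proc for the current RSS). On Python 2, the RSS reported after the del stays high, because range() materialises 50 million int objects and CPython keeps the freed ints on internal free lists:
# repro.py -- standalone sketch, no Celery involved (Linux-only RSS readout)
def rss_kb():
    # current resident set size of this process, in kB
    with open('/proc/self/status') as f:
        for line in f:
            if line.startswith('VmRSS:'):
                return int(line.split()[1])

print("before: %s kB" % rss_kb())
table = []
for i in range(50000000):  # on Python 2 this alone creates 50M int objects
    table.append(1)
del table
print("after : %s kB" % rss_kb())  # stays high: the freed ints remain on CPython's free lists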
Thanks.