Questions tagged [celeryd]

celeryd is the common name of a Celery worker node, running as a daemon.

Celery is a widely used asynchronous job/task queue processing engine, written in Python and available under the New BSD License. It can use a large variety of message brokers like RabbitMQ, Redis, MongoDB, CouchDB, Amazon SQS and others.

More info here: http://celery.github.com/celery/getting-started/introduction.html#overview

148 questions
12 votes · 3 answers

celerybeat automatically disables periodic task

I'd like to create a periodic task for celery using django-celery's admin interface. I have a task set up which runs great when called manually or by script. It just doesn't work through celerybeat. According to the debug logs the task is set to…

jnns • 5,148
10 votes · 1 answer

Celery and Redis keep running out of memory

I have a Django app deployed to Heroku, with a worker process running celery (+ celerycam for monitoring). I am using RedisToGo's Redis database as a broker. I noticed that Redis keeps running out of memory. This is what my procfile looks like: web:…

Bilal and Olga • 3,211
10 votes · 1 answer

Celery Exception Handling

Suppose I have this task definition: def some_other_foo(input) raise Exception('This is not handled!') return input @app.task( bind=True, max_retries=5, soft_time_limit=20) def some_foo(self, someInput={}): response="" try: …

Danijel • 817
10 votes · 3 answers

In celery how to get the task status for all the tasks for specific task name?

In celery I want to get the task status for all the tasks for a specific task name. For that I tried the code below. import celery.events.state # Celery status instance. stat = celery.events.state.State() # task_by_type will return list of tasks. query =…

Hitul Mistry • 2,105
8 votes · 1 answer

Share memory areas between celery workers on one machine

I want to share small pieces of information between my worker nodes (for example cached authorization tokens, statistics, ...) in celery. If I create a global inside my tasks-file it's unique per worker (My workers are processes and have a…

Gregor • 4,306
8 votes · 2 answers

Celery: how to separate different environments with different workers?

I need to route all tasks of a certain django site instance to a certain queue. My setup is as follows: several webservers running a Django project (1.7), one server running celery workers (3.1.7), three environments: production, staging,…

Tino • 717
8 votes · 2 answers

Running multiple Django Celery websites on same server

I'm running multiple Django/apache/wsgi websites on the same server using apache2 virtual servers. And I would like to use celery, but if I start celeryd for multiple websites, all the websites will use the configuration (logs, DB, etc) of the last…

Tickon • 1,058
7 votes · 3 answers

Celery tries to connect to the wrong broker

I have in my celery configuration BROKER_URL = 'redis://127.0.0.1:6379' CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379' Yet whenever I run celeryd, I get this error: consumer: Cannot connect to amqp://guest@127.0.0.1:5672//: [Errno 111]…

BDuelz • 3,890
7 votes · 1 answer

Python Celery task to restart celery worker

In celery, is there a simple way to create a (series of) task(s) that I could use to automagically restart a worker? The goal is to have my deployment automagically restart all the child celery workers every time it gets a new source from github.…

Kevin Meyer • 2,816
7 votes · 1 answer

Celerybeat shuts down immediately after start

I have a django app that is using celeryd and celerybeat. Both are set up to run as daemons. The celerybeat tasks won't get executed because celerybeat does not start correctly. According to the logs it shuts down immediately: [2012-05-04…

sbaechler • 1,329
6 votes · 1 answer

Problems stopping celeryd

I'm running celeryd as a daemon, but I sometimes have trouble stopping it gracefully. When I send the TERM signal (in this case via service celeryd stop) and there are items in the queue, celeryd will stop taking new jobs and shut down all the worker…

Zach • 18,594
6 votes · 1 answer

Celery/Redis task expiration

I am using Celery with supervisor running the workers and Redis as the broker, and I'm having an issue with a Celery worker apparently freezing up, making it unable to process any more tasks and causing its task queue in Redis to fill up to the…

orn688 • 830
6 votes · 0 answers

Celery infinite task which listens to a queue

I've got a celery task which is supposed to run in an infinite loop, listening to a few queues (not related to Celery internals) in RabbitMQ. When a message is retrieved from a queue, this long-running task dispatches the message to be processed by some…

okrutny • 1,070
6 votes · 1 answer

Celeryd multi with supervisord

Trying to run supervisord (3.2.2) with celery multi. It seems that supervisord can't handle it. A single celery worker works fine. This is my supervisord configuration: celery multi v3.1.20 (Cipater) > Starting nodes... >…

gogasca • 9,283
6 votes · 2 answers

multiple workers and multiple queues on celery django daemon

I am using django-celery to queue tasks on my site backend. I am trying to create a setup where I have two queues named "low" and "high" and two workers W1 and W2. I want them to consume the tasks from the queue in the following way: W1 <-- low,…

rohan • 1,606