Questions tagged [celery-task]

celery-task is a building block of the Celery distributed task queue system. It is essentially a Python class, but it can be created from any callable with the @celery.task decorator.

727 questions
19
votes
3 answers

How to limit the maximum number of running Celery tasks by name

How do you limit the number of instances of a specific Celery task that can be run simultaneously? I have a task that processes large files. I'm running into a problem where a user may launch several tasks, causing the server to run out of CPU and…
Cerin
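
A common workaround, sketched here rather than taken from the answers: route the heavy task to its own queue and start a dedicated worker for that queue with a capped --concurrency, so at most N instances of that task run at once. The names proj, process_file and the files queue are illustrative.

```python
from celery import Celery

app = Celery("proj", broker="redis://localhost:6379/0")

# Send only this task to its own queue; other tasks keep the default queue.
app.conf.task_routes = {"proj.process_file": {"queue": "files"}}

@app.task(name="proj.process_file")
def process_file(path):
    # heavy file processing would go here
    return path

# Start a dedicated worker that consumes only that queue, capped at 2 processes:
#   celery -A proj worker -Q files --concurrency=2
```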
19
votes
2 answers

Retrying celery failed tasks that are part of a chain

I have a celery chain that runs some tasks. Each of the tasks can fail and be retried. Please see below for a quick example: from celery import task @task(ignore_result=True) def add(x, y, fail=True): try: if fail: raise…
Andrei
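
A minimal sketch of the usual retry pattern inside a chain (the retry counts and delay are illustrative): the task binds itself and calls self.retry(), so the rest of the chain only continues once this link finally succeeds or its retries are exhausted.

```python
from celery import Celery

app = Celery("proj", broker="redis://localhost:6379/0")

@app.task(bind=True, max_retries=3, default_retry_delay=10)
def add(self, x, y, fail=True):
    try:
        if fail:
            raise ValueError("simulated failure")
        return x + y
    except ValueError as exc:
        # Re-queue this task; subsequent links in the chain wait until it
        # succeeds or max_retries is exhausted.
        raise self.retry(exc=exc)
```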
18
votes
1 answer

celery tasks with long eta (8+ hours) are executed multiple times in a row when eta is reached

I'm creating tasks with an eta ranging between 3 and 20 hours, and when I look at the worker log for such a task, the worker says "Got task from broker: ..." every hour after the original task was received, until the eta is reached. I know that this has…
user1713317
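
If the broker is Redis or SQS, hourly redelivery of long-ETA messages is commonly attributed to the transport's visibility timeout (3600 seconds by default): an unacknowledged ETA message is handed out again once the timeout passes. A hedged sketch of the usual mitigation, with an illustrative value:

```python
from celery import Celery

app = Celery("proj", broker="redis://localhost:6379/0")

# Raise the visibility timeout above the longest ETA being scheduled, so the
# broker does not redeliver the unacknowledged message every hour.
app.conf.broker_transport_options = {"visibility_timeout": 24 * 3600}  # seconds
```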
17
votes
1 answer

In celery, what would be the purpose of having multiple workers process the same queue?

In the documentation for celeryd-multi, we find this example: # Advanced example starting 10 workers in the background: # * Three of the workers processes the images and video queue # * Two of the workers processes the data queue with loglevel…
chaimp
17
votes
1 answer

Where do you set the task_id of a celery task?

I am having trouble finding any example of setting a task_id to my own value, something along these lines... def testview1(request): for i in xrange(0,1000): result = add.delay(i, 4, task_id=i) print result.info …
michael
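
A short sketch of the usual answer shape: delay() forwards all of its arguments to the task itself, so it has no task_id parameter, but apply_async() separates task arguments from execution options and accepts one. Task IDs must be unique strings; the add task and the "add-<i>" naming below are illustrative.

```python
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def add(x, y):
    return x + y

# apply_async() takes the task's arguments via args/kwargs and the task_id
# as an execution option, unlike delay().
for i in range(1000):
    result = add.apply_async(args=(i, 4), task_id=f"add-{i}")
    print(result.id)  # "add-0", "add-1", ...
```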
16
votes
9 answers

Cannot start Celery Worker (Kombu.asynchronous.timer)

I followed the First Steps with Celery (Django) guide and am trying to run a heavy process in the background. I have a RabbitMQ server installed. However, when I run celery -A my_app worker -l info it throws the following error: File "
CodeSsscala
16
votes
2 answers

How does Celery work?

I have recently started working on distributed computing to increase computation speed. I opted for Celery. However, I am not very familiar with some of the terms, so I have several related questions. From the Celery docs: What's a Task…
jeldikk
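
For orientation only, a minimal sketch of the moving parts the docs describe: a task is a unit of work registered with the app, the broker carries the task message to a worker, and the result backend stores the return value. The broker and backend URLs here are illustrative.

```python
from celery import Celery

# The broker transports task messages; the backend stores results.
app = Celery("demo",
             broker="redis://localhost:6379/0",
             backend="redis://localhost:6379/1")

@app.task
def square(n):
    return n * n

# A client sends a message and immediately gets an AsyncResult handle;
# a worker process (celery -A demo worker) picks it up and runs it.
async_result = square.delay(4)
print(async_result.get(timeout=10))  # 16, once a worker has processed it
```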
16
votes
5 answers

how to remove task from celery with redis broker?

I have added some wrong tasks to a Celery instance with a Redis broker, and now I want to remove the incorrect tasks, but I can't find any way to do this. Is there a command or an API for this?
timger
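
A sketch of the two control commands usually pointed to for this (the task id is illustrative): revoke a specific task by id, or purge everything still waiting in the queue.

```python
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")

# Revoke one task by id; terminate=True also stops it if a worker already started it.
app.control.revoke("d9078da5-9915-40a0-bfa1-392c7bde42ed", terminate=True)

# Or discard every task message still sitting in the queue (cannot be undone):
app.control.purge()
```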
16
votes
2 answers

In celery, how to ensure tasks are retried when worker crashes

First of all, please don't consider this question a duplicate of this question. I have set up an environment which uses Celery with Redis as the broker and result_backend. My question is: how can I make sure that when the Celery workers crash, all the…
aqs
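
A hedged configuration sketch of the settings usually involved: with late acknowledgement the message is only acked after the task finishes, so a crashed worker leaves it on the broker to be redelivered (with Redis, redelivery also depends on the visibility timeout).

```python
from celery import Celery

app = Celery("proj",
             broker="redis://localhost:6379/0",
             backend="redis://localhost:6379/1")

# Acknowledge messages only after the task has run, not when it is received.
app.conf.task_acks_late = True
# Also re-queue the message if the worker process is killed mid-task.
app.conf.task_reject_on_worker_lost = True
```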
15
votes
4 answers

Why does celery return a KeyError when executing my task?

I keep getting this KeyError. I'm sending strings and ids (integers) to the task function, so I don't think it is a serialization issue. It also says the KeyError is on the path to the function itself, not the contents. Please help. Tasks.py: from…
Nathaniel Tucker
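
Not a diagnosis of this particular traceback, but a pattern worth checking: the worker looks tasks up by name in its registry, so a KeyError on the dotted path often means the client and the worker registered the task under different module paths. Pinning the name explicitly is one way to rule that out; the name and function below are illustrative.

```python
from celery import Celery

app = Celery("myproj", broker="redis://localhost:6379/0")

# With an explicit name, the registry key no longer depends on how the
# module happened to be imported (e.g. "tasks" vs "myproj.tasks").
@app.task(name="myproj.tasks.do_work")
def do_work(item_id, label):
    return f"{item_id}:{label}"
```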
15
votes
2 answers

Retrieve result from 'task_id' in Celery from unknown task

How do I pull the result of a task if I do not know beforehand which task was performed? Here's the setup: given the following source ('tasks.py'): from celery import Celery app = Celery('tasks', backend="db+mysql://u:p@localhost/db", broker =…
Marc
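
Given an app configured with a result backend, as in the tasks.py above, a sketch of fetching a result purely from its id (the id string is illustrative):

```python
from celery.result import AsyncResult
from tasks import app  # the Celery app from the question's tasks.py

res = AsyncResult("d9078da5-9915-40a0-bfa1-392c7bde42ed", app=app)
print(res.state)       # PENDING / STARTED / SUCCESS / FAILURE ...
if res.ready():
    print(res.result)  # the task's return value (or the raised exception)
```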
15
votes
1 answer

Retrieving GroupResult from taskset_id in Celery?

I am starting a set of celery tasks by using a celery group, as described in the official documentation. I am also storing the group (taskset) id in a db, in order to poll celery for the taskset state. job = group([ single_test.s(1, 1), …
Andrea
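
A sketch of the save/restore pair usually used for this, assuming a result backend is configured; single_test mirrors the question, everything else is illustrative.

```python
from celery import Celery, group

app = Celery("proj",
             broker="redis://localhost:6379/0",
             backend="redis://localhost:6379/1")

@app.task
def single_test(x, y):
    return x + y

job = group(single_test.s(1, 1), single_test.s(2, 2))
group_result = job.apply_async()
group_result.save()           # persist the group in the result backend
saved_id = group_result.id    # this is what goes into the database

# ... later, possibly from another process ...
restored = app.GroupResult.restore(saved_id)
print(restored.completed_count(), "of", len(restored.results), "tasks done")
print(restored.ready())       # True once every task in the group has finished
```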
14
votes
4 answers

Celery [CRITICAL/MainProcess] Unrecoverable error: AttributeError("'float' object has no attribute 'items'",)

I've been running a Flask application with a Celery worker and Redis in three separate Docker containers without any issue. This is how I start it: celery worker -A app.controller.engine.celery -l info --concurrency=2 --pool eventlet Celery starts…
magnoz
14
votes
1 answer

Flask SERVER_NAME setting best practices

Since my app has background tasks, I use the Flask context. For the context to work, the Flask setting SERVER_NAME should be set. When SERVER_NAME is set, incoming requests are checked to match this value, otherwise the route isn't found. When…
14
votes
1 answer

Send email task with correct context

This is my celery worker script: from app import celery, create_app app = create_app('default') app.app_context().push() When I try to run the worker I get this error: File "/home/vagrant/myproject/venv/app/mymail.py", line 29, in…
anvd
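
One hedged alternative to pushing the context at import time, as in the script above: enter the application context inside the task itself, so every worker process has an app context when the mail is built. send_async_email and its arguments are illustrative; the imports mirror the question's layout.

```python
from app import celery, create_app

flask_app = create_app("default")

@celery.task
def send_async_email(subject, recipients, body):
    # url_for(), current_app, Flask-Mail, etc. all need an app context,
    # so push one for the duration of the task.
    with flask_app.app_context():
        ...  # build and send the message here
```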