Questions tagged [celery-task]


celery-task is a building block of the Celery distributed task queue system. Basically, it is a Python class, but it can also be created from any callable with the @celery.task decorator.

727 questions
4 votes, 1 answer

Django 1.6 + RabbitMQ 3.2.3 + Celery 3.1.9 - why does my celery worker die with: WorkerLostError: Worker exited prematurely: signal 11 (SIGSEGV)

This seems to address a very similar issue, but doesn't give me quite enough insight: https://github.com/celery/billiard/issues/101 Sounds like it might be a good idea to try a non-SQLite database... I have a straightforward celery setup with my…
tadasajon
  • 14,276
  • 29
  • 92
  • 144
4 votes, 1 answer

Have Celery broadcast return results from all workers

Is there a way to get all the results from every worker on a Celery broadcast task? I would like to monitor whether everything went OK on all the workers. A list of workers that the task was sent to would also be appreciated.
RickyA
  • 15,465
  • 5
  • 71
  • 95
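A minimal sketch for the broadcast question above, assuming Celery 4+ setting names and hypothetical queue/app names: a kombu Broadcast queue makes every worker consume a copy of the task, but all workers share a single task id, so the result backend keeps only one value and per-worker results usually have to be gathered another way (for example via app.control.inspect()).

from celery import Celery
from kombu.common import Broadcast

app = Celery('proj', broker='amqp://', backend='rpc://')
app.conf.task_queues = (Broadcast('broadcast_tasks'),)

@app.task
def report_status():
    # every worker consuming the broadcast queue runs this task,
    # but they all write their return value under the same task id
    return 'ok'

# report_status.apply_async(queue='broadcast_tasks')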
4 votes, 3 answers

Python Celery socket.error: [Errno 61] Connection refused

I am using Celery 3.0 and have the configuration file like below. celeryconfig.py BROKER_URL = 'redis://localhost:6379/0' CELERY_RESULT_BACKEND = 'redis://localhost:6379/0' CELERY_TASK_SERIALIZER = 'json' CELERY_RESULT_SERIALIZER =…
tuna
  • 6,211
  • 11
  • 43
  • 63
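The [Errno 61] Connection refused in the question above usually just means nothing is listening on localhost:6379, so redis-server has to be started before the worker. A sketch of a complete celeryconfig.py in the same Celery 3.x setting style (values assumed, adjust as needed):

# celeryconfig.py -- minimal Celery 3.x configuration for a local Redis broker
BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TIMEZONE = 'UTC'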
4 votes, 1 answer

Python Celery Task finished without backend

I'm using Celery 3.0.12. I have two queues: Q1, Q2. In general I put the main task in Q1, which then calls subtasks that go to Q2. I don't want to store any results for the subtasks. So my subtasks have the decorator …
foobar
  • 507
  • 1
  • 8
  • 20
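A sketch of the pattern described above with hypothetical task names: the subtasks sent to Q2 are declared with ignore_result=True, so nothing is ever written to the result backend for them.

from celery import Celery

app = Celery('proj', broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')

@app.task
def main_task(items):
    for item in items:
        # route the subtasks to Q2 at call time
        sub_task.apply_async(args=(item,), queue='Q2')

@app.task(ignore_result=True)
def sub_task(item):
    # the return value is discarded, no backend entry is stored
    print('processing', item)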
3 votes, 1 answer

Celery is not consuming with two tasks together

I have celery snippets as follows: first define the Task Class in celery_tasks.tasks.py import celery class LoggerDefine(celery.Task): name = 'message-logger' def run(self, payload): pass class…
Aaron
  • 61
  • 9
3 votes, 0 answers

Dynamic queues in Celery conditional routing

I have around 1500K URLs to GET every day. If the HTTP request times out, I repeat it up to two times. At the end of the procedure, I run a simple processing task on the body and then store the result in a Postgres instance. I'm new to celery. I…
giada
  • 33
  • 7
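One way the workflow above could be structured, as a sketch with hypothetical task and queue names: the fetch task retries itself up to two extra times on a timeout, then hands the response body to a separate processing task on its own queue.

import requests
from celery import Celery

app = Celery('crawler', broker='redis://localhost:6379/0')

@app.task(bind=True, max_retries=2)
def fetch(self, url):
    try:
        resp = requests.get(url, timeout=10)
    except requests.Timeout as exc:
        # repeat the GET up to two more times, 30 seconds apart
        raise self.retry(exc=exc, countdown=30)
    process_body.apply_async(args=(resp.text,), queue='process')

@app.task
def process_body(body):
    # parse the body and store the result in Postgres here
    ...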
3 votes, 3 answers

After changing the Python version from 3.6 to 3.10 I get: cannot import name 'Callable' from 'collections'

File "C:\Users\Codertjay\PycharmProjects\Teems_App_Kid\teems_app_kid\__init__.py", line 5, in from .celery import app as celery_app File "C:\Users\Codertjay\PycharmProjects\Teems_App_Kid\teems_app_kid\celery.py", line 3, in
Codertjay
  • 588
  • 8
  • 13
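The error in the question above comes from Python 3.10 removing the old ABC aliases from collections; any package still doing `from collections import Callable` (commonly an outdated Celery/kombu release) breaks, and upgrading those packages is the usual remedy. The import itself simply moved:

# works on every Python 3 version, including 3.10+
from collections.abc import Callable

# raises ImportError on Python 3.10+, because the alias was removed
# from collections import Callable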
3 votes, 0 answers

Track progress of members of a chord

Is there a way to get the task_ids of all members of a chord? My goal is to track the progress of each task in the chord. Here is my attempt so far: import random import time from typing import List import celery from celery import Celery,…
nicoco
  • 1,421
  • 9
  • 30
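A sketch for the chord question above, assuming a result backend is configured: in recent Celery versions the callback's AsyncResult keeps the header group as its .parent, which exposes one AsyncResult (task id and state) per chord member.

from celery import Celery, chord

app = Celery('proj', broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')

@app.task
def add(x, y):
    return x + y

@app.task
def collect(values):
    return sum(values)

res = chord(add.s(i, i) for i in range(10))(collect.s())
group_result = res.parent            # GroupResult for the chord header
for child in group_result.results:   # one AsyncResult per chord member
    print(child.id, child.state)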
3 votes, 3 answers

AttributeError: 'PathDistribution' object has no attribute 'name'

I am trying to run a simple workflow using Celery, following this documentation. I am using chain to run the tasks sequentially, with the following workflow: extract a file, tokenize it, and load JSON dumps of sentence tokens of a doc to another…
Vineet
  • 723
  • 4
  • 12
  • 31
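The AttributeError itself is usually a dependency/packaging problem rather than anything in the chain code, but the chained extract → tokenize → load workflow described above looks roughly like this sketch with hypothetical task names:

import json
from celery import Celery, chain

app = Celery('proj', broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')

@app.task
def extract(path):
    with open(path, encoding='utf-8') as f:
        return f.read()

@app.task
def tokenize(text):
    return text.split('.')        # naive sentence tokenizer

@app.task
def load(sentences):
    return json.dumps(sentences)  # JSON dump of the sentence tokens

# each task's return value becomes the next task's first argument
result = chain(extract.s('doc.txt'), tokenize.s(), load.s()).apply_async()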
3 votes, 3 answers

python celery monitoring events not being triggered

I have the following project directory: azima: __init.py main.py tasks.py monitor.py tasks.py from .main import app @app.task def add(x, y): return x + y @app.task def mul(x, y): return x * y @app.task def xsum(numbers): …
Azima
  • 3,835
  • 15
  • 49
  • 95
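For the events question above, a sketch close to the real-time monitoring example in the Celery docs; the usual gotcha is that the worker must be started with -E (or have worker_send_task_events / task_send_sent_event enabled), otherwise no events are published and the handlers never fire.

# azima/monitor.py -- assumes `app` is the Celery instance from azima/main.py
from .main import app

def run_monitor():
    state = app.events.State()

    def on_task_succeeded(event):
        state.event(event)
        task = state.tasks.get(event['uuid'])
        print('task succeeded:', task.name, task.uuid)

    with app.connection() as connection:
        recv = app.events.Receiver(connection, handlers={
            'task-succeeded': on_task_succeeded,
            '*': state.event,
        })
        recv.capture(limit=None, timeout=None, wakeup=True)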
3 votes, 1 answer

Is Celery designed to run tasks that can execute when data is pushed in a RabbitMQ queue and consumed?

Architecture I am planning to publish data from IoT nodes via MQTT into a RabbitMQ Queue. The data is then processed and the state needs to be saved into Redis. Current Implementation I spun up a docker container for RabbitMQ and configured it to…
Shan-Desai
  • 3,101
  • 3
  • 46
  • 89
3 votes, 1 answer

Run celery tasks concurrently using pytest

I'm trying to integration test a concurrent celery task in my Django app. I want the task to actually run concurrently on a worker during the pytest integration test but I'm having trouble making that work. Let's say I have the following basic…
Johnny Metz
  • 5,977
  • 18
  • 82
  • 146
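A sketch of one way to do this with Celery's bundled pytest fixtures (celery.contrib.pytest), assuming that plugin is enabled: celery_worker starts a live worker thread inside the test process, so .delay() calls really execute on a worker during the test.

import pytest

# conftest.py would need: pytest_plugins = ('celery.contrib.pytest',)

@pytest.fixture(scope='session')
def celery_config():
    # in-memory broker/backend, so no external services are required
    return {'broker_url': 'memory://', 'result_backend': 'cache+memory://'}

def test_tasks_run_on_a_real_worker(celery_app, celery_worker):
    @celery_app.task
    def echo(x):
        return x

    celery_worker.reload()   # register the task defined above with the worker
    results = [echo.delay(i) for i in range(4)]
    assert [r.get(timeout=10) for r in results] == [0, 1, 2, 3]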
3 votes, 1 answer

Call DRF ViewSet via Celery task

I have a Django Rest Framework ViewSet: class MyModelViewSet(generics.RetrieveUpdateDestroyAPIView): def perform_destroy(self, instance): # do something besides deleting the object Now I'm writing a Celery periodic task that deletes expired…
Noam Gal
  • 1,124
  • 1
  • 11
  • 15
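Rather than invoking the ViewSet from Celery, one common approach for the question above is to move the shared deletion logic into a plain function that both the view and the periodic task call. A sketch with hypothetical names:

from rest_framework import generics
from .celery import app       # hypothetical Celery app module
from .models import MyModel   # hypothetical model

def destroy_with_side_effects(instance):
    # do something besides deleting the object, then delete it
    instance.delete()

class MyModelViewSet(generics.RetrieveUpdateDestroyAPIView):
    queryset = MyModel.objects.all()

    def perform_destroy(self, instance):
        destroy_with_side_effects(instance)

@app.task
def delete_expired():
    for obj in MyModel.objects.filter(expired=True):   # hypothetical filter
        destroy_with_side_effects(obj)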
3 votes, 1 answer

How to log exception using autoretry in Celery tasks

From the Celery documentation: if you want to automatically retry on any error, simply use @app.task(autoretry_for=(Exception,)) def x(): ... How do we log the exception that it retried for? It would be easy to add a log if we had a try-except, for…
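With autoretry_for there is no try/except of your own to log from, but the raised exception is still passed to Task.on_retry (and to the task_retry signal), so one sketch is a custom base class:

from celery import Celery, Task
from celery.utils.log import get_task_logger

app = Celery('proj', broker='redis://localhost:6379/0')
logger = get_task_logger(__name__)

class LoggingTask(Task):
    def on_retry(self, exc, task_id, args, kwargs, einfo):
        # exc is the exception that triggered the automatic retry
        logger.warning('task %s retrying because of %r', task_id, exc)
        super().on_retry(exc, task_id, args, kwargs, einfo)

@app.task(base=LoggingTask, autoretry_for=(Exception,))
def x():
    ...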
3 votes, 2 answers

Celery apply_async pass kwargs to all tasks in chain

A Celery task queue to calculate the result of (2 + 2) - 3. @app.task() def add(**kwargs): time.sleep(5) x, y = kwargs['add'][0], kwargs['add'][1] return x + y @app.task() def sub(**kwargs): time.sleep(5) x = args[0] y =…
Gh0sT
  • 317
  • 5
  • 16
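For the (2 + 2) - 3 example above, a sketch using positional chaining instead of a shared kwargs dict: in a chain each task receives the previous task's return value as its first argument, and any extra arguments have to be bound on each signature, since the kwargs passed to apply_async only reach the first task.

import time
from celery import Celery, chain

app = Celery('proj', broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')

@app.task
def add(x, y):
    time.sleep(5)
    return x + y

@app.task
def sub(total, z):
    time.sleep(5)
    return total - z      # `total` is the result of add()

# (2 + 2) - 3  ->  1
result = chain(add.s(2, 2), sub.s(3)).apply_async()
print(result.get())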