I have been trying to set up my scheduled task to run every x minutes, but I am not making any progress. I have read the documentation and tried different approaches to the problem, but I am obviously still doing something…
Celery currently has parameters for a broker and a backend. Am I able to add even more nodes?
I have 5 nodes. How would I reference them in the configuration file, celeryapp.py?
app = Celery('mytaks',
broker='redis://clusternode1/0',
…
I'm trying to set up Django-celery-beat in order to create periodic tasks.
My configuration is as follows:
from celery import Celery
import os
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'config.settings.local')
celery =…
I can't init my celeryd and celerybeat services. I used the same code in another environment (configuring everything from the start), but here it doesn't work. I think this is caused by permissions, but I couldn't get it to run. Please help me.
this is my celery conf on…
I am trying to set up a server to run multiple workers on Ubuntu using celery. I set up the daemon using the generic init scripts, with RabbitMQ as the broker.
celery==3.1.23
django-celery==3.1.17
django-celery-beat==1.0.1
/etc/default/celeryd -…
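The generic init script reads its settings from /etc/default/celeryd; a minimal sketch for multiple workers (paths, app name, and user are hypothetical, while the variable names follow the celeryd generic-script conventions):

```shell
# Hypothetical /etc/default/celeryd for three workers
CELERYD_NODES="worker1 worker2 worker3"
CELERY_BIN="/usr/local/bin/celery"
CELERY_APP="proj"
CELERYD_OPTS="--time-limit=300 --concurrency=4"
# %n expands to the node name, %I to the child process index
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"
CELERYD_USER="celery"
CELERYD_GROUP="celery"
```

The log and pid directories must exist and be writable by `CELERYD_USER`, which is a frequent cause of silent startup failures.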
Hello, I have created a periodic task in the database using django-celery-beat,
then tried to run celery with beat using the following command:
celery -A proj worker -B -l info
but beat doesn't send the task to celery.
I added debug-level logging; maybe someone knows how to fix…
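One thing worth checking, as a sketch: django-celery-beat stores the schedule in the database, and beat only reads those rows when started with the database scheduler (`proj` is a placeholder for the actual project module):

```shell
# Run worker plus embedded beat with django-celery-beat's database scheduler
celery -A proj worker -B -l info \
    --scheduler django_celery_beat.schedulers:DatabaseScheduler
```

Without the `--scheduler` flag, beat falls back to the file-based default scheduler and ignores the database entries.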
I need the following workflow for my celery tasks:
when taskA finishes with success, I want to execute taskB.
I know there is the @task_success signal, but it returns only the task's result, and I need access to the previous task's arguments. So I…
I want to create multiple copies of an image and resize them using celery after the original image has been saved.
def save_model(self, request, obj, form, change):
    updated = change
    super().save_model(request, obj, form, change)
    …
How can I set up Celery on a production server using AWS or DigitalOcean, with Redis or RabbitMQ as the broker?
Please elaborate on how we can build resilience against the connection-refused error that occurs while the broker is down.
Background
The system is CentOS 7, which has Python 2.x, 1 GB of memory, and a single core.
I installed Python 3.x, which I can invoke as python3.
The django-celery project is based on a Python 3.x virtualenv, and I had it working with nginx, uWSGI, and MariaDB.…
I have the following code
from utils import SendSMS
from celery.exceptions import *
@celery.task(bind=True, max_retries=3)
def send_sms(self, sms_list):
    failed_items = []
    for sms_item in sms_list:
        status = SendSMS(**sms_item)
…
After upgrading to Celery 4.1.0, I get this error while running my Django app:
Missing connection string! Do you have the database_url setting set to a real value?
I would like to verify that setting time limits for celery tasks works.
I currently have my configuration looking like this:
CELERYD_TASK_SOFT_TIME_LIMIT = 30
CELERYD_TASK_TIME_LIMIT = 120
task_soft_time_limit = 29
task_time_limit = 44…
I have a receiver for the post_save signal that detects and restricts multiple active access tokens from the same user.
In addition to this, I want to lock the detected user out for a minute and then reactivate them using the is_active field…