
I use Django with Celery and Redis to run asynchronous tasks. I have three tasks defined, each of which should run in its own queue.

My project structure looks like this:

django-project
   |- api
      |- task.py
      |- view.py
   |- django-project
      |- settings.py
      |- celery.py
      |- __init__.py

My tasks are defined in task.py in my api app:

from celery import shared_task
from celery.schedules import crontab
from celery.task import periodic_task  # deprecated module in newer Celery versions

from .models import Website  # assumed location of the Website model
# Proxy is a project-specific helper class (import not shown here)


@shared_task
def manually_task(website_id):
    print("manually_task")
    website = Website.objects.get(pk=website_id)
    x = Proxy(website, "49152")
    x.startproxy()
    x = None


@periodic_task(run_every=(crontab(hour=19, minute=15)), ignore_result=True)
def periodically_task():
    websites = Website.objects.all()

    for website in websites:
        x = Proxy(website, "49153")
        x.startproxy()
        x = None


@shared_task
def firsttime_task(website_id):
    website = Website.objects.get(pk=website_id)
    x = Proxy(website, "49154")
    x.startproxy()
    x = None

Now here is my __init__.py:

from .celery import app as celery_app

__all__ = ('celery_app',)

and the celery settings in the settings.py:

CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Europe/Berlin'
CELERY_DEFAULT_QUEUE = 'red'
# Queue and Exchange are imported at the top of settings.py:
# from kombu import Queue, Exchange
CELERY_TASK_QUEUES = (
    Queue('red', Exchange('red'), routing_key='red'),
)
CELERY_ROUTES = {
    'api.tasks.manually_task': {'queue': 'red'},
}
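Since the goal is one queue per task, the queue and routing settings would need one entry per task. A sketch of what that could look like (the extra queue names here are illustrative, not from my actual settings, and the task paths must match the names Celery registers from task.py):

```python
from kombu import Queue, Exchange

CELERY_TASK_QUEUES = (
    Queue('red', Exchange('red'), routing_key='red'),
    Queue('green', Exchange('green'), routing_key='green'),
    Queue('blue', Exchange('blue'), routing_key='blue'),
)
CELERY_ROUTES = {
    'api.tasks.manually_task': {'queue': 'red'},
    'api.tasks.periodically_task': {'queue': 'green'},
    'api.tasks.firsttime_task': {'queue': 'blue'},
}
```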

My celery.py looks like this:

import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'django-project.settings')

app = Celery('django-project')

app.config_from_object('django.conf:settings', namespace='CELERY')

app.autodiscover_tasks()

Those were my settings. Now I start all the needed services (each line in its own terminal):

redis-server
celery -A django-project worker -Q red
python3 manage.py runserver 0.0.0.0:8000

Everything starts without problems. In the view I call the task like this: manually_task.delay(webseite.pk)

But the worker does nothing. If I try this without the CELERY_TASK_QUEUES, CELERY_DEFAULT_QUEUE and CELERY_ROUTES settings in settings.py and start the worker normally with celery -A django-project worker, it works fine. What am I doing wrong?

Basti G.

1 Answer


manually_task.delay(webseite.pk) sends the task to the default queue. Since your worker is subscribed to the red queue, and presumably no worker is subscribed to the default queue, the task never gets executed.

Try the following instead: manually_task.apply_async((webseite.pk,), queue="red") — note that apply_async takes the task arguments as a tuple, unlike delay.
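To see why delay() rejects a queue= argument while apply_async() accepts it, here is a minimal pure-Python sketch that mimics the two calling conventions (an illustration only, not Celery's real implementation):

```python
def manually_task(website_id):
    return website_id

def delay(task, *args, **kwargs):
    # Celery's Task.delay(*args, **kwargs) forwards everything to the
    # task body, so delay(pk, queue="red") passes queue= into the task
    # itself -> "got an unexpected keyword argument 'queue'".
    return task(*args, **kwargs)

def apply_async(task, args=(), kwargs=None, queue=None, **options):
    # Celery's Task.apply_async(args, kwargs, **options) keeps routing
    # options such as queue= separate from the task's own arguments.
    return queue, task(*args, **(kwargs or {}))

print(apply_async(manually_task, args=(42,), queue="red"))  # ('red', 42)
```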

DejanLekic
  • This returns `manually_task() got an unexpected keyword argument 'queue'`. – Basti G. Oct 03 '19 at 10:55
  • If I try to start two workers, for example like this: `celery -A django-project worker -Q manually_task --concurrency=1` and `celery -A django-project worker -Q firsttime_task --concurrency=1`, I get an error for the first started worker: `Probably the key ('_kombu.binding.reply.celery.pidbox') has been removed from the Redis database`. Why? – Basti G. Oct 03 '19 at 11:29
  • 1
    I solved the `manually_task() got an unexpected keyword argument 'queue'` issue: delay() does not accept options like `queue`, so I use `apply_async` instead. – Basti G. Oct 03 '19 at 11:39
  • OK, but why do I get the error when I start the workers with different queues? – Basti G. Oct 03 '19 at 12:31
  • You did not mention any error. All you said is that the task does not get executed. If you have some other worker subscribed to a different queue, then you need to make sure that the related tasks are sent to that queue. – DejanLekic Oct 03 '19 at 12:55
  • That's not correct, I wrote about the error in the comments under this answer. And yes, your solution works, thanks! But why do I get the error when I start two workers? – Basti G. Oct 03 '19 at 13:53
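Regarding the pidbox error when running two workers: a plausible cause (an assumption, the thread does not confirm it) is that both workers start under the same default node name, so their reply/pidbox bindings in Redis clash. Giving each worker a distinct node name with -n, one per queue, avoids that (queue names here are illustrative):

```shell
celery -A django-project worker -Q red  -n worker-red@%h  --concurrency=1
celery -A django-project worker -Q blue -n worker-blue@%h --concurrency=1
```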