
I'm testing Celery in a local environment. My Python file has the following two lines of code:

celery_app.send_task('tasks.test1', args=[self.id], kwargs={})
celery_app.send_task('tasks.test2', args=[self.id], kwargs={})

Looking at the console output, the tasks appear to run sequentially: test2 only starts after test1 has finished. At least that is how it looks from the console output.

These tasks have no dependencies on each other, so I don't want one task to wait for the other to complete before moving on to the next line.

How can I execute both tasks at the same time?

For reference, here is the worker's startup banner:

---- **** -----
--- * ***  * -- Darwin-14.0.0-x86_64-i386-64bit
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         tasks:0x104cd8c10
- ** ---------- .> transport:   sqs://123
- ** ---------- .> results:     disabled
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ----
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery
Prometheus

2 Answers


There are multiple ways to achieve this.

1. Single Worker - Single Queue.

$ celery -A my_app worker -l info -c 2 -n my_worker

This will start a single worker that executes up to 2 tasks at the same time.
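
All the commands in this answer assume a task module importable as my_app. A minimal sketch of such a module follows; the broker URL and the sleep calls are illustrative assumptions, not taken from the question:

# my_app.py - minimal sketch; the broker URL is an assumption
import time

from celery import Celery

celery_app = Celery('my_app', broker='redis://localhost:6379/0')

@celery_app.task(name='tasks.test1')
def test1(obj_id):
    time.sleep(5)  # simulate slow work
    return obj_id

@celery_app.task(name='tasks.test2')
def test2(obj_id):
    time.sleep(5)  # simulate slow work
    return obj_id

With -c 2, the two tasks run in separate pool processes, so two tasks sent back to back finish in roughly the time of one.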

2. Multiple Workers - Single Queue.

$ celery -A my_app worker -l info -c 1 -n my_worker1
$ celery -A my_app worker -l info -c 1 -n my_worker2

This will start two workers, each executing one task at a time. Note that both workers consume from the same queue.
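
To confirm that the tasks are being split across the two workers, you can list what each worker is currently executing with Celery's built-in inspection command:

$ celery -A my_app inspect active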

3. Multiple Workers - Multiple Queues.

$ celery -A my_app worker -l info -c 1 -n my_worker1 -Q queue1
$ celery -A my_app worker -l info -c 1 -n my_worker2 -Q queue2

This will start two workers, each executing one task at a time, but here you have to route the tasks to the appropriate queues:

celery_app.send_task('tasks.test1', args=[self.id], kwargs={}, queue='queue1')
celery_app.send_task('tasks.test2', args=[self.id], kwargs={}, queue='queue2')
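
Instead of passing queue= on every send_task call, the routing can also be configured once on the app. A sketch, assuming the same celery_app instance used in the question:

celery_app.conf.task_routes = {
    'tasks.test1': {'queue': 'queue1'},
    'tasks.test2': {'queue': 'queue2'},
}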

4. Single Worker - All Queues

$ celery -A my_app worker -l info -n my_worker1

If you don't mention any queue with -Q, the worker will consume from all configured queues by default.
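
For the worker to know about queue1 and queue2 without the -Q flag, the queues have to be declared in the app configuration. A sketch using kombu's Queue class (the queue names follow option 3 above):

from kombu import Queue

celery_app.conf.task_queues = (
    Queue('queue1'),
    Queue('queue2'),
)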

Chillar Anand
  • @user1012513 You can run `celery -A my_app worker -l info -c 1 -n my_worker1 -Q queue1,queue2,queue3` – Chillar Anand Aug 21 '19 at 15:43
  • If you do not pass the -Q flag, the workers read from all the queues by default. – Aditya Nagesh Feb 27 '20 at 09:29
  • @AdityaNagesh Yes, you are right. Updated in answer. – Chillar Anand Feb 27 '20 at 13:17
  • @ChillarAnand, in my case even when I use -c 2 in celery worker command, one task is blocking another task. The command I am using is ```celery worker -A celery_app -P gevent -l info -c 2```. Can you let me know what else should I change so that both tasks can be run in parallel? – sattva_venu Sep 16 '20 at 16:49
  • @sattva_venu did you find a solution? I'm having the same issue – Guilherme Mar 17 '21 at 17:24

Call the worker with the --autoscale option, which scales the number of processes up and down as required.

--autoscale AUTOSCALE
                       Enable autoscaling by providing max_concurrency,
                       min_concurrency. Example:: --autoscale=10,3 (always
                       keep 3 processes, but grow to 10 if necessary)

Example:

celery -A sandbox worker --autoscale=10,0 --loglevel=info
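
With --autoscale=10,0 the pool keeps no idle processes but grows on demand, so two tasks arriving together each get their own process and run in parallel.
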
All Іѕ Vаиітy