
Is there a way to limit the queue size when I run Celery with the Redis backend?

Something like x-max-length in the queue pre-declaration for RabbitMQ.
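
For context, with the RabbitMQ transport a cap can be set by pre-declaring the queue with the x-max-length argument, roughly like this (a sketch; the broker URL, queue name and limit are just examples):

    from celery import Celery
    from kombu import Queue

    app = Celery('proj', broker='amqp://localhost')  # RabbitMQ broker

    app.conf.task_queues = (
        # RabbitMQ drops the oldest messages once the queue holds 1000
        Queue('my_queue', queue_arguments={'x-max-length': 1000}),
    )

Is there an equivalent when the broker is Redis?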

Maksym Polshcha

3 Answers


As far as I know, that is not possible with Redis as the backend.

DejanLekic

I think you are looking for prefetch limits in Celery. Check out the docs.
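
For example, something like this (a minimal sketch; the app name and broker URL are placeholders):

    from celery import Celery

    app = Celery('proj', broker='redis://localhost:6379/0')

    # Each worker process reserves at most (concurrency * multiplier)
    # unacknowledged tasks at a time.
    app.conf.worker_prefetch_multiplier = 1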

Bharat Gera
  • The prefetch limit is a limit on the number of tasks a worker can reserve for itself. I need to limit my queues. – Maksym Polshcha Oct 09 '19 at 13:59
  • That's interesting! Limiting a queue's size for task execution is something I haven't come across before. Please do share if you find a way to do it. – Bharat Gera Oct 09 '19 at 15:36

This might be a bit hacky, but you could try using a Redis lock when calling the task. That way, if another process wants to call the task, it has to wait for the Redis lock to be released (which happens when either the task finishes running or the timeout is reached). This prevents too many tasks from being added to the queue:

    import redis

    r = redis.Redis()  # the same Redis server your app uses
    with r.lock("some_lock_name", blocking_timeout=10):  # wait up to 10 s for the lock
        your_celery_task.delay()
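
If the lock cannot be acquired before blocking_timeout expires, redis-py raises a LockError, so you may want to handle that case (a sketch reusing the names above):

    from redis.exceptions import LockError

    try:
        with r.lock("some_lock_name", blocking_timeout=10):
            your_celery_task.delay()
    except LockError:
        # the lock was not acquired within 10 seconds; skip or retry
        pass
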
luisgc93