I need to start multiple workers for only one of my queues (the "high" priority one below). How can I do this with the worker script I am using to start my workers?

from config import Config
from redis import from_url
from rq import Worker, Queue, Connection

# Queues are listed in priority order; each worker drains "high" first.
listen = ['high', 'mid', 'low']

conn = from_url(Config.REDIS_URL)

if __name__ == '__main__':
    with Connection(conn):
        worker = Worker(list(map(Queue, listen)), log_job_description=True)
        worker.work()
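
For reference, here is a minimal sketch of the same script taking its queue names from the command line instead of hardcoding them, so one module can serve differently-scoped workers (this assumes the same RQ version as above, where the Connection context manager is still available):

import sys

from config import Config
from redis import from_url
from rq import Worker, Queue, Connection

# Queue names come from the command line, e.g.:
#   python -m worker high
#   python -m worker high mid low
# Falls back to all three queues when no arguments are given.
listen = sys.argv[1:] or ['high', 'mid', 'low']

conn = from_url(Config.REDIS_URL)

if __name__ == '__main__':
    with Connection(conn):
        worker = Worker(list(map(Queue, listen)), log_job_description=True)
        worker.work()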

The worker script itself is launched by a supervisor process that spawns two worker instances, each listening on all of my queues.


[supervisord]

[program:worker]
command=python -m worker
process_name=%(program_name)s-%(process_num)s
numprocs=2
directory=.
stopsignal=TERM
autostart=true
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
redirect_stderr=true

If I want to have 3 workers ready for my "high" queue but only 2 for the "mid" and "low" queues, how do I go about achieving this?
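
One way to do this would be to split the single [program:worker] section into two programs with different numprocs values, using the argv-based variant of the script above (the program names and queue split here are just an illustration):

[supervisord]

[program:worker-high]
command=python -m worker high
process_name=%(program_name)s-%(process_num)s
numprocs=3
directory=.
stopsignal=TERM
autostart=true
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
redirect_stderr=true

[program:worker-rest]
command=python -m worker mid low
process_name=%(program_name)s-%(process_num)s
numprocs=2
directory=.
stopsignal=TERM
autostart=true
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
redirect_stderr=true

Since RQ workers drain queues in the order they are listed, the second group could also be given "high mid low" to keep the original priority behaviour while still guaranteeing three dedicated "high" workers.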

I tried starting workers in "burst" mode, but that also kills the workers when there are not enough jobs. I could live with a solution that autoscales the workers the way burst mode does, as long as it keeps at least ONE worker alive at all times.
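
A rough sketch of that autoscaling idea: keep one always-on worker under supervisor, and have a small controller fork extra burst workers only while the queue is backed up. Queue.count and work(burst=True) are real RQ APIs, but the queue name, threshold, polling interval, and helper below are hypothetical:

import multiprocessing
import time

from config import Config
from redis import from_url
from rq import Worker, Queue, Connection

QUEUE_NAME = 'high'      # hypothetical: the queue to autoscale
BACKLOG_THRESHOLD = 10   # hypothetical: waiting jobs before adding a worker

def run_burst_worker():
    # A burst worker drains the queue and exits once it is empty.
    conn = from_url(Config.REDIS_URL)
    with Connection(conn):
        Worker([Queue(QUEUE_NAME)]).work(burst=True)

if __name__ == '__main__':
    conn = from_url(Config.REDIS_URL)
    queue = Queue(QUEUE_NAME, connection=conn)
    # The always-on worker stays under supervisor; this loop only adds
    # temporary burst workers. Naive: it may fork another burst worker
    # on every pass while the backlog persists.
    while True:
        if queue.count > BACKLOG_THRESHOLD:
            multiprocessing.Process(target=run_burst_worker).start()
        time.sleep(5)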

Anubhav
