
I currently have multiple python-rq workers executing jobs from a queue in parallel. Each job also utilizes the python multiprocessing module.

The code that enqueues a job is simply this:

from redis import Redis
from rq import Queue
q = Queue('calculate', connection=Redis())

job = q.enqueue(calculateJob, someArgs)

And calculateJob is defined as follows:

import multiprocessing as mp
from functools import partial

def calculateJob(someArgs):
    # Pool() with no argument starts one worker process per CPU core
    pool = mp.Pool()
    func = partial(someFunc, someArgs=someArgs)
    # someIterable: the collection of items to fan out across the pool
    result = pool.map(func, someIterable)
    pool.close()
    pool.join()
    return result

def someFunc(item, someArgs):
    # do something with item
    return output

So presumably when a job is being processed, all cores are automatically being utilized by that job. How does another worker processing another job in parallel execute its job if the first job is utilizing all cores already?
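
For reference, mp.Pool() with no processes argument starts one worker process per CPU (os.cpu_count()), which is where that assumption comes from:

import multiprocessing as mp
import os

print(os.cpu_count())   # number of cores, e.g. 8
pool = mp.Pool()        # starts that many worker processes by default
pool.close()
pool.join()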

Anubhav

1 Answer


It depends on how your operating system schedules processes. Just as opening a video plus five other programs doesn't freeze a 6-core computer, the OS time-slices the available cores among every runnable process, so two jobs that each spawn a full-size pool simply share the CPUs rather than one blocking the other. Each rq worker is its own process (a fork of a process, really). Instead of doing multiprocessing inside a job, you can enqueue each piece of work as its own job and let rq handle the parallelism by spawning multiple workers, as sketched below.
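
A minimal sketch of that approach, assuming a hypothetical per-item task function process_item (standing in for the work someFunc does inside the pool) and an iterable items:

from redis import Redis
from rq import Queue

# process_item is a hypothetical per-item task function; it replaces the
# pool.map(someFunc, ...) call that used to run inside calculateJob.
from tasks import process_item

q = Queue('calculate', connection=Redis())

# One small job per item instead of one big job that spawns its own pool.
jobs = [q.enqueue(process_item, item, someArgs) for item in items]

# Parallelism now comes from the number of worker processes you start,
# e.g. one "rq worker calculate" process per core.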

NONONONONO