I am trying to run 1200 iterations of a function with different values using multiprocessing. Is there a way to set the priority and CPU affinity of the worker processes from within the function itself? Here is an example of what I am doing:

with multiprocessing.Pool(processes=3) as pool:
    r = pool.map(func, (c for c in combinations))

I want each of the 3 processes to have high priority using psutil, and the CPU affinity to be specified. While I can use psutil.Process().HIGH_PRIORITY_CLASS within func, how should I specify different affinities for the three processes?

1 Answer

I would use the initializer function in mp.Pool:

import multiprocessing as mp

# Same priority for every child: the initializer runs once inside each
# worker process right after it starts.
def init_priority(prio_level):
    set_prio(prio_level)  # your priority-setting helper (e.g. built on psutil)

if __name__ == "__main__":
    with mp.Pool(nprocs, initializer=init_priority, initargs=(prio_level,)) as p:
        p.map(...)
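
set_prio here is just a placeholder; with psutil it could be as simple as the sketch below (psutil.HIGH_PRIORITY_CLASS is a Windows-only constant, on Unix you would pass a niceness value instead):

import psutil

def set_prio(prio_level):
    # nice() applies to the calling process, i.e. the worker running the initializer
    psutil.Process().nice(prio_level)

# in the parent, for example:
# prio_level = psutil.HIGH_PRIORITY_CLASS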

# Different priority for each child (this may not be very useful,
# because you cannot choose which child will pick up each "task").

def init_priority(q):
    prio_level = q.get()  # each worker takes one value off the queue
    set_prio(prio_level)

if __name__ == "__main__":
    q = mp.Queue()
    for _ in range(nprocs):  # put one prio_level on the queue per worker
        q.put(prio_level)
    with mp.Pool(nprocs, initializer=init_priority, initargs=(q,)) as p:
        p.map(...)
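
The same queue trick covers the affinity part of your question: put one (priority, core list) pair on the queue per worker and apply it in the initializer with psutil.Process().cpu_affinity(). A sketch under those assumptions (HIGH_PRIORITY_CLASS is Windows-only, cpu_affinity() is not available on macOS, and func/combinations below are stand-ins for your real ones):

import multiprocessing as mp
import psutil

def func(c):                       # stand-in for your real work function
    return c * c

def init_worker(q):
    prio, cores = q.get()          # one (priority, affinity) pair per worker
    proc = psutil.Process()
    proc.nice(prio)                # e.g. psutil.HIGH_PRIORITY_CLASS on Windows
    proc.cpu_affinity(cores)       # pin this worker to the listed cores

if __name__ == "__main__":
    combinations = range(1200)     # stand-in for your 1200 parameter sets
    q = mp.Queue()
    for cores in ([0], [1], [2]):  # one core list per worker
        q.put((psutil.HIGH_PRIORITY_CLASS, cores))
    with mp.Pool(3, initializer=init_worker, initargs=(q,)) as pool:
        r = pool.map(func, (c for c in combinations))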

If you need some child processes to run at high priority and others at low priority, and you need to be able to tell them apart easily, I would skip mp.Pool and just use your own Process objects.
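
A minimal sketch of that approach, assuming the same Windows psutil priority constants as above and hypothetical stand-ins for func and the 1200 inputs:

import multiprocessing as mp
import psutil

def func(c):                          # stand-in for the real work function
    return c * c

def worker(prio, cores, tasks, results):
    proc = psutil.Process()
    proc.nice(prio)                   # this worker's own priority
    proc.cpu_affinity(cores)          # and its own core list
    for c in iter(tasks.get, None):   # None is the shutdown sentinel
        results.put(func(c))

if __name__ == "__main__":
    tasks, results = mp.Queue(), mp.Queue()
    configs = [(psutil.HIGH_PRIORITY_CLASS,   [0]),   # one (priority, cores) pair
               (psutil.NORMAL_PRIORITY_CLASS, [1]),   # per worker, chosen by you
               (psutil.NORMAL_PRIORITY_CLASS, [2])]
    workers = [mp.Process(target=worker, args=(prio, cores, tasks, results))
               for prio, cores in configs]
    for w in workers:
        w.start()
    for c in range(1200):             # stand-in for the question's inputs
        tasks.put(c)
    for _ in workers:
        tasks.put(None)               # one sentinel per worker
    r = [results.get() for _ in range(1200)]  # drain results before joining
    for w in workers:
        w.join()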
