I'm using multiprocessing.Pool to manage tesseract processes (OCRing pages of microfilm). Very often, in a pool of, say, 20 tesseract processes, a few pages are more difficult to OCR, so those processes take much longer than the others. In the meantime the pool just hangs and most of the CPUs are not being leveraged. I want these stragglers to be left to continue, but I also want to start up more processes to fill the many other CPUs that are now lying idle while these few sticky pages finish up. My question: is there a way to load up new processes to leverage those idle CPUs? In other words, can the empty slots in the Pool be filled before waiting for the whole pool to complete?
I could use the async version of starmap (starmap_async) and then start a new pool once the current one has gone down to a certain number of living processes, but this seems inelegant. It would be more elegant to automagically keep slotting in processes as needed. Roughly what I have in mind is the sketch below.
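This is an untested sketch, not my real code: I submit each file with apply_async so I can count how many tasks are still running. fakeWork, the page names, and refillThreshold are all made up for illustration; in my real code the work is the tesseract call in executeSystemProcesses below.

# Untested sketch of the "keep the pool topped up" idea
import time
from multiprocessing import Pool

def fakeWork(fileName):
    time.sleep(1)  # stand-in for a slow OCR run
    return fileName

if __name__ == '__main__':
    fileNames = ['page%03d.tif' % i for i in range(100)]  # made-up input
    concurrentProcesses = 20
    refillThreshold = 5  # refill once fewer than this many tasks are still running

    with Pool(concurrentProcesses) as pool:
        pending = []  # AsyncResult objects for tasks handed to the pool
        while fileNames or pending:
            pending = [r for r in pending if not r.ready()]  # drop finished tasks
            if len(pending) < refillThreshold:
                # slot in new tasks until the pool is back to full
                while fileNames and len(pending) < concurrentProcesses:
                    pending.append(pool.apply_async(fakeWork, (fileNames.pop(0),)))
            time.sleep(0.5)  # crude polling

Even this is only an approximation, though, and it still feels like something the pool ought to be able to do for me.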
Here's what my code looks like right now:
import logging
import os
import subprocess
from multiprocessing import Pool
from string import Template


def getMpBatchMap(fileList, commandTemplate, concurrentProcesses):
    # Pull the next batch of file names off the list, one per worker slot
    mpBatchMap = []
    for i in range(concurrentProcesses):
        fileName = fileList.readline()
        if fileName:
            mpBatchMap.append((fileName, commandTemplate))
    return mpBatchMap


def executeSystemProcesses(objFileName, commandTemplate):
    # Worker: build the shell command for one file and run it
    objFileName = objFileName.strip()
    logging.debug(objFileName)
    objDirName = os.path.dirname(objFileName)
    command = commandTemplate.substitute(objFileName=objFileName, objDirName=objDirName)
    logging.debug(command)
    subprocess.call(command, shell=True)


def process(FILE_LIST_FILENAME, commandTemplateString, concurrentProcesses=3):
    """Go through the list of files and run the provided command against them,
    one at a time. Template string maps the terms $objFileName and $objDirName.

    Example:
    >>> process('fileList.txt', 'convert -scale 256 "$objFileName" "$objDirName/TN.jpg"')
    """
    commandTemplate = Template(commandTemplateString)
    with open(FILE_LIST_FILENAME) as fileList:
        while True:
            # Get a batch of x files to process
            mpBatchMap = getMpBatchMap(fileList, commandTemplate, concurrentProcesses)
            # Process them
            logging.debug('Starting MP batch of %i' % len(mpBatchMap))
            if mpBatchMap:
                with Pool(concurrentProcesses) as p:
                    poolResult = p.starmap(executeSystemProcesses, mpBatchMap)
                    logging.debug('Pool result: %s' % str(poolResult))
            else:
                break
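For context, I call it along these lines (the file list name and the tesseract command here are illustrative, not my exact ones):

process('pageList.txt', 'tesseract "$objFileName" "$objDirName/ocr"', concurrentProcesses=20)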