Here is my program, which uses multiprocessing to process some images:
import multiprocessing as mp
import tqdm

def image_process(path):
    # process a single image at the given path
    # ...
    return

def main():
    images = []  # some images' dir paths
    result = []
    pool = mp.Pool(mp.cpu_count())
    for temp in tqdm.tqdm(pool.imap(image_process, images), total=len(images)):
        result.append(temp)

if __name__ == "__main__":
    main()
It works fine when I test it in PyCharm, with no crashes. But once I compile it with PyInstaller and run the resulting .exe, it eats up all of my 32 GB of RAM in a short time and then Windows crashes. The tqdm bar stays at 0 throughout the run. It crashes so fast that I have no chance to debug it. The only thing I noticed is that the number of processes shown on the "CPU" page of Windows Task Manager increases rapidly. It seems the program just keeps creating new processes without ever cleaning up the old ones.
Here is what I have tried:
pool.close()
pool.join()
I read some questions about this problem on Stack Overflow and tried adding these two lines inside the for loop, right after append(), but the problem still exists.
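To be explicit about where I put them, here is roughly what the program looks like with those two lines added (the image list is still just a placeholder, and image_process is the same as above):

import multiprocessing as mp
import tqdm

def image_process(path):
    # ... same processing as above ...
    return

def main():
    images = []  # placeholder: some images' dir paths
    result = []
    pool = mp.Pool(mp.cpu_count())
    for temp in tqdm.tqdm(pool.imap(image_process, images), total=len(images)):
        result.append(temp)
        # the two lines I added, inside the loop, right after append():
        pool.close()
        pool.join()

if __name__ == "__main__":
    main()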