I have multiprocessing.managers.ListProxy objects, and each ListProxy contains numpy ndarrays. After concatenating and saving them, the file is almost 700 MB.
So far I have done the concatenation and the save to file in a child process, but the parent's join() has to wait for that child to finish. The child process that concatenates and saves the file takes 5 times longer than computing those lists. A simplified sketch of my current setup is below.
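This is roughly what I do now (simplified; `compute`, `save_worker`, and the sizes are just placeholders, not my real code):

    import multiprocessing as mp
    import numpy as np

    def compute(results):
        # each worker appends one ndarray to the shared ListProxy
        results.append(np.random.rand(1000, 1000))

    def save_worker(results, path):
        # this child concatenates and saves; it takes ~5x longer than the computation
        data = np.concatenate(list(results))
        np.save(path, data)

    if __name__ == "__main__":
        manager = mp.Manager()
        results = manager.list()  # multiprocessing.managers.ListProxy

        workers = [mp.Process(target=compute, args=(results,)) for _ in range(4)]
        for w in workers:
            w.start()
        for w in workers:
            w.join()

        saver = mp.Process(target=save_worker, args=(results, "out.npy"))
        saver.start()
        saver.join()  # the parent is blocked here until the save finishes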
I think subprocess.Popen() is a solution to the problem of the long execution time, because the parent would not have to wait for it.
How can I pass a large multiprocessing.managers.ListProxy (700 MB) to subprocess.Popen?
Would json be a fast way to do it? Can I pass the ListProxy directly to json.dump, so that the subprocess can read it back with json.load?
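This is roughly what I have in mind (I am not sure it is correct or fast; `child_save.py` is a placeholder name for the script that would do the saving, and I convert the arrays with .tolist() because json cannot serialize ndarrays directly):

    # parent side: stream the data to a separate process via its stdin
    import json
    import subprocess
    import sys

    def hand_off(results):
        # results is the ListProxy; copy it into plain lists so json can handle it
        payload = [arr.tolist() for arr in results]
        child = subprocess.Popen([sys.executable, "child_save.py"],
                                 stdin=subprocess.PIPE, text=True)
        json.dump(payload, child.stdin)
        child.stdin.close()  # parent continues without join()-ing the child

    # child_save.py would then read the JSON from stdin, concatenate and save:
    #   import sys, json
    #   import numpy as np
    #   data = json.load(sys.stdin)
    #   np.save("out.npy", np.concatenate([np.array(a) for a in data]))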