My main goal is to open 30 child processes from the parent process, and then open an unknown number of new processes from each of those 30 children. From those second-level processes I call Redis for some location data, and I do not know in advance how many calls there will be; it could be 100 or more than 1000. When I go past roughly 1000 simultaneous processes I hit the open file descriptor limit and get this error:
OSError: [Errno 24] Too many open files
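For what it is worth, the limit I am hitting is the per-process open file descriptor limit (what ulimit -n reports), not anything to do with memory pages. A quick way to check it from Python on Linux (just a sanity check, not part of my template):

import resource

# soft and hard limits on open file descriptors for this process (ulimit -n)
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("soft: {} | hard: {}".format(soft, hard))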
I don't want to manually raise the file descriptor limit on the production server. Instead, I want to throttle process creation so that at no point are more than 1000 connections open (see the sketch after my template code below).
Here is my template code:
import multiprocessing
import time
from multiprocessing.dummy import Pool
from random import randint


class MultiProcessing():
    def second_calculation(self, index1, index2):
        random = randint(1, 10)
        time.sleep(random)
        print("Slept for: {} seconds".format(random))
        print("Call done: index: {} | index2: {}".format(index1, index2))

    def calculation(self, index):
        child_process = list()
        random = randint(1, 5)
        time.sleep(random)
        print("Slept for: {} seconds".format(random))
        counter = 0
        for i in range(0, 1500):
            counter += 1
            new_child_process = multiprocessing.Process(target=self.second_calculation, args=(index, counter))
            child_process.append(new_child_process)
            new_child_process.start()

        for process in child_process:
            process.join()

        print("Request done: {}".format(index))


if __name__ == '__main__':
    index = 0
    parent_process = list()
    m = MultiProcessing()

    for i in range(0, 30):
        index += 1
        print("Index: {}".format(index))
        new_process = multiprocessing.Process(target=m.calculation, args=(index,))
        parent_process.append(new_process)
        new_process.start()

    for process in parent_process:
        process.join()
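For reference, this is roughly the kind of throttle I have in mind (only a sketch, not working code; MAX_WORKERS, the cap of 8, and redis_lookup are placeholder names I made up for illustration). Instead of starting 1500 Process objects per parent, each parent would push its calls through a multiprocessing.Pool with a fixed worker count, so no parent ever has more than that many children (and their pipes) open at once:

import multiprocessing

MAX_WORKERS = 8  # placeholder per-parent cap; the real value would be tuned


def redis_lookup(args):
    # placeholder for the actual Redis location lookup
    index1, index2 = args
    print("Call done: index: {} | index2: {}".format(index1, index2))


def calculation(index):
    # The Pool never runs more than MAX_WORKERS child processes at a time,
    # so the number of open file descriptors per parent stays bounded.
    jobs = [(index, counter) for counter in range(1, 1501)]
    with multiprocessing.Pool(processes=MAX_WORKERS) as pool:
        pool.map(redis_lookup, jobs)


if __name__ == '__main__':
    parents = []
    for index in range(1, 31):
        p = multiprocessing.Process(target=calculation, args=(index,))
        parents.append(p)
        p.start()
    for p in parents:
        p.join()

With 30 parents and 8 workers each, only around 270 processes would be alive at any moment, which stays well under the limit I keep hitting.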
Thank you.