I am implementing NEAT (an evolutionary algorithm for neural network topologies) and want to run the feed-forward network evaluations in parallel, since they are the bottleneck during training. I am using the MLAgents library to connect to Unity, where simulations are run to evaluate fitness.
My problem is that when I create the worker processes, the MLAgents environment connection is duplicated along with the rest of the parent process's memory into each subprocess, which leads to the original connection being closed. The connection object is completely irrelevant to the task performed during multiprocessing. How can I keep the connection object from being included in each of the subprocesses?
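To illustrate, here is a stripped-down sketch of the behaviour as I understand it (no MLAgents involved; a hypothetical FakeConnection stands in for the Unity connection). On Linux, Pool creates workers with fork, so every worker inherits a copy of the module-level object even though the worker function never touches it:

import multiprocessing as mp

class FakeConnection:  # hypothetical stand-in for the Unity connection
    pass

conn = FakeConnection()  # module-level, like my env object

def work(x):
    # conn is never passed as an argument, yet the forked child has a copy of it
    print(f"pid {mp.current_process().pid} inherited conn {id(conn)}")
    return x * x

if __name__ == "__main__":
    with mp.Pool(2) as pool:
        print(pool.map(work, range(4)))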
I have tried to separate the multiprocessing code into its own file, as seen below, but it behaves the same.
import multiprocessing as mp

def get_action(network, obs, agent_num, queue):
    queue.put([agent_num, network.activate(obs)])

def get_actions(policies,
                fixed_policy,
                fixed_opponent,
                nn_input,
                decision_steps_blue,
                decision_steps_purple,
                agent_count,
                local_to_agent_map):
    # Concurrency things
    num_workers = mp.cpu_count()
    print(f"CPU cores: {num_workers}")
    pool = mp.Pool(processes=num_workers)  # Problem: Unity connection (MLAgents) being duped
    # A plain multiprocessing.Queue cannot be passed to pool workers
    # (it raises RuntimeError when pickled), so use a Manager queue instead
    manager = mp.Manager()
    q = manager.Queue()
    for agent in range(agent_count):
        if local_to_agent_map[agent] in decision_steps_purple or local_to_agent_map[agent] in decision_steps_blue:
            if local_to_agent_map[agent] in decision_steps_blue or not fixed_opponent:
                policy = policies[agent]
            elif fixed_opponent:
                policy = fixed_policy
            pool.apply_async(get_action, args=(policy, nn_input[agent], agent, q))
    pool.close()
    pool.join()
    return q
The connection object is defined globally in the Python script that calls the method above.
from mlagents_envs.environment import UnityEnvironment as UE

env = UE(seed=1, side_channels=[])  # Object to avoid duplicating
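As far as I understand, the usual pickling tricks such as overriding __getstate__ only apply when the object itself is pickled as an argument; they do nothing here, since the Pool workers are forked and inherit the whole address space without any pickling step. A quick sketch of what I mean, with a hypothetical EnvHolder wrapper (not my real code):

import pickle

class EnvHolder:  # hypothetical wrapper standing in for my setup
    def __init__(self):
        self.env = object()  # pretend Unity connection

    def __getstate__(self):
        state = self.__dict__.copy()
        del state["env"]  # dropped when the object is pickled...
        return state

holder = EnvHolder()
restored = pickle.loads(pickle.dumps(holder))
print(hasattr(restored, "env"))  # False: pickling excludes it
# ...but a forked Pool worker copies holder wholesale, env included,
# because fork duplicates memory without pickling anything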
I'm sorry if this is a duplicate; I could not find any posts about excluding objects from the memory of subprocesses, only about sharing objects between processes, which is not what I am looking for. I would greatly appreciate any help I could get!