
In my Python NEAT code I am using OpenCV to downscale and convert every frame of the environment to grayscale. What I want to achieve is for OpenCV to open a window displaying the frame/video that it is processing.

In short, I want to watch the NEAT algorithm learning and evolving.

Because there are 3 environments running in parallel, I want OpenCV to display the frame/video of whichever one is performing best right now.

I am working with the Python NEAT library to do some machine learning tasks. At the moment I am doing parallel learning with 3 threads in the Sonic the Hedgehog environment. I have tried simple OpenCV frame commands, but it just opens a black window.


        net = neat.nn.FeedForwardNetwork.create(self.genome, self.config)

        fitness = 0
        xpos = 0
        xpos_max = 0
        counter = 0
        imgarray = []

        while not done:
            # self.env.render()

            # Downscale the observation and convert it to grayscale
            ob = cv2.resize(ob, (inx, iny))
            ob = cv2.cvtColor(ob, cv2.COLOR_BGR2GRAY)
            ob = np.reshape(ob, (inx, iny))

            # Flatten the 2-D frame into a 1-D input vector for the network
            imgarray = np.ndarray.flatten(ob)

            actions = net.activate(imgarray)

            ob, rew, done, info = self.env.step(actions)

            xpos = info['x']



This is the part of the code that downscales the frame and converts it to grayscale.

Bonus if it could only show the frame/worker that is doing the best based on the fitness value.

View full code here: https://gitlab.com/lucasrthompson/Sonic-Bot-In-OpenAI-and-NEAT/blob/master/neat-paralle-sonic.py by lucasrthompson

The output that I expect is one window that shows the frame/video of the environment.

The built-in render, self.env.render(), pops up many, many windows with past and present versions of the environment.

Thanks

1 Answer


I am writing my own NEAT implementation and I am also testing with OpenAI Gym.

You can use wrappers to record the video for you, and this will be the real video, without downscaling or changing colors:

import gym
from gym import wrappers

env_wrapped = gym.make('OpenAI-env-id')
env = wrappers.Monitor(env_wrapped, output_dir, video_callable=record_video_function)

Where record_video_function is a callable that returns true or false to indicate whether the episode should be recorded.

What I usually do to see the best performing genomes is:

  1. Sort the genomes by fitness
  2. Run the evaluation loop
  3. If the next genome is a species champion from the last generation, set a global variable to True
  4. In record_video_function, return the value of this global variable, so that when it is True video recording is enabled for the episode
  5. After the episode is over, set the global variable back to False
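The steps above can be sketched like this (a gym-independent skeleton; the (genome, fitness, is_champion) tuples and the run_episode callable are hypothetical stand-ins for whatever your NEAT implementation provides — Monitor itself calls video_callable once per episode with the episode index):

```python
# Global flag read by the Monitor's video_callable for each episode.
record_next_episode = False

def record_video_function(episode_id):
    """Passed to gym.wrappers.Monitor as video_callable.

    Monitor calls it with the episode index and records the
    episode only when it returns True.
    """
    return record_next_episode

def evaluate_generation(genomes, run_episode):
    """Evaluate genomes best-first, recording only last-generation champions.

    genomes: list of (genome, fitness, is_champion) tuples -- a
    hypothetical shape; adapt to your NEAT library.
    run_episode: callable that runs one episode for a genome.
    """
    global record_next_episode
    # 1. Sort the genomes by fitness, best first
    ordered = sorted(genomes, key=lambda g: g[1], reverse=True)
    for genome, fitness, is_champion in ordered:
        # 3. Raise the flag just before a champion's episode...
        record_next_episode = is_champion
        # 2./4. Monitor consults record_video_function during this call
        run_episode(genome)
        # 5. ...and lower it again afterwards
        record_next_episode = False

if __name__ == '__main__':
    recorded = []
    def fake_episode(genome):
        # Stand-in for env.reset()/env.step(); notes which episodes
        # the Monitor would have recorded.
        if record_video_function(0):
            recorded.append(genome)
    evaluate_generation([('a', 1.0, False), ('b', 5.0, True), ('c', 3.0, False)],
                        fake_episode)
    print(recorded)  # ['b']
```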

So, with this I can see the best genome performers of last generation. You can't see the best of the current generation because there's no way to know how they will perform. If the environment is deterministic, you would be able to see the best performance in the next generation. If it's stochastic, then it may not be the best anymore.

Rulo Mejía
  • That's very cool :)) I will have to try out how that process works with NEAT parallel learning (3 at a time). I don't know if you can use env_wrapped with parallel learning, but awesome. It would be awesome to be able to record the winning genome. – Jim Hoggey May 25 '19 at 04:14