
I'm trying to use genetic algorithms to automate the design of a neural network. I'm very new to neural networks and TensorFlow, so excuse me if I fail to provide the necessary information or explain things correctly. I have multiple issues I'm trying to address.

My input is an array of float values:

self.data_inputs = np.array([self.car_location, self.car_velocity, self.ball_location]).astype(np.float64)

My desired output is this:

self.desired_output = np.asarray([1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0])

That is, I want the neural network's output layer (which uses softmax) to produce scores close to 1 at those specific positions.

First question, what should I define the output to be (for the neural network)? Currently it is defined as:

output_layer = tensorflow.keras.layers.Dense(13, activation="softmax", name="output")

Second question, I defined my network to be generated as such:

        input_layer = tensorflow.keras.layers.InputLayer(3, name="input")
        dense_layers = []
        output_layer = tensorflow.keras.layers.Dense(13, activation="softmax", name="output")
        self.text = "Creating new generation"
        if (len(model_array) == 0): # generate random population
            for individual in range(self.population_size):
                chosen_input = random.randint(3, 60)
                input_for_dense_layer = tensorflow.keras.layers.InputLayer(chosen_input)
                dense_layers.append(input_for_dense_layer)
                index = 1
                for i in range(random.randint(1,5)):
                    dense_layer = tensorflow.keras.layers.Dense(chosen_input, activation = "relu")
                    dense_layers.append(dense_layer)
                    chosen_input = random.randint(3, 60)
                    index += 1

                model = tensorflow.keras.Sequential()
                model.add(input_layer)
                for dense_layer in dense_layers:
                    model.add(dense_layer)
                model.add(output_layer)
                model.compile(optimizer=random.choice(self.optimizer_array), loss=random.choice(self.loss_array), metrics=['accuracy'])
                model_array.append(model)

But this generates an error:

    ValueError: Input 0 of layer dense_1 is incompatible with the layer: expected axis -1 of input shape to have value 3 but received input with shape (None, 1)

Can anyone explain to me how I'm not connecting these layers together properly? From what I can tell (and test), it seems to be working, but when I launch it within the context of the API I'm trying to use, it throws this error. Did I just not test extensively enough?

  • Does this answer your question? [Input 0 of layer sequential is incompatible with the layer: expected axis -1 of input shape to have value 784](https://stackoverflow.com/questions/66190989/input-0-of-layer-sequential-is-incompatible-with-the-layer-expected-axis-1-of) – Galletti_Lance Nov 03 '21 at 01:32
  • @Galletti_Lance I don't think so. I'm testing it out now though to be sure, thanks for your reply. – wookieluvr49 Nov 03 '21 at 03:32
  • @Galletti_Lance Can you try explaining that post more? I don't really understand it – wookieluvr49 Nov 03 '21 at 04:53

1 Answer

  1. A softmax output returns 13 positive values whose sum is 1. Here you seem to want independent probabilities between 0 and 1 for each of your 13 outputs, so you should use activation='sigmoid' instead, which does what you want.
  2. Your models have two InputLayers (input_layer and input_for_dense_layer), which probably causes the shape mismatch reported for the first Dense layer.
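Putting both points together, here is a minimal sketch of one way to build each individual using the functional API. This is an illustration, not your exact code: `adam` and `binary_crossentropy` stand in for your random picks from `optimizer_array`/`loss_array`, and fresh layers are created for every individual (note that in your loop, `dense_layers` is never cleared between individuals, so layers accumulate and get reused across models).

```python
import random
import tensorflow as tf

def build_random_model(n_inputs=3, n_outputs=13):
    """Build one random individual: exactly one input layer, random
    hidden depth/widths (ranges taken from the question), sigmoid output."""
    inputs = tf.keras.Input(shape=(n_inputs,))  # the only input layer
    x = inputs
    for _ in range(random.randint(1, 5)):       # random depth, as in the question
        x = tf.keras.layers.Dense(random.randint(3, 60), activation="relu")(x)
    # sigmoid -> 13 independent scores in [0, 1]; softmax would force them to sum to 1
    outputs = tf.keras.layers.Dense(n_outputs, activation="sigmoid", name="output")(x)
    model = tf.keras.Model(inputs, outputs)
    # placeholders for random.choice(self.optimizer_array) / random.choice(self.loss_array)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

population = [build_random_model() for _ in range(5)]  # fresh layers for every model
```

Because every call to `build_random_model` constructs new layer objects, no layer is shared between individuals, and each model has a single, well-defined input shape.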
Valentin Goldité
  • Thanks for your reply. So, input_for_dense_layer was added because it was complaining that it couldn't take the output from one layer, and transfer it to the next layer. Do you want me to comment out input_for_dense_layer so I can get you the specific error? – wookieluvr49 Nov 03 '21 at 03:25
  • Yes why not .... – Valentin Goldité Nov 03 '21 at 10:41