
Is there any way to constrain the generated layers so that each previous layer always has more neurons than the next one? Here is my example code:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Flatten, InputLayer
from kerastuner.tuners import RandomSearch
from tensorflow.keras.optimizers import Adam

def build_model(hp):
    model = Sequential()
    # e_net_out_shape is defined elsewhere (the output shape of an upstream network)
    model.add(InputLayer(input_shape=e_net_out_shape))
    model.add(Flatten())

    # Add between 0 and 2 hidden layers, each with a tuned number of units
    for i in range(hp.Int('num_layers', 0, 2)):
        model.add(Dense(units=hp.Int('units_' + str(i), min_value=32, max_value=928, step=64), activation='relu'))

    model.add(Dense(39, activation='softmax'))
    model.compile(optimizer=Adam(learning_rate=0.01), loss='categorical_crossentropy', metrics=['acc'])
    return model

During a search with the RandomSearch tuner, a later layer (e.g. units_2) sometimes ends up with more neurons than an earlier one (e.g. units_1). So I'd like to constrain the max_value of each subsequent layer to the sampled value of the previous one.
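One possible workaround (not from the original post, just a sketch): KerasTuner fixes a hyperparameter's range the first time it is registered, so dynamically shrinking max_value from trial to trial may not behave as expected. An alternative is to sample each layer's units from the full range and then clamp the value to the previous layer's width when building the model. The snippet below illustrates that idea, reusing the names from the question (e_net_out_shape is assumed to be defined elsewhere, as in the original code):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Flatten, InputLayer
from tensorflow.keras.optimizers import Adam

def build_model(hp):
    model = Sequential()
    model.add(InputLayer(input_shape=e_net_out_shape))
    model.add(Flatten())

    prev_units = 928  # upper bound for the first hidden layer
    for i in range(hp.Int('num_layers', 0, 2)):
        units = hp.Int('units_' + str(i), min_value=32, max_value=928, step=64)
        # Clamp so each layer is never wider than the one before it.
        units = min(units, prev_units)
        model.add(Dense(units=units, activation='relu'))
        prev_units = units

    model.add(Dense(39, activation='softmax'))
    model.compile(optimizer=Adam(learning_rate=0.01),
                  loss='categorical_crossentropy', metrics=['acc'])
    return model

Note that the tuner still records the unclamped sampled values for units_i; only the built model respects the monotonic constraint, which may make the recorded hyperparameters slightly misleading when inspecting results.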

albert828
  • Please explain what `hp` is. If possible, replace it with something that is easy for others to reproduce, primarily because `hp` doesn't play any role in the problem you are trying to solve. – Akshay Sehgal Feb 09 '21 at 22:00
  • hp is a standard Keras Tuner argument for sampling parameters. [link](https://keras-team.github.io/keras-tuner/) – albert828 Feb 09 '21 at 22:05
  • Ah, makes sense. Please provide the imports to the above code as well to make it easy to reproduce. https://stackoverflow.com/help/minimal-reproducible-example – Akshay Sehgal Feb 09 '21 at 22:07

0 Answers