I am using Keras Tuner to optimize a CNN. I am able to get the optimal values for the filter and kernel size, but when I try to do the same for the pool size, it gives me an error:
File "C:\Users\...", line 180, in pp
-optimal learning rate for the optimizer is {best_hps.get('learning_rate')}.""")
File "C:\Users\anaconda3\envs\venv\lib\site-packages\keras_tuner\engine\hyperparameters\hyperparameters.py", line 241, in get
raise KeyError(f"{name} does not exist.")
KeyError: 'pool does not exist.'
My code is as follows:
def build_model(hp):
    model = Sequential()
    model.add(Conv2D(filters=hp.Int('conv_1_filter', min_value=32, max_value=128, step=16),
                     kernel_size=hp.Choice('conv_1_kernel', values=[2, 3, 5]),
                     strides=2,
                     input_shape=(num_rows, num_columns, num_channels),
                     activation='relu'))
    model.add(MaxPooling2D(pool_size=hp.Choice('pool', values=[2, 3])))
    model.add(Flatten())
    model.add(Dense(num_labels, activation='softmax'))
    # Display model architecture summary
    # model.summary()
    # Compile the model
    model.compile(loss='categorical_crossentropy',
                  metrics=['accuracy'],
                  optimizer=keras.optimizers.Adam(hp.Choice('learning_rate', values=[1e-2, 1e-3])))
    return model
from kerastuner import RandomSearch

tuner = RandomSearch(build_model,
                     objective='val_accuracy',
                     max_trials=5)
tuner.search(X_train, Y_train, epochs=3, validation_data=(X_train, Y_train), verbose=1)
# Get the optimal hyperparameters
best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]
print(f"""The hyperparameter search is complete. The parameters are as follows:
-optimal filter size for the first layer is {best_hps.get('conv_1_filter')}
-optimal pool size is {best_hps.get('pool')}
-optimal kernel size for the first layer is {best_hps.get('conv_1_kernel')}
-optimal learning rate for the optimizer is {best_hps.get('learning_rate')}.""")
Please help me to debug this error.
Thanks!
SOLUTION:
I added overwrite=True to the keras_tuner.RandomSearch() arguments. Without it, the tuner was reloading results from an earlier run whose search space did not yet include the 'pool' hyperparameter, so best_hps had no entry for it and get('pool') raised the KeyError.
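To see why stale results produce this exact KeyError, here is a toy sketch (a plain dict as a stand-in, not the real keras_tuner internals): the tuner persists trial results to disk, and without overwrite=True a new run reloads the old search space, which never defined 'pool'.

```python
# Toy model of stale tuner results: an earlier run only searched these
# hyperparameters, so a reloaded run knows nothing about 'pool'.
old_results = {'conv_1_filter': 64, 'conv_1_kernel': 3, 'learning_rate': 1e-3}

def get(hps, name):
    # Mimics HyperParameters.get(), which raises KeyError for unknown names.
    if name not in hps:
        raise KeyError(f"{name} does not exist.")
    return hps[name]

print(get(old_results, 'conv_1_kernel'))  # prints 3
try:
    get(old_results, 'pool')
except KeyError as err:
    print(err)  # prints 'pool does not exist.'
```

With overwrite=True, the tuner discards the cached run and re-searches with the current space, so 'pool' shows up in best_hps as expected.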