
I am training my LSTM model using keras tuner. I am getting an error

Expected the return value of HyperModel.fit() to be a single float when objective is left unspecified. Received return value: <tensorflow.python.keras.callbacks.History object at 0x7fa3e80be710> of type <class 'tensorflow.python.keras.callbacks.History'>.

I am not familiar with this error and have searched a bit without success. I am also not very familiar with Keras Tuner.

My code is:

    from tensorflow import keras
    from tensorflow.keras import layers
    from tensorflow.keras.callbacks import EarlyStopping
    from keras_tuner import RandomSearch

    x_test = x_test[FeaturesSelected]
    totalColumns = len(FeaturesSelected)

    callback = EarlyStopping(monitor='val_loss', patience=3)

    def build_model(hp):
        model = keras.Sequential()
        model.add(layers.Flatten(input_shape=(totalColumns, 1)))
        for i in range(hp.Int('layers', 2, 8)):
            model.add(layers.Dense(units=hp.Int('units_' + str(i), 50, 100, step=10),
                                   activation=hp.Choice('act_' + str(i), ['relu', 'sigmoid'])))
        model.add(layers.Dense(10, activation='softmax'))
        model.compile(optimizer=keras.optimizers.Adam(hp.Choice('learning_rate', values=[1e-2, 1e-4])),
                      loss='mean_absolute_error')
        return model

    tuner = RandomSearch(build_model, max_trials=20, executions_per_trial=5)
    tuner.search(x_train, y_train, epochs=10, validation_data=(x_test, y_test), callbacks=[callback])

    best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]

    # Retrain the best model to find the best number of epochs
    best_model = tuner.hypermodel.build(best_hps)
    history = best_model.fit(x_train, y_train, epochs=50, validation_split=0.2)
    val_acc_per_epoch = history.history['val_accuracy']
    best_epoch = val_acc_per_epoch.index(max(val_acc_per_epoch)) + 1

    hypermodel = tuner.hypermodel.build(best_hps)
    hypermodel.fit(x_train, y_train, epochs=best_epoch, validation_split=0.2)

    test_pred = hypermodel.predict(x_test)
1 Answer


RandomSearch objects (and all other kerastuner.Tuner objects) use the History object returned by model.fit() to compare trial results. If no objective is specified for the RandomSearch object, the tuner assumes you specified the metric of interest via the metrics parameter of compile(). If you don't pass any metrics to compile() either, the tuner has no single float it can extract from the History object to rank trials, which produces the error above. To resolve this, specify metrics for the model and an objective for the RandomSearch tuner, which would look like this:

model.compile(optimizer=keras.optimizers.Adam(hp.Choice('learning_rate', values=[1e-2, 1e-4])), loss='mean_absolute_error', metrics=['accuracy'])

Then, to maximize accuracy on the validation data:

RandomSearch(build_model, max_trials=20, executions_per_trial=5, objective=kerastuner.Objective('val_accuracy', 'max'))

Or, to maximize accuracy during training:

RandomSearch(build_model, max_trials=20, executions_per_trial=5, objective=kerastuner.Objective('accuracy', 'max'))

The objective parameter can be a single metric or a list of metrics. If you provide a list of metrics, the tuner optimizes the sum of all the metrics provided.
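To make the "single float per trial" idea concrete, here is a plain-Python sketch of how a tuner can reduce one trial's metrics to a comparable score. Note that trial_score, the metric values, and the sign-flipping of 'max' metrics are illustrative inventions for this sketch, not KerasTuner's actual internals or API:

    # Sketch only: reduce a dict of metric results to one float so trials
    # can be ranked. 'max' metrics are negated so lower scores always win.
    def trial_score(metrics, objectives):
        score = 0.0
        for name, direction in objectives:
            value = metrics[name]
            score += -value if direction == 'max' else value
        return score

    objectives = [('val_accuracy', 'max'), ('val_loss', 'min')]
    trial_a = trial_score({'val_accuracy': 0.91, 'val_loss': 0.30}, objectives)
    trial_b = trial_score({'val_accuracy': 0.88, 'val_loss': 0.25}, objectives)
    best = 'trial_a' if trial_a < trial_b else 'trial_b'

This is why the tuner errors out when it gets a History object instead: without an objective (or compiled metrics) there is nothing it can fold into a score like this.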