
Is there any way I can save the full Keras model with the best parameters obtained using GridSearchCV?

I have the following Keras model:

from keras.models import Sequential
from keras.layers import Dense, Dropout

def create_model(init_mode='uniform'):
    n_x_new = train_selected_x.shape[1]

    model = Sequential()
    model.add(Dense(n_x_new, input_dim=n_x_new, kernel_initializer=init_mode, activation='sigmoid'))
    model.add(Dense(10, kernel_initializer=init_mode, activation='sigmoid'))
    model.add(Dropout(0.8))
    model.add(Dense(1, kernel_initializer=init_mode, activation='sigmoid'))

    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    return model

import numpy as np
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import GridSearchCV, PredefinedSplit

seed = 7
np.random.seed(seed)

model = KerasClassifier(build_fn=create_model, epochs=30, batch_size=400, verbose=1)

init_mode = ['uniform', 'lecun_uniform', 'normal', 'zero', 'glorot_normal', 'glorot_uniform', 'he_normal', 'he_uniform']
param_grid = dict(init_mode=init_mode)
grid = GridSearchCV(estimator=model, param_grid=param_grid, scoring='roc_auc', cv=PredefinedSplit(test_fold=my_test_fold), n_jobs=1)
grid_result = grid.fit(np.concatenate((train_selected_x, test_selected_x), axis=0), np.concatenate((train_selected_y, test_selected_y), axis=0))



print("Best: %f using %s" % (grid_result.best_score_, grid_result.best_params_))
means = grid_result.cv_results_['mean_test_score']
stds = grid_result.cv_results_['std_test_score']
params = grid_result.cv_results_['params']
for mean, stdev, param in zip(means, stds, params):
    print("%f (%f) with: %r" % (mean, stdev, param))

I have learned that I can use a callback with the ModelCheckpoint method, but I don't know where the required code belongs in my original code.

The code I came across while researching is as follows:

filepath = "weights.best.hdf5"
checkpoint = ModelCheckpoint(filepath, monitor='val_acc', verbose=1, save_best_only=True, mode='max')
callbacks_list = [checkpoint]
Julien Rousé
Stupid420

1 Answer


After looking around, it seems it has to be as simple as

classifier = KerasClassifier(build_fn=DNN, nb_epoch=32, batch_size=8, callbacks=[your_callback], verbose=1)

But that doesn't seem to work either. A possible workaround comes from an answer to "Can I send callbacks to a KerasClassifier?", which should help.
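One way that workaround can look: pass the callbacks at fit time instead of at construction time, since extra keyword arguments to GridSearchCV.fit are forwarded to the wrapped estimator's fit and on to the underlying model.fit. A minimal, hypothetical sketch (tiny stand-in network and random data, not the question's setup):

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import ModelCheckpoint
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import GridSearchCV

# Hypothetical stand-in for the question's create_model.
def build_model(init_mode='uniform'):
    model = Sequential()
    model.add(Dense(4, input_dim=3, kernel_initializer=init_mode, activation='sigmoid'))
    model.add(Dense(1, kernel_initializer=init_mode, activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    return model

X = np.random.rand(40, 3)
y = np.random.randint(0, 2, 40)

# Monitoring training loss here because no validation split is configured;
# with validation data you could monitor 'val_acc' as in the question.
checkpoint = ModelCheckpoint('weights.best.hdf5', monitor='loss',
                             verbose=0, save_best_only=True, mode='min')

clf = KerasClassifier(build_fn=build_model, epochs=2, batch_size=8, verbose=0)
grid = GridSearchCV(estimator=clf,
                    param_grid={'init_mode': ['uniform', 'normal']}, cv=2)

# Extra keyword arguments to GridSearchCV.fit are forwarded to each
# KerasClassifier.fit call, which passes them on to model.fit.
grid_result = grid.fit(X, y, callbacks=[checkpoint])
```

Passing the callback here, rather than in the KerasClassifier constructor, avoids the deep-copy problem that scikit-learn's estimator cloning causes for callback objects.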

This is a consequence of combining general-purpose tools that weren't specifically designed to work together in every possible configuration.

Also, you can refer to this issue: How to pass callbacks to scikit_learn wrappers (e.g. KerasClassifier) #4278
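Separately from checkpointing during training, the original question (saving the full model with the best grid parameters) can also be handled after the search finishes: with refit=True (the default), GridSearchCV retrains the best configuration on the whole dataset and exposes it as best_estimator_, whose .model attribute is a plain Keras model that can be saved. A minimal, hypothetical sketch (tiny stand-in network and random data, not the question's setup):

```python
import numpy as np
from keras.models import Sequential, load_model
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import GridSearchCV

# Hypothetical stand-in for the question's create_model.
def build_model(init_mode='uniform'):
    model = Sequential()
    model.add(Dense(4, input_dim=3, kernel_initializer=init_mode, activation='sigmoid'))
    model.add(Dense(1, kernel_initializer=init_mode, activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    return model

X = np.random.rand(40, 3)
y = np.random.randint(0, 2, 40)

clf = KerasClassifier(build_fn=build_model, epochs=1, batch_size=8, verbose=0)
grid = GridSearchCV(estimator=clf,
                    param_grid={'init_mode': ['uniform', 'normal']}, cv=2)
grid_result = grid.fit(X, y)

# With refit=True (the default), GridSearchCV has retrained the best
# parameter combination on the full data; the wrapper's .model attribute
# holds the underlying Keras model, which can be saved in full.
grid_result.best_estimator_.model.save('best_model.h5')

# Reload later without rebuilding the architecture by hand.
restored = load_model('best_model.h5')
```

The saved HDF5 file contains the architecture, weights, and optimizer state together, so load_model restores the whole thing in one call.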

Hope it helps!

Vivek