I am new to using Hyperas and I am facing a syntax problem. I want to optimize the hyperparameters of my LSTM in Keras, so I am using Hyperas to search over parameters like the dropout rate or the number of neurons, as well as to choose between different architectures. My code is the following:

def building_model(x_train, y_train, x_test, y_test):
    model = Sequential()
    model.add(LSTM({{choice([100, 200, 300, 400, 500])}}, input_shape=(48,  6), return_sequences=True))
    model.add({{choice([Dropout({{uniform(0, 1)}}), BatchNormalization()])}})
    model.add(LSTM({{choice([100, 200, 300, 400, 500])}}))
    model.add({{choice([Dropout({{uniform(0, 1)}}), BatchNormalization()])}})
    if {{choice(['three', 'four'])}} == 'four':
        model.add({{choice([Dense({{choice([100, 200, 300, 400, 500])}}), LSTM({{choice([100, 200, 300, 400, 500])}})])}})
        model.add({{choice([Dropout({{uniform(0, 1)}}), BatchNormalization()])}})
    model.add(Dense({{choice([100, 200, 300, 400, 500])}}))
    model.add(Dense(24, activation="linear"))
    model.compile(loss="mse", optimizer={{choice(['rmsprop', 'adam', 'sgd', 'nadam'])}}, metrics=['accuracy']) 
    result = model.fit(x_train, y_train, epochs=epochs, batch_size={{choice([12, 24, 64, 128])}}, validation_split=0.1, verbose=2, callbacks=[early_stopping, checkpointer], save_dir="saved_models")
    validation_acc = np.amax(result.history['val_acc']) 
    print('Best validation acc of epoch:', validation_acc)

    return {'accuracy': validation_acc, 'status': STATUS_OK, 'model': model}

And I am using the optim.minimize function to run it:

best_run, best_model = optim.minimize(model=building_model,
                                      data=data,
                                      algo=tpe.suggest,
                                      max_evals=10,
                                      trials=Trials())

But then I am facing an issue when Hyperas builds the model by itself; it generates the following code:

    model = Sequential()
    model.add(LSTM(space['LSTM'], input_shape=(48, 6), return_sequences=True))
    model.add(Dropout(space['Dropout']))
    model.add(LSTM(space['LSTM_1'], return_sequences=True))
    model.add(space['add']), BatchNormalization()])}})
    model.add(LSTM(space['LSTM_2']))
    model.add(space['add_1']), BatchNormalization()])}})
    if space['add_2'] == 'four':
        model.add(space['add_3']), LSTM(space['LSTM_3'])])}})
        model.add(space['add_4']), BatchNormalization()])}})
    model.add(Dense(space['LSTM_4']))
    model.add(Dense(24, activation="linear"))

with syntax errors on lines 5, 7, 9 and 10. Any idea on how to fix that?

ASUCB

1 Answer


I think the correct syntax for the Dropout layer is

model.add(Dropout({{uniform(0, 1)}}))
model.add(BatchNormalization())

Note that the layer type, like LSTM or Dropout, is on the outside. Hyperas cannot expand a `{{...}}` template that is nested inside another `{{...}}` template, which is why the generated code is left with dangling fragments like `, BatchNormalization()])}}`.
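Applying that rule to the branching choices in the question means sampling a plain label first and constructing the layer outside the template. Here is a minimal runnable sketch of the pattern in plain Python, with `random.choice`/`random.uniform` standing in for Hyperas's `{{choice(...)}}`/`{{uniform(...)}}` (the function and the plan tuples are illustrative, not Hyperas API):

```python
import random

def sample_architecture():
    """Sketch of the 'sample a label, then branch' pattern that avoids
    nesting one search-space template inside another. Each random.* call
    stands in for a single, flat Hyperas template."""
    plan = [('LSTM', random.choice([100, 200, 300, 400, 500]))]

    # One flat choice picks a label; the layer constructor stays outside it.
    if random.choice(['dropout', 'batchnorm']) == 'dropout':
        plan.append(('Dropout', random.uniform(0, 1)))
    else:
        plan.append(('BatchNormalization', None))

    # Optional extra layer: again, branch on a sampled label instead of
    # putting Dense(...) / LSTM(...) constructors inside a template.
    if random.choice(['three', 'four']) == 'four':
        if random.choice(['dense', 'lstm']) == 'dense':
            plan.append(('Dense', random.choice([100, 200, 300, 400, 500])))
        else:
            plan.append(('LSTM', random.choice([100, 200, 300, 400, 500])))
    return plan
```

In the real model function, each `random.choice([...])` would become its own `{{choice([...])}}` and each `random.uniform(0, 1)` its own `{{uniform(0, 1)}}`, so every template expands to a simple value (a string or a number) rather than a layer object.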

stormzhou