While using the Keras Tuner library I get the error in the title. I have no idea how to fix it and have not been able to find a solution.
I am creating TensorFlow BatchDatasets and passing them to the hyperparameter search.
import keras_tuner
from tensorflow import keras

batch_size = 5

# Windowed tf.data datasets (sequence_length=1, no shuffling)
train_data = keras.utils.timeseries_dataset_from_array(
    x_train_scaled,
    y_train_scaled,
    sequence_length=1,
    batch_size=batch_size,
    shuffle=False
)
val_data = keras.utils.timeseries_dataset_from_array(
    x_val_scaled,
    y_val_scaled,
    sequence_length=1,
    batch_size=batch_size,
    shuffle=False
)
test_data = keras.utils.timeseries_dataset_from_array(
    x_test_scaled,
    y_test_scaled,
    sequence_length=1,
    batch_size=batch_size,
    shuffle=False
)
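# Sanity check (for reference only, not part of the pipeline): these are
# ordinary tf.data BatchDatasets, and the batch shapes look as expected;
# the exact shapes depend on my x/y arrays.
print(train_data.element_spec)
for x_batch, y_batch in train_data.take(1):
    print(x_batch.shape, y_batch.shape)  # roughly (batch_size, 1, n_features), (batch_size, ...)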
# design network
def build_model(hp):
    model = keras.Sequential()
    # All three LSTM layers reuse the hyperparameter name 'units',
    # so they share one tuned value per trial.
    model.add(keras.layers.LSTM(hp.Choice('units', [8, 16]), activation='relu',
                                return_sequences=True,
                                input_shape=(x_train.shape[1], x_train.shape[2])))
    model.add(keras.layers.LSTM(hp.Choice('units', [8, 16]), activation='relu', return_sequences=True))
    model.add(keras.layers.LSTM(hp.Choice('units', [8, 16]), activation='relu'))
    model.add(keras.layers.Dense(1))
    model.compile(loss='mae', optimizer='adam')
    return model

tuner = keras_tuner.RandomSearch(
    build_model,
    objective='val_loss',
    max_trials=100)

tuner.search(train_data, epochs=100, shuffle=False, validation_data=val_data)
best_model = tuner.get_best_models()[0]
The problem is not with the input shapes or anything like that: the same model trains fine when I fit it directly, without Keras Tuner.
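For reference, this is roughly the plain Keras fit that works with the exact same datasets (a sketch; the fixed value of 16 units just stands in for the tuned hyperparameter):

model = keras.Sequential([
    keras.layers.LSTM(16, activation='relu', return_sequences=True,
                      input_shape=(x_train.shape[1], x_train.shape[2])),
    keras.layers.LSTM(16, activation='relu', return_sequences=True),
    keras.layers.LSTM(16, activation='relu'),
    keras.layers.Dense(1),
])
model.compile(loss='mae', optimizer='adam')
model.fit(train_data, epochs=100, shuffle=False, validation_data=val_data)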
The full error is:
ValueError: The graph of the iterator is different from the graph the dataset: Tensor("PrefetchDataset:0", shape=(), dtype=variant) was created in. If you are using the Estimator API, make sure that no part of the dataset returned by the "input_fn" function is defined outside the "input_fn" function. Otherwise, make sure that the dataset is created in the same graph as the iterator.
And the line that triggers it is:
tuner.search(train_data, epochs=100, shuffle=False, validation_data=val_data)