
My trainX has shape (3350, 1, 8) and my trainY has shape (3350, 2). I am getting an error I don't understand when tuning the hyperparameters of a model with an LSTM layer:

--------------------------------------------------------------------------- InvalidArgumentError                      Traceback (most recent call last) /tmp/ipykernel_4525/3330873115.py in <module>
----> 1 bayesian_opt_tuner.search(trainX, trainY,epochs=100,
      2      #validation_data=(X_test, y_test)
      3      validation_split=0.2,verbose=1)

~/anaconda3/lib/python3.9/site-packages/keras_tuner/engine/base_tuner.py in search(self, *fit_args, **fit_kwargs)
    181 
    182             self.on_trial_begin(trial)
--> 183             results = self.run_trial(trial, *fit_args, **fit_kwargs)
    184             # `results` is None indicates user updated oracle in `run_trial()`.
    185             if result is None:

~/anaconda3/lib/python3.9/site-packages/keras_tuner/engine/tuner.py in run_trial(self, trial, *args, **kwargs)
    293             callbacks.append(model_checkpoint)
    294             copied_kwargs["callbacks"] = callbacks
--> 295             obj_value = self._build_and_fit_model(trial, *args, **copied_kwargs)
    296 
    297             histories.append(obj_value)

~/anaconda3/lib/python3.9/site-packages/keras_tuner/engine/tuner.py in
_build_and_fit_model(self, trial, *args, **kwargs)
    220         hp = trial.hyperparameters
    221         model = self._try_build(hp)
--> 222         results = self.hypermodel.fit(hp, model, *args, **kwargs) ...
    File "/home/vareeshadki/anaconda3/lib/python3.9/site-packages/keras/backend.py", line 5238, in sparse_categorical_crossentropy
      res = cf.nn.sparse_softmax_cross_entropy_with_logits( Node: 'sparse_categorical_crossentropy/SparseSoftmaxCrossEntropyWithLogits/SparseSoftmaxCrossEntropyWithLogits' logits and labels must have the same first dimension, got logits shape [32,2] and labels shape [64]      [[{{node sparse_categorical_crossentropy/SparseSoftmaxCrossEntropyWithLogits/SparseSoftmaxCrossEntropyWithLogits}}]] [Op:__inference_train_function_1763748]
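The last line of the traceback already contains the diagnosis: `SparseCategoricalCrossentropy` expects one integer class id per sample (labels of shape `(batch,)`), so it flattens the one-hot `(32, 2)` label batch into 64 entries and compares them against 32 rows of logits. A minimal NumPy sketch of that mismatch (the shapes come from the error message; this is an illustration, not the Keras internals):

```python
import numpy as np

rng = np.random.default_rng(0)
batch = 32
logits = rng.normal(size=(batch, 2))                  # Dense(2) output: shape (32, 2)
onehot_labels = np.eye(2)[rng.integers(0, 2, batch)]  # trainY-style rows: shape (32, 2)

# The sparse loss wants one integer class id per sample, so it flattens
# whatever it receives: (32, 2) -> 64 labels vs 32 logit rows.
flat = onehot_labels.ravel()
print(logits.shape[0], flat.shape[0])  # 32 vs 64 -> the mismatch in the traceback

# Converting one-hot rows to integer ids restores the expected shape:
int_labels = onehot_labels.argmax(axis=1)             # shape (32,)
print(int_labels.shape[0])                            # 32, matching the logits
```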

My code:

import os
import tensorflow as tf
from tensorflow.keras.layers import LSTM, Dense
from kerastuner.tuners import BayesianOptimization

def build_model(hp):
    model = tf.keras.Sequential()
    hp_units = hp.Int('units', min_value=32, max_value=512, step=10)
    model.add(LSTM(hp_units,activation='relu'))
    model.add(Dense(2))
    hp_learning_rate = hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=hp_learning_rate),
                loss=tf.keras.losses.SparseCategoricalCrossentropy(),
                metrics=['accuracy'])
    return model

bayesian_opt_tuner = BayesianOptimization(
    build_model,
    objective='mse',
    max_trials=3,
    executions_per_trial=2,
    directory=os.path.normpath('C:/keras_tuning'),
    project_name='kerastuner_bayesian_poc',
    overwrite=True)

bayesian_opt_tuner.search(trainX, trainY, epochs=100,
    # validation_data=(X_test, y_test)
    validation_split=0.2, verbose=1)
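Assuming `trainY` holds one-hot rows (its shape is `(3350, 2)`), the usual fix is either compiling with `tf.keras.losses.CategoricalCrossentropy(from_logits=True)` and passing the labels as-is, or keeping the sparse loss and passing `trainY.argmax(axis=1)`. The `from_logits=True` part matters because `Dense(2)` has no softmax activation. A NumPy sketch showing both options compute the same loss once the label format matches the loss (`categorical_xent` and `sparse_categorical_xent` are illustrative stand-ins, not the Keras implementations):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def categorical_xent(onehot, logits):
    # pairs one-hot labels (batch, 2) row-for-row with logits (batch, 2)
    p = softmax(logits)
    return -(onehot * np.log(p)).sum(axis=1).mean()

def sparse_categorical_xent(ids, logits):
    # expects integer class ids of shape (batch,)
    p = softmax(logits)
    return -np.log(p[np.arange(len(ids)), ids]).mean()

rng = np.random.default_rng(1)
logits = rng.normal(size=(32, 2))
ids = rng.integers(0, 2, 32)
onehot = np.eye(2)[ids]

# Both loss/label pairings give the same value:
assert np.isclose(categorical_xent(onehot, logits),
                  sparse_categorical_xent(ids, logits))
```

Separately, `objective='mse'` in the tuner likely also needs fixing: the model only tracks `loss` and `accuracy`, so an objective such as `'val_loss'` or `'val_accuracy'` would match what is actually logged.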
Flavia Giammarino