
I'm using Keras Tuner similarly to the end-to-end example here: https://keras-team.github.io/keras-tuner/tutorials/subclass-tuner/

Some of the models that are generated are much larger than others, and the larger ones lead to an OOM error that stops the tuning.

I'm aware that I can set a max_model_size (https://github.com/keras-team/keras-tuner/issues/175). But instead of limiting the model size, is it possible for the tuner to select a maximum batch size depending on the size of the model, or to skip batch sizes that are too large for memory?
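One direction (not built into keras-tuner; the sizing heuristic below is a hypothetical illustration, with made-up constants) is to estimate a model's memory footprint from its parameter count and compute a cap on the batch size before training. A custom tuner could then skip any candidate batch size above this cap instead of hitting an OOM:

```python
def max_batch_size(param_count, memory_budget_bytes,
                   bytes_per_param=4, activation_bytes_per_sample=4_000_000):
    """Rough cap on batch size for a model with `param_count` parameters.

    Assumes float32 weights (4 bytes each), roughly 3x the weight memory
    for gradients and optimizer state, and a fixed per-sample activation
    cost. All constants here are illustrative, not measured -- profile
    your own models to calibrate them.
    """
    weight_memory = param_count * bytes_per_param * 3
    remaining = memory_budget_bytes - weight_memory
    if remaining <= 0:
        return 0  # the model alone exceeds the budget: skip this trial
    return max(1, remaining // activation_bytes_per_sample)
```

In a subclassed tuner's `run_trial`, you could call this with `model.count_params()` after building the model and drop (or clamp) the sampled batch size when it exceeds the returned cap.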

JJ101

1 Answer


I am not sure whether this works for batch size, but in general you can define a parent (hyper)parameter: the 'child' parameter is then only defined under selected parent values.

hp_number_of_layers = hp.Int('number_of_layers', min_value=4, max_value=10,
                             step=2, default=6)
hp_batch_size = hp.Int('batch_size', min_value=4, max_value=8, step=4,
                       default=4, parent_name='number_of_layers',
                       parent_values=[6, 8])
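To make the parent/child behavior concrete, here is a minimal plain-Python sketch (no keras-tuner required) of the semantics: `batch_size` is only active when `number_of_layers` is one of the `parent_values`. The names mirror the example above; the logic is an illustration of the concept, not keras-tuner's actual implementation.

```python
def active_hyperparameters(number_of_layers):
    """Return the hyperparameters active for a given parent value."""
    hps = {"number_of_layers": number_of_layers}
    # The child 'batch_size' is only defined under the selected
    # parent values [6, 8], mirroring parent_name/parent_values above.
    if number_of_layers in (6, 8):
        hps["batch_size"] = 4  # default from the example above
    return hps
```

For other parent values (e.g. 4 or 10), the child simply does not appear in the trial's hyperparameters.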
MarioT