After reading the docs and tutorials, it seems very easy to define hyperparameters for the model itself. That covers the code that builds the model out of layers, as well as compile-time settings such as the learning rate. What I am looking for is a way to run a hyperparameter search over parameters that are *not* part of the model. Some examples:
- Data augmentation, when it is built into a tf.data pipeline, e.g. the amount of random translation.
- Over/undersampling. This is often used to handle imbalanced classes, and one way to do it is tf.data.Dataset.sample_from_datasets; the "weights" argument of that method is a natural hyperparameter.
- Number of epochs. Maybe I am missing something, but this seems like the most obvious thing for keras_tuner to support directly. A workaround is to use schedule callbacks and set this up at compile time, but that is not the same as tuning it.
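To make the sampling point concrete, here is a toy plain-Python analogue (not the tf.data API) of what sample_from_datasets does with its weights argument: each draw picks a source stream according to the weights, so with an imbalanced dataset the minority-class weight directly controls how much it is oversampled. That weight is exactly the kind of knob I would want the tuner to search over.

```python
import random

def sample_from_streams(streams, weights, n, seed=0):
    """Toy analogue of tf.data.Dataset.sample_from_datasets:
    draw n examples, choosing the source stream for each draw
    according to `weights`."""
    rng = random.Random(seed)
    iters = [iter(s) for s in streams]
    out = []
    for _ in range(n):
        i = rng.choices(range(len(streams)), weights=weights)[0]
        out.append(next(iters[i]))
    return out

# Majority class labeled 0, minority class labeled 1.
majority = iter(lambda: 0, None)  # infinite stream of 0s
minority = iter(lambda: 1, None)  # infinite stream of 1s

# weights=[0.5, 0.5] oversamples the minority class to ~50% of draws.
batch = sample_from_streams([majority, minority], weights=[0.5, 0.5], n=1000)
print(sum(batch) / len(batch))  # fraction of minority examples drawn
```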
Is the tuner library missing all of these? They seem like common things you would want to tune.
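For illustration, this is the kind of hand-rolled search loop I end up writing for these non-model parameters, and which I would expect the tuner to replace. Everything here is a sketch: `train_and_score` is a stand-in for a real training run, and the parameter names and ranges are made up for the example.

```python
import random

def train_and_score(translation, minority_weight, epochs):
    """Stand-in for a real training run; returns a fake validation
    score that peaks at translation=0.1, minority_weight=0.3, epochs=20."""
    return -((translation - 0.1) ** 2
             + (minority_weight - 0.3) ** 2
             + ((epochs - 20) / 20) ** 2)

def random_search(trials=50, seed=0):
    """Plain random search over non-model hyperparameters."""
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        params = {
            "translation": rng.uniform(0.0, 0.3),      # augmentation amount
            "minority_weight": rng.uniform(0.1, 0.5),  # sampling weight
            "epochs": rng.choice([10, 20, 30, 40]),    # training length
        }
        score = train_and_score(**params)
        if best is None or score > best[0]:
            best = (score, params)
    return best

score, params = random_search()
print(params)
```

The point is that none of these parameters live inside the model-building code, so a `build(hp)`-style hook alone does not obviously cover them.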