Questions tagged [keras-tuner]

166 questions
3
votes
1 answer

Keras Hypermodel - build with default parameters

I implemented Hyperparameter Tuning with KerasTuner. I would like to have the option to skip the Hyperparameter Tuning and use the default values instead. It looks like this now (which builds the model with the best parameters after the…
Hauke
  • 53
  • 4
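A minimal sketch of the pattern usually suggested for skipping the search: pass an empty keras_tuner.HyperParameters() container to the build function, so every hp.* call falls back to its default. The build_model below is a hypothetical stand-in, not the asker's code.

```python
import keras_tuner as kt
from tensorflow import keras

def build_model(hp):
    # Each hp.* call returns its `default` when no tuner has assigned a value.
    units = hp.Int("units", min_value=32, max_value=256, step=32, default=64)
    lr = hp.Choice("learning_rate", [1e-2, 1e-3, 1e-4], default=1e-3)
    model = keras.Sequential([
        keras.layers.Dense(units, activation="relu"),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer=keras.optimizers.Adam(lr), loss="mse")
    return model

# Skip tuning entirely: an empty HyperParameters container yields the defaults.
default_model = build_model(kt.HyperParameters())
```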
3
votes
2 answers

How to solve AttributeError: module 'tensorflow._api.v2.distribute' has no attribute 'TPUStrategy'

I am working with Keras and TensorFlow to create a predictor model. I only have a CPU device and I can't execute my code. The code only uses Keras and KerasTuner to search hyperparameters. This is the error trace: File "file.py", line 537,…
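tf.distribute.TPUStrategy only exists under that name in newer TensorFlow 2.x releases; on older ones it lives in the experimental namespace, which is consistent with the AttributeError above. A quick check that needs no TPU:

```python
import tensorflow as tf

# Inspect what the installed TensorFlow actually exposes before deciding how
# (or whether) a TPU strategy can be constructed on this machine.
print(tf.__version__)
print(hasattr(tf.distribute, "TPUStrategy"))               # True on newer TF 2.x
print(hasattr(tf.distribute.experimental, "TPUStrategy"))  # older location
```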
3
votes
3 answers

Using Keras Tuner with TensorFlow 2, I am getting an error: division by zero

I am experimenting with kerastuner. Here is my code with a reproducible example: import kerastuner as kt from kerastuner.tuners.bayesian import BayesianOptimization (x_train, y_train), (x_test, y_test) =…
user8270077
  • 4,621
  • 17
  • 75
  • 140
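Not a diagnosis of the division-by-zero itself, but a minimal BayesianOptimization setup with the current package naming (the standalone kerastuner package has since been renamed keras_tuner); the MNIST data and toy model are assumptions made to keep the example runnable.

```python
import keras_tuner as kt
from tensorflow import keras

def build_model(hp):
    model = keras.Sequential([
        keras.layers.Flatten(),
        keras.layers.Dense(hp.Int("units", 32, 128, step=32), activation="relu"),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()

tuner = kt.BayesianOptimization(
    build_model,
    objective="val_accuracy",
    max_trials=5,
    num_initial_points=2,  # random warm-up trials before the Gaussian process takes over
    overwrite=True,
)
tuner.search(x_train, y_train, epochs=1, validation_data=(x_test, y_test))
```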
2
votes
0 answers

keras_tuner.RandomSearch: ValueError: Received incompatible tensor with shape (352,) when attempting to restore variable with shape (64,)

I am trying to build an autoencoder using HyperModel to later perform hyperparameter tuning using RandomSearch. After training the autoencoder with my data (2 matrices of shape (600, 411001) where each row is a sample) I get the error message…
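A minimal sketch of the HyperModel-plus-RandomSearch wiring described above (the layer sizes are assumptions). One common source of this kind of "incompatible tensor shape" restore error is reusing a tuning directory created with an earlier model definition, so overwrite=True is shown explicitly.

```python
import keras_tuner as kt
from tensorflow import keras

class AutoencoderHyperModel(kt.HyperModel):
    """Hypothetical autoencoder hypermodel; input_dim stands in for the data above."""

    def __init__(self, input_dim):
        super().__init__()
        self.input_dim = input_dim

    def build(self, hp):
        latent = hp.Int("latent_units", 32, 128, step=32)
        inputs = keras.Input(shape=(self.input_dim,))
        encoded = keras.layers.Dense(latent, activation="relu")(inputs)
        decoded = keras.layers.Dense(self.input_dim, activation="linear")(encoded)
        model = keras.Model(inputs, decoded)
        model.compile(optimizer="adam", loss="mse")
        return model

tuner = kt.RandomSearch(
    AutoencoderHyperModel(input_dim=411001),
    objective="val_loss",
    max_trials=3,
    overwrite=True,  # start from a clean directory so stale checkpoints are not restored
)
```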
2
votes
1 answer

Save keras-tuner results as a pandas DataFrame

Is there a possibility of saving the results of keras-tuner as a DataFrame? All I can find are printing functions like results_summary(), but I cannot access the printed content. The examples below both print None, while the results_summary()…
Enes
  • 33
  • 1
  • 4
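A sketch of one way to do this (assuming `tuner` has already finished its search): the Oracle's trials carry the hyperparameter values and scores, so they can be collected into a DataFrame instead of reading the printed results_summary().

```python
import pandas as pd

# Each Trial exposes its hyperparameter dict, score and id; flatten them into rows.
trials = tuner.oracle.get_best_trials(num_trials=25)
df = pd.DataFrame(
    [{**t.hyperparameters.values, "score": t.score, "trial_id": t.trial_id} for t in trials]
)
print(df.sort_values("score"))
```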
2
votes
1 answer

Keras Tuner: select number of units conditional on number of layers

I am using Keras Tuner to tune the hyperparameters of my neural network. I want to search the optimal number of hidden layers and the optimal number of units in each layer. To avoid overparametrizing the model, I want to impose the following…
NC520
  • 346
  • 3
  • 13
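The exact constraint is cut off above, so the sketch below assumes a common one: no hidden layer wider than the one before it. Clipping the sampled value keeps the search space identical across trials while still enforcing the rule in the built model.

```python
import keras_tuner as kt
from tensorflow import keras

def build_model(hp):
    model = keras.Sequential()
    prev_units = 512  # upper bound for the first hidden layer (assumption)
    for i in range(hp.Int("num_layers", 1, 4)):
        # Clip rather than re-parameterize, so `units_i` always has the same range.
        units = min(hp.Int(f"units_{i}", 32, 512, step=32), prev_units)
        model.add(keras.layers.Dense(units, activation="relu"))
        prev_units = units
    model.add(keras.layers.Dense(1))
    model.compile(optimizer="adam", loss="mse")
    return model
```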
2
votes
1 answer

Hyperparameter tuning with Keras Tuner for a classification problem

I am trying to implement both a classification problem and a regression problem with Keras Tuner. Here is my code for the regression problem: def build_model(hp): model = keras.Sequential() for i in range(hp.Int('num_layers', 2,…
Rezuana Haque
  • 608
  • 4
  • 14
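For moving from the regression build_model in the excerpt to a classification one, the usual changes are the output layer, the loss and the reported metric; a sketch (the 10-class softmax head and the learning-rate choices are assumptions):

```python
import keras_tuner as kt
from tensorflow import keras

def build_classification_model(hp):
    model = keras.Sequential()
    for i in range(hp.Int("num_layers", 2, 6)):
        model.add(keras.layers.Dense(hp.Int(f"units_{i}", 32, 256, step=32),
                                     activation="relu"))
    # Classification head: softmax over classes instead of a linear regression output.
    model.add(keras.layers.Dense(10, activation="softmax"))
    model.compile(
        optimizer=keras.optimizers.Adam(hp.Choice("learning_rate", [1e-2, 1e-3, 1e-4])),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model
```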
2
votes
1 answer

What's the difference between using get_best_hyperparameters() to generate the model and using get_best_models()?

After searching hyperparameters, I tried two ways to get the best model. One way is using tuner.get_best_hyperparameters() to generate the model as shown in code snippet "A". Another is using tuner.get_best_models() directly as shown in code snippet…
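The two snippets referred to above boil down to this (a sketch that assumes a finished `tuner`): rebuilding from the best hyperparameters gives a fresh, untrained model you can retrain at will, while get_best_models() reloads the checkpointed weights from the best trial.

```python
# "A": rebuild from the best hyperparameters and retrain from scratch.
best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]
model_a = tuner.hypermodel.build(best_hps)
# model_a.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=...)  # hypothetical data

# "B": take the best model as it was checkpointed during the search
# (no retraining, but only trained for as long as its trial ran).
model_b = tuner.get_best_models(num_models=1)[0]
```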
2
votes
0 answers

Why is the tuned number of hidden layers (showing 3) not the same as the number of units (showing 4 units) when tuning an ANN model using KerasTuner?

I am currently using KerasTuner to tune my Artificial Neural Network (ANN) deep learning model for a binary classification project (tabular dataset). Below is my function to build the model: def build_model(hp): # Create a Sequential…
2
votes
0 answers

Keras-Tuner default value meaning

Really simple question: when using keras-tuner to search for the best set of hyperparameters, there is a range of types to search over, so for simplicity let's say I'm using hp.Int(). I can set a minimum, a maximum and a default value for…
DPM
  • 845
  • 7
  • 33
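In short, `default` is the value an hp.Int() returns whenever the tuner has not assigned one for that name, for example when building outside a search or when a new hyperparameter first appears in the middle of one. A small sketch:

```python
import keras_tuner as kt

hp = kt.HyperParameters()
units = hp.Int("units", min_value=32, max_value=512, step=32, default=128)
print(units)  # 128: nothing has been tuned on this container, so the default is used
```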
2
votes
0 answers

Keras Tuner takes a long time between trials

I have recently noticed weird behavior of Keras Tuner. The elapsed time is around 6 hours but the algorithm has been running for around 20 hours. I have also noticed that there is a really long time between trials. I am using the code below: tuner_neurons =…
2
votes
0 answers

Why are the default values for the patience parameters in EarlyStopping and ReduceLROnPlateau so different?

There is a patience parameter in both EarlyStopping and ReduceLROnPlateau in Keras. However, the default patience for EarlyStopping is 0 while it is 10 for ReduceLROnPlateau. Assuming they both monitor val_loss by default, how ReduceLROnPlateau…
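Writing the library defaults out explicitly makes the asymmetry visible; the second pairing is just one commonly used combination, not a Keras recommendation.

```python
import tensorflow as tf

# Library defaults: EarlyStopping stops after `patience` epochs without improvement
# (default 0, i.e. the first non-improving epoch), while ReduceLROnPlateau waits
# 10 non-improving epochs before shrinking the learning rate by `factor`.
early_stop = tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=0)
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(monitor="val_loss", factor=0.1, patience=10)

# A common pairing gives ReduceLROnPlateau a chance to act before EarlyStopping fires:
early_stop = tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=15)
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(monitor="val_loss", factor=0.1, patience=5)
```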
2
votes
1 answer

How to fix "pop from empty list" error while using Keras tuner search method with TPU in google colab?

I was previously able to run the search method of Keras Tuner on my model with the GPU runtime of Google Colab, but when I switched to the TPU runtime I got the following error. I haven't been able to work out how to access a Google…
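Not a guaranteed fix for the "pop from empty list" error, but the standard Colab TPU wiring plus keras-tuner's distribution_strategy argument looks roughly like this (build_model here is a hypothetical stand-in for the asker's model function, and a TPU runtime is assumed to be attached):

```python
import tensorflow as tf
import keras_tuner as kt
from tensorflow import keras

def build_model(hp):  # hypothetical stand-in for the question's model function
    model = keras.Sequential([
        keras.layers.Dense(hp.Int("units", 32, 128, step=32), activation="relu"),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Standard Colab TPU initialization.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

# keras-tuner builds and trains each trial inside the given strategy's scope.
tuner = kt.RandomSearch(
    build_model,
    objective="val_accuracy",
    max_trials=5,
    distribution_strategy=strategy,
)
```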
2
votes
2 answers

How to pass fixed hyperparameters as variables for Keras-Tuner?

I'd like to do a hyperparameter-tuning on a Keras model with Keras tuner. import tensorflow as tf from tensorflow import keras import keras_tuner as kt def model_builder(hp): model = keras.Sequential() …
Fredrik
  • 411
  • 1
  • 3
  • 14
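One approach that is often suggested for this: hp.Fixed() registers a value the tuner never varies, and functools.partial lets you feed that value in from outside while keeping the (hp) -> model signature the tuner expects. A sketch with assumed names:

```python
from functools import partial

import keras_tuner as kt
from tensorflow import keras

def model_builder(hp, fixed_units=None):
    model = keras.Sequential()
    # Either tune `units` or pin it with hp.Fixed so it still shows up in the results.
    if fixed_units is None:
        units = hp.Int("units", 32, 512, step=32)
    else:
        units = hp.Fixed("units", fixed_units)
    model.add(keras.layers.Dense(units, activation="relu"))
    model.add(keras.layers.Dense(1))
    model.compile(optimizer="adam", loss="mse")
    return model

tuner = kt.RandomSearch(
    partial(model_builder, fixed_units=128),  # fixed value supplied from outside
    objective="val_loss",
    max_trials=3,
    overwrite=True,
)
```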
2
votes
1 answer

Units present for non-existing layers in Keras Tuner results for a sequential model

I am using Keras Tuner for hyperparameter tuning of an ANN model. I am varying the number of layers in the model between 2 and 5, and the number of nodes between 10 and 30. I am using the random search tuner of Keras Tuner to select the best performing models. When I…
srinivas
  • 301
  • 1
  • 9
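This is expected KerasTuner behavior rather than a bug: once a per-layer units hyperparameter has been registered in any trial, it keeps receiving a value in every later trial, even when that trial's num_layers means the layer is never built. A sketch (assuming a finished `tuner` and hypothetical "units_i" naming) of filtering down to the active ones:

```python
best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]
num_layers = best_hps.get("num_layers")
# Keep only the unit counts for layers the winning architecture actually contains.
active_units = {f"units_{i}": best_hps.get(f"units_{i}") for i in range(num_layers)}
print(num_layers, active_units)
```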