Questions tagged [optuna]

Optuna is a hyperparameter optimization framework for Python (versions 2.7 and 3.*) that helps find the best parameters for machine-learning models by evaluating many combinations of parameter values. Site: https://optuna.org

191 questions
1 vote • 0 answers

How do we optimize XGBoost hyperparameters using optuna without using the booster object?

I am currently working on using XGBoost for prediction. I wish to know which group of hyperparameters would provide the best results. I have used optuna for the same but the prediction results seem to be out of line. def…
Nilanjan • 15 • 5
1 vote • 1 answer

Why is Optuna stuck at trial 2 (trial_id=3) after it has calculated all hyperparameters?

I am using Optuna to tune an XGBoost model's hyperparameters. I find it stuck at trial 2 (trial_id=3) for a long time (244 minutes). But when I look at the SQLite database that records the trial data, I find all the trial 2 (trial_id=3) hyperparameters…
1 vote • 1 answer

HyperOpt multi-metric evaluation

Does anyone know if it is possible to somehow calculate metrics other than accuracy in HyperOpt? I would also like it to display F1, precision, and recall. Is there an option to do this? If so, could someone please explain it to me. def…
1 vote • 2 answers

A question about "n_trials" in Optuna

I'm trying to use Optuna to tune the hyperparameters of XGBoost, but because of a memory restriction I can't set n_trials too high, otherwise it reports a MemoryError. So I'm wondering: if I set n_trials=5 and run the program for 4…
1 vote • 0 answers

How can I integrate Optuna with Deepspeech training?

I'm trying to integrate Optuna with DeepSpeech in order to optimise some of its hyperparameters. I'm sticking to learning rate for now, just to get a feel for how Optuna works, but I've hit a roadblock and need some help. I have a function hps_train…
1 vote • 1 answer

How to set n_trials for multiple processes when using parallelization?

When I execute the code without parallel computation, n_trials in the optimize function means how many trials the program runs. When executed via parallel computation (following the tutorial here, launching it again in another console), it does…
Aurelie Navir • 938 • 2 • 7 • 17
1 vote • 2 answers

How to sample an Optuna parameter from a weighted categorical distribution?

I have a very complex tree structured search space. At the top level I need to make a categorical choice - which subspace of parameters to explore. As a simple example, you can imagine that I need to decide between using a linear regression, an SVM,…
iga • 3,571 • 1 • 12 • 22
1 vote • 0 answers

Memory leak that persists after a Colab cell executes

I'm encountering a subtle memory leak and am unable to determine the source using tracemalloc. I run the following code in Google Colab, which is meant to optimize hyperparameters for a custom PPO agent. Also, the speed at which the leak happens…
watch-this • 1 • 4 • 20
1 vote • 1 answer

Correct way to optimise hyperparameters based on accuracy

I am using Optuna + CatBoost to optimise and train some boosted trees. I would like to know the correct way to optimise my hyperparameters to maximise accuracy (rather than minimise log-loss). At the moment my code is: def objective(trial): …
Isaac • 61 • 2
1 vote • 0 answers

optuna.integration.lightGBM custom optimization metric

I am trying to optimize a lightGBM model using optuna. Reading the docs I noticed that there are two approaches that can be used, as mentioned here: LightGBM Tuner: New Optuna Integration for Hyperparameter Optimization. The first approach uses the…
Mattia Surricchio • 1,362 • 2 • 21 • 49
1 vote • 1 answer

How to set a minimum number of epoch in Optuna SuccessiveHalvingPruner()?

I'm using Optuna 2.5 to optimize a couple of hyperparameters on a tf.keras CNN model. I want to use pruning so that the optimization skips the less promising corners of the hyperparameter space. I'm using something like this: study0 =…
1 vote • 1 answer

Specify fixed parameters and parameters to be searched in Optuna (LightGBM)

I just found Optuna and it seems it is integrated with LightGBM, but I struggle to see where I can fix parameters, e.g. scoring="auc", and where I can define a grid space to search, e.g. num_leaves=[1,2,5,10]. Using…
CutePoison • 4,679 • 5 • 28 • 63
1 vote • 3 answers

Optuna - Memory Issues

I am trying to free memory in between Optuna optimization runs. I am using Python 3.8 and the latest version of Optuna. What happens is I run the commands: optuna.create_study(), then I call optuna.optimize(...) in a loop, with a new objective…
user14572001 • 11 • 1 • 2
1 vote • 0 answers

Jointly optimizing autoencoder and fully connected network for classification

I have a large set of unlabeled data and a smaller set of labeled data. Thus, I would like to first train a variational autoencoder on the unlabeled data and then use the encoder for classification of three classes (with a fully connected layer…
machinery • 5,972 • 12 • 67 • 118
1 vote • 0 answers

CoNLL files in hyperparameter tuning using Optuna

I've been trying to work out how to optimize the hyperparameters in a Bi-LSTM model for PoS and dependency parsing (https://github.com/datquocnguyen/jPTDP). The model takes CoNLL-U files as input and I am clueless as to how I can go about using them…