Questions tagged [optuna]

Optuna is a hyperparameter optimization framework for Python (versions 2.7 and 3.*) that helps find the best parameters for machine learning models by checking various combinations of parameter values. Site: https://optuna.org



191 questions
3
votes
1 answer

how to fit learning rate with pruning?

The background for the question is optimizing hyper params of neural network training by running study.optimize() with default pruning enabled and learning rate as parameter to optimize (this question can be generalized to other hyperparams). high…
2
votes
1 answer

Resume Optuna study from most recent checkpoints

Is there a way to be able to pause/kill the optuna study, then resume it either by running the incomplete trials from the beginning, or resuming the incomplete trials from the latest checkpoint? study =…
gameveloster
2
votes
0 answers

optuna threaded parallelization n_jobs=-1 not using all resources

Simple question, I can find others with similar issues but no real solutions. TLDR: n_jobs=-1 enables simultaneous execution of trials, but no extra cores are utilized on my CPU and total runtime is slightly longer than with n_jobs=1. Why is it that…
XiB
2
votes
1 answer

Optuna: What is the cause of "The reported value is ignored because this `step` {} is already reported."

I started using Optuna and on the second or third round my console is spammed by messages like this: UserWarning: The reported value is ignored because this `step` 438 is already reported. There were about 2400 of these, starting at "step 1" and…
David R
2
votes
1 answer

Optuna pruning for validation loss

I introduced the following lines in my deep learning project in order to early stop when the validation loss has not improved for 10 epochs: if best_valid_loss is None or valid_loss < best_valid_loss: best_valid_loss = valid_loss counter = 0…
Panertoĸ
2
votes
2 answers

Multiple trainings / multiple NN initialisations per hyperparameter validation with Optuna and pruning

I am just doing my first ML-with-optuna project. My question is how can I probe one set of hyperparameters for multiple NN initializations, where each run within one trial is still subject to pruning? I am assuming that the initialization has quite…
Osmosis D. Jones
2
votes
2 answers

How to manually terminate an Optuna trial due to an invalid parameter subspace?

When tuning parameters in Optuna, I have an invalid subspace in my space of possible parameters. In my particular case, two of the parameters that I'm tuning can cause extremely long trials (that I want to avoid) if they are both close to zero (<…
DaBigJoe
2
votes
1 answer

Understanding Intermediate Values and Pruning in Optuna

I am just curious for more information on what an intermediate step actually is and how to use pruning if you're using a different ML library that isn't in the tutorial section, e.g. XGBoost, PyTorch, etc. For example: X, y =…
2
votes
3 answers

What is the canonical way to set fixed parameters and retrieve them after a study is complete?

Optuna allows users to search a parameter space using the suggest_ API. This is easy and clever enough. However, there are some parameters I would like to remain fixed. For example, with Scikit-Learn's DBSCAN implementation: Search using suggest_…
2
votes
4 answers

suggest_int() missing 1 required positional argument: 'high' error on Optuna

I have the following code of Optuna to do the hyperparameter tunning for a Xgboost classifier. import optuna from optuna import Trial, visualization from optuna.samplers import TPESampler from xgboost import XGBClassifier def objective(trial:…
Yifan Lyu
2
votes
1 answer

Limit max number of parallel processes in Optuna

How to limit the max number of parallel processes when running hyper-parameter search in Optuna?
Gal Hyams
2
votes
4 answers

Turning off Warning in Optuna Training

I fully realize that I will likely be embarrassed for missing something obvious, but this has me stumped. I am tuning a LGBM model using Optuna, and my notebook gets flooded with warning messages; how can I suppress them while leaving errors (and ideally…
Ncosgove
2
votes
1 answer

Best parameters of an Optuna multi-objective optimization

When performing a single-objective optimization with Optuna, the best parameters of the study are accessible using: import optuna def objective(trial): x = trial.suggest_uniform('x', -10, 10) return (x - 2) ** 2 study =…
Dr. Paprika
2
votes
1 answer

What happens when I add/remove parameters dynamically during an Optuna study?

Optuna's FAQ has a clear answer when it comes to dynamically adjusting the range of parameter during a study: it poses no problem since each sampler is defined individually. But what about adding and/or removing parameters? Is Optuna able to handle…
gosuto
2
votes
0 answers

Optimizing filter sizes of CNN with Optuna

I have created a CNN for classification of three classes based on input images of size 39 x 39. I'm optimizing the parameters of the network using Optuna. For Optuna I'm defining the following parameters to optimize: num_blocks =…
machinery