Questions tagged [optuna]

Optuna is a hyperparameter optimization framework for Python (versions 2.7 and 3.*) that helps find the best parameters for machine learning models by checking various combinations of parameter values. Site: https://optuna.org

191 questions
2
votes
0 answers

Hyperparameter optimization in PyTorch (currently with sklearn GridSearchCV)

I use this (link) PyTorch tutorial and wish to add grid search functionality to it, sklearn.model_selection.GridSearchCV (link), in order to optimize the hyperparameters. I struggle to understand what X and Y in gs.fit(x,y) should be; per the…
BFH
  • 23
  • 1
  • 7
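GridSearchCV.fit expects the raw training arrays, not DataLoaders: X of shape (n_samples, n_features) and y of shape (n_samples,). A minimal sketch, assuming the PyTorch module is wrapped with skorch (an addition not in the question) so it behaves like a scikit-learn estimator:

```python
import numpy as np
import torch.nn as nn
from skorch import NeuralNetClassifier
from sklearn.model_selection import GridSearchCV

class MLP(nn.Module):
    def __init__(self, hidden=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(4, hidden), nn.ReLU(), nn.Linear(hidden, 3)
        )

    def forward(self, x):
        return self.net(x)

# Toy data standing in for the tutorial's dataset.
X = np.random.rand(100, 4).astype(np.float32)           # (n_samples, n_features)
y = np.random.randint(0, 3, size=100).astype(np.int64)  # (n_samples,)

net = NeuralNetClassifier(MLP, criterion=nn.CrossEntropyLoss, max_epochs=5, verbose=0)
gs = GridSearchCV(net, {"lr": [0.01, 0.1], "module__hidden": [8, 16]}, cv=3)
gs.fit(X, y)  # plain arrays, not DataLoaders
print(gs.best_params_)
```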
1
vote
1 answer

Reset pruned trials in Optuna study?

If I have a study where all the pruned trials need to be reset for some reason, is there a way to do this? Maybe something that might work: creating a copy of the current study where pruned trials have their state reset. Thank you for any…
gameveloster
  • 901
  • 1
  • 6
  • 18
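A minimal sketch of the copy-and-reset idea, assuming the study lives in a storage backend (the study names and storage URL below are placeholders): finished trials are copied over, and pruned trials are re-enqueued so the next optimize() call runs them again.

```python
import optuna
from optuna.trial import TrialState

old = optuna.load_study(study_name="my_study", storage="sqlite:///optuna.db")
new = optuna.create_study(study_name="my_study_reset", storage="sqlite:///optuna.db")

# Keep finished trials as-is.
for t in old.get_trials(deepcopy=True, states=(TrialState.COMPLETE,)):
    new.add_trial(t)

# Queue the parameters of every pruned trial for re-evaluation.
for t in old.get_trials(deepcopy=True, states=(TrialState.PRUNED,)):
    new.enqueue_trial(t.params)

# new.optimize(objective, n_trials=...) now evaluates the enqueued trials first.
```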
1
vote
1 answer

How to record each fold's validation loss during cross-validation in Optuna?

I am using Toshihiko Yanase's code for doing cross validation on my hyperparameter optimizer with Optuna. Here is the code that I am using: def objective(trial, train_loader, valid_loader): # Remove the following line. # train_loader,…
Gabi Gubu
  • 25
  • 3
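One hedged sketch for recording per-fold losses: attach them to the trial with set_user_attr, which Optuna persists alongside the trial (compute_fold_loss is a hypothetical stand-in for the user's training loop).

```python
import optuna

def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    fold_losses = []
    for fold in range(5):
        # Hypothetical helper: train on the fold's training split, return val loss.
        fold_losses.append(compute_fold_loss(fold, lr))
    trial.set_user_attr("fold_losses", fold_losses)  # stored with the trial
    return sum(fold_losses) / len(fold_losses)

# After study.optimize(...), each trial exposes its folds:
# study.trials[i].user_attrs["fold_losses"]
```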
1
vote
1 answer

How can I change an Optuna trial's result?

I'm using Optuna on a complex ML algorithm, where each trial takes around 3 to 4 days. After a couple of trials, I noticed that the values I was returning to Optuna were incorrect, but I do have the correct results in another file (saving as a…
jorgue
  • 13
  • 3
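Finished trials are immutable, but a corrected copy of the study can be assembled with create_trial/add_trial; a sketch, where corrected_value is a hypothetical lookup into the file holding the correct results:

```python
import optuna
from optuna.trial import TrialState

old = optuna.load_study(study_name="my_study", storage="sqlite:///optuna.db")
fixed = optuna.create_study(study_name="my_study_fixed", storage="sqlite:///optuna.db")

for t in old.trials:
    if t.state != TrialState.COMPLETE:
        continue
    fixed.add_trial(
        optuna.trial.create_trial(
            params=t.params,
            distributions=t.distributions,
            value=corrected_value(t),  # hypothetical: read the true result from disk
        )
    )
```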
1
vote
0 answers

Running multiple Ray Tune jobs in parallel using a search algorithm

I want to queue 200+ tuning jobs to my ray cluster, they each need to be guided by a search algorithm, as my actual objective function has 40+ parameters. I can do this for a single job like this: import ray from ray import tune from ray.tune import…
XiB
  • 620
  • 6
  • 19
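A sketch of the queuing pattern, assuming Ray 2.x's Tuner API: each job gets its own OptunaSearch instance, and the trials inside a job are distributed across the cluster (the two-parameter objective is a stand-in for the real 40-parameter one).

```python
from ray import tune
from ray.tune.search.optuna import OptunaSearch

def objective(config):
    # Stand-in objective; returning a dict reports the final metrics.
    return {"loss": (config["x"] - 3.0) ** 2 + config["y"] ** 2}

param_space = {"x": tune.uniform(-10, 10), "y": tune.uniform(-10, 10)}

for job_id in range(3):  # 200+ in the real setup
    tuner = tune.Tuner(
        objective,
        param_space=param_space,
        tune_config=tune.TuneConfig(
            search_alg=OptunaSearch(metric="loss", mode="min"),
            num_samples=20,
        ),
    )
    tuner.fit()  # blocks per job; trials within the job run in parallel
```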
1
vote
1 answer

How to make Optuna produce replicable results?

I'm using Optuna to tune LGBM. Random seeds had been set, but each time Optuna got a different set of best params. Here's my Optuna code: def get_hpo_params(opt_X_train, opt_X_val, opt_y_train, opt_y_val, n_trials=180, cat_features=""): def…
Cherry Wu
  • 3,844
  • 9
  • 43
  • 63
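The usual fix is to seed the sampler itself; a minimal sketch (full reproducibility also requires fixing the model's own seeds and running single-process, i.e. n_jobs=1):

```python
import optuna

# A seeded TPESampler makes the sequence of suggested params deterministic.
sampler = optuna.samplers.TPESampler(seed=42)
study = optuna.create_study(direction="minimize", sampler=sampler)
study.optimize(objective, n_trials=180, n_jobs=1)  # objective defined elsewhere
```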
1
vote
1 answer

How to tell Optuna which parameters to optimise

import optuna import xgboost as xgb import sklearn.datasets from sklearn.model_selection import cross_val_score def objective(trial: optuna.Trial, X, y) -> float: params = { "colsample_bytree": trial.suggest_float('colsample_bytree',…
gabboshow
  • 5,359
  • 12
  • 48
  • 98
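Optuna only searches parameters drawn through trial.suggest_*; anything passed as a plain constant stays fixed. A short sketch along the lines of the question's setup:

```python
import optuna
import xgboost as xgb
import sklearn.datasets
from sklearn.model_selection import cross_val_score

def objective(trial):
    X, y = sklearn.datasets.load_breast_cancer(return_X_y=True)
    params = {
        "colsample_bytree": trial.suggest_float("colsample_bytree", 0.2, 1.0),  # tuned
        "max_depth": trial.suggest_int("max_depth", 2, 10),                     # tuned
        "n_estimators": 100,  # plain constant: never searched
    }
    model = xgb.XGBClassifier(**params)
    return cross_val_score(model, X, y, cv=3).mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
```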
1
vote
1 answer

Optuna, change axes ratio in plots

I've been running some optimization with Optuna, and I'd like to produce plots with the same scale on both axes, but so far I've been unable to find out how. study = optuna.create_study(study_name=study_name, …
Carlo
  • 1,321
  • 12
  • 37
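Optuna's visualization functions return a plotly Figure, so the aspect ratio can be fixed after the fact; a sketch, assuming "x" and "y" are two parameter names from the study:

```python
import optuna

fig = optuna.visualization.plot_contour(study, params=["x", "y"])
fig.update_yaxes(scaleanchor="x", scaleratio=1)  # lock both axes to the same scale
fig.show()
```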
1
vote
1 answer

Is there a way for Optuna `suggest_categorical` to return multiple choices from a list?

I am using Optuna for hyperparameter optimization of my model, and I have a field where I want to test multiple combinations from a list. For example: I have ["lisa","adam","test"] and I want suggest_categorical to return not just one, but a random…
urboom
  • 37
  • 6
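suggest_categorical returns exactly one choice, so a common workaround is one boolean parameter per candidate, building the subset manually; a sketch (evaluate is a hypothetical scoring helper):

```python
import optuna

candidates = ["lisa", "adam", "test"]

def objective(trial):
    # One independent on/off decision per candidate instead of one categorical.
    chosen = [c for c in candidates
              if trial.suggest_categorical(f"use_{c}", [True, False])]
    return evaluate(chosen)  # hypothetical helper scoring the selected subset
```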
1
vote
0 answers

What might cause a GPU to hang/get stuck when using multiprocessing in Python?

TL;DR: using PyTorch with Optuna with multiprocessing done with Queue(), a GPU (out of 4) can hang. Probably not a deadlock. Any ideas? Normal version: I am using PyTorch in combination with Optuna (a hyperparameter optimization framework; basically…
Anon Name
  • 33
  • 3
1
vote
1 answer

Store user attributes in Optuna Sweeper plugin for Hydra

How can I store additional information in an optuna trial when using it via the Hydra sweep plugin? My use case is as follows: I want to optimize a bunch of hyperparameters. I am storing all reproducibility information of all experiments (i.e.,…
Michel Kok
  • 354
  • 1
  • 10
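One hedged workaround, assuming the sweeper is configured with a persistent storage and a known study name (both placeholders below): reopen the study with plain Optuna and attach the attributes there.

```python
import optuna

study = optuna.load_study(
    study_name="my_hydra_sweep",          # must match the sweeper's config
    storage="sqlite:///hydra_optuna.db",  # must match the sweeper's config
)
study.set_user_attr("git_commit", "abc123")  # study-level reproducibility info
```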
1
vote
1 answer

Optuna hyperparameter optimization of LightGBM model

I'm using Optuna to tune the hyperparameters of a LightGBM model. I suggested values for a few hyperparameters to optimize (using trial.suggest_int / trial.suggest_float / trial.suggest_loguniform). There are also some hyperparameters for which I…
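The usual pattern: merge the tuned suggestions with the fixed settings in one params dict; a sketch, assuming a binary task with dtrain/dvalid datasets built elsewhere:

```python
import optuna
import lightgbm as lgb

def objective(trial):
    params = {
        "objective": "binary",  # fixed: not suggested, never searched
        "verbosity": -1,        # fixed
        "num_leaves": trial.suggest_int("num_leaves", 8, 256),
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
    }
    booster = lgb.train(
        params,
        dtrain,                 # lgb.Dataset built elsewhere
        valid_sets=[dvalid],
        callbacks=[lgb.early_stopping(50)],
    )
    return booster.best_score["valid_0"]["binary_logloss"]
```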
1
vote
1 answer

keras MLP halts mid-training without killing the run

I am using optuna v.2.10.0 on a keras v.2.8.0 MLP using Python v.3.9.12 for macOS (tensorflow-metal v.0.4.0), using the GPU. At a random point during training of a trial, the progress just stops and GPU usage drops to nothing, but the program doesn't…
Jamie
  • 13
  • 6
1
vote
2 answers

Tuning (Optuna) RandomForest Model Gives "Returned nan" Result When Using class_weight Parameter

I want to tune my RF model using Optuna. The dataset is imbalanced, so I used the class_weight parameter to address this. This is my RF model code: model = RandomForestClassifier( n_estimators = trial.suggest_int("clw_n_estimators", 10,…
ferdianm10
  • 55
  • 4
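A hedged sketch: class_weight can be searched over the string presets RandomForest accepts, and a NaN cross-validation score (e.g. from a failed fit or an undefined metric) is better surfaced as a pruned trial than returned to Optuna:

```python
import numpy as np
import optuna
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def objective(trial):
    model = RandomForestClassifier(
        n_estimators=trial.suggest_int("n_estimators", 10, 200),
        class_weight=trial.suggest_categorical(
            "class_weight", ["balanced", "balanced_subsample", None]
        ),
    )
    scores = cross_val_score(model, X, y, cv=3, scoring="f1")  # X, y defined elsewhere
    if np.isnan(scores).any():
        raise optuna.TrialPruned()  # fail loudly instead of returning nan
    return scores.mean()
```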
1
vote
0 answers

Optuna, ValueError: Return value must be float-castable. Got 'None'

I am using Optuna for hyperparameter search with the Hydra framework, but it throws this error: values = [float(ret.return_value)] ValueError: Return value must be float-castable. Got 'None'. What does it mean? How do I make the return value…
Kahn8sa
  • 11
  • 1
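With Hydra's Optuna sweeper, this error usually means the @hydra.main task function returned None, typically from a missing return statement; a minimal sketch (train_and_evaluate is a hypothetical helper producing a float):

```python
import hydra
from omegaconf import DictConfig

@hydra.main(config_path="conf", config_name="config", version_base=None)
def main(cfg: DictConfig) -> float:
    loss = train_and_evaluate(cfg)  # hypothetical: must produce a float
    return loss                     # the sweeper optimizes this return value

if __name__ == "__main__":
    main()
```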