Questions tagged [optuna]

Optuna is a hyperparameter optimization framework for Python (versions 2.7 and 3.*) that helps find the best parameters for machine learning models by checking various combinations of parameter values. Site: https://optuna.org

191 questions
4 votes, 1 answer

Optuna Pytorch: returned value from the objective function cannot be cast to float

def autotune(trial): cfg= { 'device' : "cuda" if torch.cuda.is_available() else "cpu", # 'train_batch_size' : 64, # 'test_batch_size' : 1000, # 'n_epochs' : 1, # 'seed' : 0, # …
Tonz • 177
4 votes, 1 answer

Using optuna LightGBMTunerCV as starting point for further search with optuna

I'm trying to use LightGBM for a regression problem (mean absolute error/L1 - or similar like Huber or pseudo-Huber - loss) and I primarily want to tune my hyperparameters. LightGBMTunerCV in optuna offers a nice starting point, but after that I'd…
Björn • 644
3 votes, 2 answers

Use trial.suggest_int to pick values from a given list in optuna, just like trial.suggest_categorical does

I'm working with optuna for hyperparameter tuning of ML models in Python. While defining the objective function for tuning a deep learning model, I tried to define a list of choices from which trial.suggest_int can pick values. For…
Bhanu Chander • 390
3 votes, 0 answers

How to run parallel optuna optimizations programmatically?

I am using Python 3.10. I tried the threading module, but it does not improve overall calculation time because of the Python GIL, as far as I know. I tried the n_jobs parameter of study.optimize, but it didn't work for me. The optimization routine uses a unique instance of…
DrP3x • 53
3 votes, 1 answer

Custom eval metric using early stopping in LGBM (Sklearn API) and Optuna

Questions: The first question is probably extremely stupid but I will ask anyway: is the pruning and the early stopping the same in this example below? Or are they two separate options controlling two separate processes? I got an imbalanced…
Kjetil Haukås • 374
3 votes, 5 answers

PyTorch Lightning "got an unexpected keyword argument 'weights_summary'"

I have been dealing with an error when trying to learn Google's "temporal fusion transformer" algorithm in Anaconda Spyder 5.1.5. It is very important for me to solve this error. The example which I…
osmgnr • 31
3 votes, 2 answers

Has anyone implemented an Optuna hyperparameter optimization for a PyTorch LSTM?

I am trying to implement an Optuna hyperparameter optimization for a PyTorch LSTM, but I do not know how to define my model correctly. When I just use nn.Linear everything works fine, but when I use nn.LSTMCell I get the following…
Joe_Joe • 33
3 votes, 2 answers

Optuna score vs cross_val_score?

An accuracy score from optuna and a score from cross_val_score were different. Why does this occur, and which score should I choose? I used the hyperparameters that I got from optuna in cross_val_score. def objective_lgb(trial): num_leaves =…
Ten • 31
3 votes, 1 answer

Suppressing optuna's cv_agg binary_logloss output

If I tune a model with LightGBMTunerCV I always get this massive output of cv_agg's binary_logloss. If I do this with a bigger dataset, this (unnecessary) I/O slows down the optimization process. Here is the code: from…
3 votes, 2 answers

Function to generate optuna grids provided an sklearn pipeline

I am using sklearn along with optuna for HPO. I would like to create a custom function that takes an sklearn pipeline as input and returns optuna-specific grids. Returning sklearn-specific param grids (i.e. dictionaries) seems to be more…
takmers • 71
3 votes, 1 answer

How to set hidden_layer_sizes in sklearn MLPRegressor using optuna trial

I would like to use Optuna with the sklearn MLPRegressor model. For almost all hyperparameters it is quite straightforward how to set Optuna for them. For example, to set the learning rate: learning_rate_init =…
user88484 • 1,249
3 votes, 1 answer

Suggesting Multivariate Ratios with Bounds in Optuna

I'm currently working with Optuna and I'm trying to suggest ratios that are bounded by multiple variables. Specifically, I'm dealing with ratios X_1, X_2, ..., X_k that are bounded by ∑X_i = 1 and 0 <= X_i <= 1 for all i. Unfortunately, Optuna does…
3 votes, 1 answer

OptKeras (Keras Optuna Wrapper) - use optkeras inside my own class, AttributeError: type object 'FrozenTrial' has no attribute '_field_types'

I wrote a simple Keras code in which I use a CNN for the Fashion-MNIST dataset. Everything works great. I implemented my own class and classification is OK. However, I wanted to use Optuna via OptKeras (the Optuna wrapper for Keras); you can see an example…
yak • 3,770
3 votes, 2 answers

Is it a flaw that Optuna examples return the evaluation metric of the test set?

I am using Optuna for parameter optimization for some models. In almost all the examples the objective function returns an evaluation metric on the TEST set, and tries to minimize/maximize this. I feel like this is a flaw in the examples since…
brian • 31
3 votes, 1 answer

How to fix error "'_BaseUniformDistribution' object has no attribute 'to_internal_repr'" - strange behaviour in optuna

I am trying to use the optuna lib in Python to optimise parameters for recommender systems' models. Those models are custom and look like standard fit-predict sklearn models (with get/set params methods). What I do: a simple objective function that…
roseaysina • 135