Questions tagged [optuna]

Optuna is a hyperparameter optimization framework for Python (versions 2.7 and 3.*) that helps find the best parameters for machine learning models by checking various combinations of parameter values. Site: https://optuna.org

191 questions
1 vote · 1 answer

Can I distributively optimize multiple models at the same time?

I understand that I can do distributed optimization with Optuna. However, I don't know if I can do it with multiple models at the same time. For example: optuna create-study --study-name "distributed-example1" --storage…
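A minimal sketch of one possible approach, assuming each model gets its own study on a shared storage URL (the study names, storage path, and objectives below are placeholders):

```python
import optuna

# Two independent studies sharing one RDB storage, so separate worker
# processes can optimize different models at the same time.
storage = "sqlite:///example.db"  # assumed storage URL

study_a = optuna.create_study(study_name="distributed-example1",
                              storage=storage, load_if_exists=True)
study_b = optuna.create_study(study_name="distributed-example2",
                              storage=storage, load_if_exists=True)

def objective_model_a(trial):
    x = trial.suggest_float("x", -10, 10)
    return x ** 2

def objective_model_b(trial):
    y = trial.suggest_float("y", -5, 5)
    return (y - 1) ** 2

# Each worker process calls optimize() on the study it is assigned to;
# more workers can join by loading the same study name and storage.
study_a.optimize(objective_model_a, n_trials=20)
study_b.optimize(objective_model_b, n_trials=20)
```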
1 vote · 1 answer

How to tune conditional objective function using optuna or hyperopt

I tried to use optuna to tune hyperparameters, but my objective function is conditional, which creates issues in getting optimal parameters. I want to get cwc only if the condition is met, otherwise continue the trial with the next hyperparameters. But I guess…
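One hedged way to handle a conditional objective is to prune the trial whenever the condition fails, so the study simply moves on; a self-contained sketch with a made-up condition and metric:

```python
import optuna

def objective(trial):
    # Hypothetical search space standing in for the real hyperparameters.
    alpha = trial.suggest_float("alpha", 0.0, 1.0)
    beta = trial.suggest_float("beta", 0.0, 1.0)

    # Stand-in for the coverage/condition check; replace with your own.
    coverage = alpha + beta

    # If the condition is not met, prune the trial; Optuna records it as
    # pruned and continues with the next set of hyperparameters.
    if coverage < 0.5:
        raise optuna.TrialPruned()

    # Stand-in for the cwc metric, computed only when the condition holds.
    return (alpha - 0.3) ** 2 + (beta - 0.7) ** 2

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
```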
0 votes · 0 answers

suggest_int or suggest_categorical for binary variables?

I'm trying to suggest a binary variable in my Optuna trial. I haven't found a direct trial.suggest_binary, but I guess I could use either trial.suggest_int('var', 0, 1) or trial.suggest_categorical('var', [True, False]). Is there any advantage to use…
Aurelie Navir · 938 · 2 · 7 · 17
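Both suggestions work for a binary choice; a small sketch showing the two options side by side (the objective itself is a dummy just to make it runnable):

```python
import optuna

def objective(trial):
    # suggest_categorical keeps the values as booleans,
    # suggest_int gives plain 0/1 integers; either encodes a binary choice.
    use_feature = trial.suggest_categorical("use_feature", [True, False])
    flag = trial.suggest_int("flag", 0, 1)

    # Dummy objective so the sketch runs end to end.
    return (1.0 if use_feature else 0.0) + flag

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=10)
print(study.best_params)
```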
0 votes · 0 answers

How can I cross-validate pretrained BERT model using Pytorch and Optuna?

I am using a pre-trained BERT model to classify ASR-generated transcript segments and am currently using Optuna to identify the optimal hyperparameters. I wish to modify this code to use cross-validation whilst using Optuna to find the best…
csStudent2102 · 75 · 1 · 7
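A rough sketch of cross-validation inside the objective, with the BERT fine-tuning replaced by a placeholder scoring function (the labels, fold count, and hyperparameter ranges are assumptions):

```python
import numpy as np
import optuna
from sklearn.model_selection import StratifiedKFold

# Placeholder labels standing in for the real transcript-segment dataset.
labels = np.random.randint(0, 2, size=100)

def train_and_score(train_idx, val_idx, lr, epochs):
    # Placeholder: fine-tune the pretrained BERT model on train_idx and
    # return a validation score on val_idx; replace with the real loop.
    rng = np.random.default_rng(len(train_idx) + epochs)
    return float(rng.uniform(0.5, 1.0)) - lr

def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 5e-5, log=True)
    epochs = trial.suggest_int("epochs", 2, 4)

    skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
    scores = [train_and_score(tr, va, lr, epochs)
              for tr, va in skf.split(np.zeros((len(labels), 1)), labels)]

    # Optuna sees the mean cross-validated score for this hyperparameter set.
    return float(np.mean(scores))

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=10)
```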
0 votes · 0 answers

Error while using Optuna for hyperparameter optimization with Hugging Face Trainer

Information: the problem arises in the chapter "Making Transformers Efficient in Production". Describe the bug: while training I am getting a proper F1 score of 0.755940, but while finding the best-fit values of alpha and temperature for the NER task the F1 score…
0 votes · 0 answers

Optuna Coding in Python

I'm trying to code Optuna hyperparameter tuning as a method and then use it for different machine learning algorithms like decision tree, random forest, xgboost, logistic regression, etc. However, the code gives me an error like int is not…
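One way to structure a reusable tuning method is to pass in the estimator class and a search-space function; a sketch under those assumptions, shown here with a random forest on a bundled dataset:

```python
import optuna
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def tune_model(make_model, param_space, X, y, n_trials=20):
    # make_model builds an estimator from sampled params;
    # param_space maps a trial to a dict of hyperparameters.
    def objective(trial):
        model = make_model(**param_space(trial))
        return cross_val_score(model, X, y, cv=3, scoring="accuracy").mean()

    study = optuna.create_study(direction="maximize")
    study.optimize(objective, n_trials=n_trials)
    return study.best_params

X, y = load_breast_cancer(return_X_y=True)

# Example usage with a random forest; other estimators only need their own space.
rf_space = lambda t: {
    "n_estimators": t.suggest_int("n_estimators", 50, 300),
    "max_depth": t.suggest_int("max_depth", 2, 16),
}
print(tune_model(RandomForestClassifier, rf_space, X, y))
```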
0 votes · 0 answers

Why can't Optuna reproduce my LGBM result in the for loop?

I have a simple training task that requires me to train my model on a rolling basis, which means I need to use the previous 12 months of data to predict the next month's label, and I rerun the model every month. I wrote something like: params_dict =…
Leoix · 1 · 1
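Non-reproducible studies are often caused by an unseeded sampler; a minimal sketch that fixes the sampler seed (the objective here is only a stand-in, and any seeds inside the model itself would need fixing as well):

```python
import optuna

def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    return x ** 2

# A seeded sampler makes repeated runs of the study suggest the same values.
sampler = optuna.samplers.TPESampler(seed=42)
study = optuna.create_study(direction="minimize", sampler=sampler)
study.optimize(objective, n_trials=20)
print(study.best_params)
```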
0 votes · 1 answer

Error in search space while performing optuna hyperparameter optimization

the code most likely containing the bug-> import optuna def objective(trial): criterion = trial.suggest_categorical("criterion", ["gini", "entropy"]), min_samples_split = trial.suggest_float('min_samples_split', 0, 1), …
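The trailing commas in that snippet turn each suggested value into a one-element tuple; a corrected sketch, assuming a RandomForestClassifier and the iris data purely for illustration:

```python
import optuna
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def objective(trial):
    # No trailing commas: each call returns a plain value, not a tuple.
    criterion = trial.suggest_categorical("criterion", ["gini", "entropy"])
    min_samples_split = trial.suggest_float("min_samples_split", 0.1, 1.0)

    clf = RandomForestClassifier(criterion=criterion,
                                 min_samples_split=min_samples_split,
                                 n_estimators=100, random_state=0)
    X, y = load_iris(return_X_y=True)
    return cross_val_score(clf, X, y, cv=3).mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
```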
0 votes · 0 answers

When trying to reproduce Optuna optimization results, the precision score differs from the optimization run with XGBoost

After hyperparameter optimization yields the best parameters, when I try to reproduce the result there is a significant difference in results, even with the same dataset and seed value. My optimization code is: def objective(trial): param = { …
azuric · 2,679 · 7 · 29 · 44
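A sketch of one way to keep optimization and reproduction consistent, fixing the sampler seed, the data split, and the untuned model arguments (the dataset and parameter ranges here are placeholders):

```python
import optuna
import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import precision_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

def objective(trial):
    params = {
        "max_depth": trial.suggest_int("max_depth", 2, 8),
        "learning_rate": trial.suggest_float("learning_rate", 0.01, 0.3, log=True),
    }
    model = xgb.XGBClassifier(**params, n_estimators=100, random_state=42)
    model.fit(X_tr, y_tr)
    return precision_score(y_te, model.predict(X_te))

study = optuna.create_study(direction="maximize",
                            sampler=optuna.samplers.TPESampler(seed=42))
study.optimize(objective, n_trials=20)

# To reproduce, retrain with exactly study.best_params plus the same fixed,
# untuned options (n_estimators, random_state) on the same split.
best = xgb.XGBClassifier(**study.best_params, n_estimators=100, random_state=42)
best.fit(X_tr, y_tr)
print(study.best_value, precision_score(y_te, best.predict(X_te)))
```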
0 votes · 0 answers

Discrepancy between Optuna's AUC ROC and scikit-learn's AUC ROC for binary classification problem

I'm working on a binary classification problem where I have ~30 features of enzyme substrates to predict EC1 and EC2. I'm using xgboost with optuna for hyperparameter tuning. However, I'm observing a discrepancy between the AUC ROC values reported…
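Such gaps often come from scoring on different data, or on hard labels instead of probabilities; a minimal sketch that evaluates identically inside and after the study (synthetic data stands in for the real enzyme features):

```python
import optuna
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=30, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

def objective(trial):
    model = xgb.XGBClassifier(
        max_depth=trial.suggest_int("max_depth", 2, 8),
        learning_rate=trial.suggest_float("learning_rate", 0.01, 0.3, log=True),
        n_estimators=200, random_state=0,
    )
    model.fit(X_tr, y_tr)
    # Use predicted probabilities, not hard labels, for ROC AUC.
    return roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])

study = optuna.create_study(direction="maximize",
                            sampler=optuna.samplers.TPESampler(seed=0))
study.optimize(objective, n_trials=20)

# Re-evaluating the best configuration the same way should match best_value.
print(study.best_value, study.best_params)
```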
0 votes · 0 answers

How can Optuna optimize a list hyperparameter?

I'm trying to optimize my hyperparameters with Optuna, but I can't figure out how to tell Optuna that ENCODER NEURONS must be a list whose length is not fixed (it's also a hyperparameter) and whose values are integers (the first must be…
Miorri · 1 · 2
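The usual pattern for a variable-length list is to suggest the length first and then one value per position; a small runnable sketch with a dummy objective:

```python
import optuna

def objective(trial):
    # First suggest how many layers the encoder has, then one width per layer.
    n_layers = trial.suggest_int("n_layers", 1, 4)
    encoder_neurons = [
        trial.suggest_int(f"n_units_l{i}", 16, 256, log=True)
        for i in range(n_layers)
    ]

    # Dummy score so the sketch runs; replace with the real model training.
    return sum(encoder_neurons) / (1000 * n_layers)

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
print(study.best_params)
```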
0 votes · 0 answers

Save the model with the best optuna trial

I have trained a model using optuna (PyTorch) where the number of trials was 5. This is what my output looks like: Now, the 4th trial is the best trial. I don't just want the best parameter values, I want to save the model with these values for lr and…
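One hedged approach is to save each trial's weights under its trial number and reload the file belonging to study.best_trial afterwards; a sketch with a placeholder model and training loop:

```python
import optuna
import torch
import torch.nn as nn

def objective(trial):
    lr = trial.suggest_float("lr", 1e-4, 1e-1, log=True)
    model = nn.Linear(10, 1)                       # placeholder for the real network
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)

    # ... the real training loop using `optimizer` would go here ...
    loss = float(lr)                               # dummy loss so the sketch runs

    # Save this trial's weights keyed by trial number so the best one
    # can be recovered after the study finishes.
    torch.save(model.state_dict(), f"model_trial_{trial.number}.pt")
    return loss

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=5)

# Reload the weights that belong to the best trial.
best_state = torch.load(f"model_trial_{study.best_trial.number}.pt")
print(study.best_trial.number, study.best_params)
```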
0 votes · 1 answer

Optuna - Epoch vs Trial

I am trying to train a model using optuna for hyperparameter optimization. Now in my train function, I am passing all the train images in the dataset to that model in batches of 4. Say I have 20 images, so that means 20/4 = 5 batches of my…
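As a rough illustration of the distinction, a trial is one complete training run for one hyperparameter set, and the epoch loop lives inside the objective (the per-epoch "training" below is a stand-in):

```python
import optuna

def objective(trial):
    # One trial = one full training run with a fixed hyperparameter set.
    lr = trial.suggest_float("lr", 1e-4, 1e-1, log=True)
    n_epochs = 10

    loss = 1.0
    for epoch in range(n_epochs):           # epochs loop inside a single trial
        loss *= (1.0 - lr)                   # stand-in for one epoch of training
        trial.report(loss, step=epoch)       # optionally report per-epoch progress
        if trial.should_prune():             # let a pruner stop bad trials early
            raise optuna.TrialPruned()

    return loss                              # final metric for this trial

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)       # the study runs many trials
```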
0 votes · 0 answers

ValueError when tuning LGBM Regressor with Optuna based on MAE and RMSE

I am trying to tune the LGBM regressor based on RMSE and MAE. From what I understand this should be done by returning the metrics from the objective function for the optuna study. I read this and this article which describe something similar.…
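Optimizing on both metrics at once can be done with a multi-objective study that returns two values and declares two directions; a sketch with synthetic data standing in for the real set:

```python
import numpy as np
import optuna
import lightgbm as lgb
from sklearn.datasets import make_regression
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=10, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

def objective(trial):
    model = lgb.LGBMRegressor(
        num_leaves=trial.suggest_int("num_leaves", 8, 128, log=True),
        learning_rate=trial.suggest_float("learning_rate", 0.01, 0.3, log=True),
        n_estimators=200,
    )
    model.fit(X_tr, y_tr)
    pred = model.predict(X_val)

    mae = mean_absolute_error(y_val, pred)
    rmse = float(np.sqrt(mean_squared_error(y_val, pred)))
    return mae, rmse  # two values, matching the two directions below

# A multi-objective study needs one direction per returned value.
study = optuna.create_study(directions=["minimize", "minimize"])
study.optimize(objective, n_trials=20)
print(study.best_trials)  # Pareto-optimal trials, not a single best trial
```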
0 votes · 1 answer

Optuna 3.2 + Optuna Dashboard 0.10.3 - study page going blank (content disappears)

I had an issue with Optuna 3.2 + Optuna Dashboard 0.10.3 (using Firefox as my browser; the issue would probably be the same in other browsers). When I opened the study page, something appeared briefly, but then it disappeared, and only a blank…
eXPRESS · 425 · 2 · 4 · 19