Hyperopt is a Python library for serial and parallel optimization over awkward search spaces, which may include real-valued, discrete, and conditional dimensions.
Questions tagged [hyperopt]
128 questions
1
vote
0 answers
PySpark fails evaluating keras neural networks within hyperopt SparkTrials
I have a strange bug that I have been stuck on for a few days and can't get solved.
My goal is to evaluate several Keras neural networks within hyperopt. To speed up the evaluation I use SparkTrials (also see…

LockeDerBoss
- 11
- 1
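Since the excerpt is truncated, here is a minimal sketch (not the asker's code) of how SparkTrials is typically wired into fmin for a Keras model; the toy data and network are placeholders.

```python
import numpy as np
from tensorflow import keras
from hyperopt import fmin, tpe, hp, STATUS_OK, SparkTrials

# Toy data so the sketch is self-contained.
x_train = np.random.rand(256, 10)
y_train = np.random.rand(256)

def objective(params):
    # Build and compile the model inside the objective so each Spark worker
    # constructs its own model instead of deserializing a compiled one.
    model = keras.Sequential([
        keras.layers.Dense(int(params["units"]), activation="relu", input_shape=(10,)),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer=keras.optimizers.Adam(params["lr"]), loss="mse")
    history = model.fit(x_train, y_train, epochs=3, verbose=0)
    return {"loss": history.history["loss"][-1], "status": STATUS_OK}

space = {
    "units": hp.quniform("units", 32, 256, 32),
    "lr": hp.loguniform("lr", -7, -2),
}

spark_trials = SparkTrials(parallelism=2)  # each trial runs as a Spark task
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=10, trials=spark_trials)
```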
1
vote
0 answers
Nested parameters in hyperopt 0.2.4
I wanted to do a parameter search in which one parameter depends on another, very similar to what is described in this Stack Overflow question, but when I run it I get the error below:
TypeError: len of pyll.Apply either undefined or…

João Areias
- 1,192
- 11
- 41
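For reference, a dependent parameter is usually expressed by nesting its hp expressions inside the branches of an hp.choice, roughly as in this sketch (labels and branches are illustrative, not the asker's space):

```python
from hyperopt import hp, fmin, tpe

# Conditional space: the parameters inside each branch only exist when that
# branch of hp.choice is selected.
space = hp.choice("classifier", [
    {
        "type": "svm",
        "C": hp.loguniform("svm_C", -3, 3),
        "kernel": hp.choice("svm_kernel", ["linear", "rbf"]),
    },
    {
        "type": "rf",
        "n_estimators": hp.quniform("rf_n_estimators", 50, 500, 50),
        "max_depth": hp.quniform("rf_max_depth", 2, 16, 1),
    },
])

def objective(params):
    # `params` is the plain dict for the chosen branch, e.g. params["type"] == "svm"
    return 0.0  # placeholder loss

best = fmin(objective, space, algo=tpe.suggest, max_evals=10)
```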
1
vote
1 answer
When we specify "--algorithms=sgd" in vw-hyperopt, does it run with adaptive, normalised and invariant updates?
The confusion arises because when we specify --sgd on the vw command line, it runs classic SGD without adaptive, normalised, and invariant updates. So, when we specify the algorithm as sgd in vw-hyperopt, does it run as classic SGD or with those special updates? Is it…

sameershah141
- 338
- 4
- 7
1
vote
0 answers
Error in Bayesian optimization using Hyperopt
I am getting a ValueError in the following code when using Hyperopt for Bayesian optimization:
from hpsklearn import HyperoptEstimator, any_classifier
from sklearn.datasets import fetch_openml
from hyperopt import tpe
import numpy as np
#…

Shaurya Sheth
- 185
- 2
- 11
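The code excerpt is cut off, so purely as a reference point here is a minimal hpsklearn sketch along the same lines; the MNIST subsample and the max_evals/trial_timeout values are illustrative assumptions:

```python
import numpy as np
from hpsklearn import HyperoptEstimator, any_classifier
from hyperopt import tpe
from sklearn.datasets import fetch_openml
from sklearn.model_selection import train_test_split

# Small subsample keeps the sketch quick; the asker's data handling is not shown above.
X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X, y = X[:2000].astype(np.float64), y[:2000]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

estim = HyperoptEstimator(
    classifier=any_classifier("clf"),  # let hpsklearn pick among classifiers
    algo=tpe.suggest,
    max_evals=10,
    trial_timeout=120,
)
estim.fit(X_train, y_train)
print(estim.score(X_test, y_test))
print(estim.best_model())
```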
1
vote
1 answer
How could I use hyperopt to optimize a vector that sums up to 1?
I am using hyperopt to search for the hyperparameters of the algorithm. There are three numbers to be optimized: w1, w2 and w3. The three numbers should satisfy the condition w1+w2+w3=1.
I defined search space like this:
space = {
'w1':…

CoinCheung
- 85
- 10
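One common workaround (a sketch, not necessarily the accepted answer) is to sample three unconstrained values and normalize them inside the objective so they sum to 1:

```python
from hyperopt import hp, fmin, tpe

# Sample three unconstrained positives; normalization enforces w1 + w2 + w3 == 1.
space = {
    "w1": hp.uniform("w1", 0, 1),
    "w2": hp.uniform("w2", 0, 1),
    "w3": hp.uniform("w3", 0, 1),
}

def objective(params):
    total = params["w1"] + params["w2"] + params["w3"]
    w1, w2, w3 = (params[k] / total for k in ("w1", "w2", "w3"))
    # Use the normalized weights here; placeholder loss below.
    return (w1 - 0.5) ** 2 + (w2 - 0.3) ** 2 + (w3 - 0.2) ** 2

best = fmin(objective, space, algo=tpe.suggest, max_evals=50)
```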
1
vote
0 answers
How to perform nested Cross Validation (LightGBM Regression) with Bayesian Hyperparameter optimization and TimeSeriesSplit?
I want to make predictions with a regression model.
I am trying to optimize my LightGBM model's hyperparameters, aiming for the lowest generalization RMSE without overfitting or underfitting.
All examples I've seen use classification and…

Vega
- 2,661
- 5
- 24
- 49
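As a reference, one way to nest a Hyperopt search inside an outer TimeSeriesSplit while scoring on RMSE is sketched below with synthetic data; the parameter ranges and fold counts are illustrative assumptions, not the asker's setup:

```python
import numpy as np
import lightgbm as lgb
from hyperopt import hp, fmin, tpe, STATUS_OK
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

# Synthetic data standing in for the asker's time-series features/target.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = X[:, 0] * 2 + rng.normal(scale=0.1, size=500)

space = {
    "num_leaves": hp.quniform("num_leaves", 8, 128, 1),
    "learning_rate": hp.loguniform("learning_rate", np.log(0.005), np.log(0.2)),
    "min_child_samples": hp.quniform("min_child_samples", 5, 50, 1),
}

def make_objective(X_tr, y_tr):
    inner_cv = TimeSeriesSplit(n_splits=3)
    def objective(params):
        model = lgb.LGBMRegressor(
            num_leaves=int(params["num_leaves"]),
            learning_rate=params["learning_rate"],
            min_child_samples=int(params["min_child_samples"]),
            n_estimators=200,
        )
        # Inner time-series CV on the training fold only.
        rmse = -cross_val_score(model, X_tr, y_tr, cv=inner_cv,
                                scoring="neg_root_mean_squared_error").mean()
        return {"loss": rmse, "status": STATUS_OK}
    return objective

outer_cv = TimeSeriesSplit(n_splits=4)
outer_scores = []
for train_idx, test_idx in outer_cv.split(X):
    X_tr, y_tr = X[train_idx], y[train_idx]
    best = fmin(make_objective(X_tr, y_tr), space, algo=tpe.suggest, max_evals=15)
    model = lgb.LGBMRegressor(
        num_leaves=int(best["num_leaves"]),
        learning_rate=best["learning_rate"],
        min_child_samples=int(best["min_child_samples"]),
        n_estimators=200,
    ).fit(X_tr, y_tr)
    preds = model.predict(X[test_idx])
    outer_scores.append(np.sqrt(np.mean((preds - y[test_idx]) ** 2)))

print("outer RMSE per fold:", outer_scores)
```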
1
vote
0 answers
read hyperopt parameters from json
I want to read hyperopt parameters from a JSON file.
My JSON file would be like:
[
{
"id": "121",
"model": [
{
"model_name": "power",
"estimator_type": [
{
…

George
- 5,808
- 15
- 83
- 160
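Since the JSON excerpt above is truncated, the sketch below uses a hypothetical, simplified schema just to illustrate mapping JSON entries onto hp expressions; the field names are assumptions, not the asker's format:

```python
import json
from hyperopt import hp

# Hypothetical schema: each parameter lists a type and its bounds/options.
raw = json.loads("""
{
  "learning_rate": {"type": "loguniform", "low": -5, "high": -1},
  "max_depth":     {"type": "quniform",   "low": 2,  "high": 10, "q": 1},
  "booster":       {"type": "choice",     "options": ["gbtree", "dart"]}
}
""")

def to_hp(name, spec):
    # Translate one JSON entry into the corresponding hp expression.
    if spec["type"] == "loguniform":
        return hp.loguniform(name, spec["low"], spec["high"])
    if spec["type"] == "quniform":
        return hp.quniform(name, spec["low"], spec["high"], spec["q"])
    if spec["type"] == "choice":
        return hp.choice(name, spec["options"])
    raise ValueError(f"unknown type: {spec['type']}")

space = {name: to_hp(name, spec) for name, spec in raw.items()}
```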
1
vote
4 answers
AttributeError: 'int' object has no attribute 'randint' in Hyperopt
I'm trying to tune a random forest with Hyperopt, but this error keeps occurring and I can't solve it

Felipe Oliveira
- 75
- 1
- 4
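One common cause of this particular message (not necessarily the asker's) is passing a bare integer seed as rstate to fmin, which expects a NumPy random-state/generator object; a hedged sketch of that fix, with the exact type depending on the hyperopt version:

```python
import numpy as np
from hyperopt import fmin, tpe, hp

space = {"max_depth": hp.quniform("max_depth", 2, 20, 1)}

def objective(params):
    return float(params["max_depth"])  # placeholder loss

# Passing rstate=42 (a bare int) triggers the AttributeError, because fmin
# calls methods on that object. Pass a NumPy generator/RandomState instead:
best = fmin(objective, space, algo=tpe.suggest, max_evals=10,
            rstate=np.random.default_rng(42))   # newer hyperopt releases
# rstate=np.random.RandomState(42)              # older hyperopt releases
```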
1
vote
1 answer
Why is hyperopt giving a best loss of NaN when optimizing a Random Forest?
I am solving a Kaggle Problem: https://www.kaggle.com/c/forest-cover-type-prediction/data
I used hyperopt to find the optimal hyperparameters for a Random Forest, but I am stuck: for almost every iteration it reports best loss: NaN.
My…
user9614033
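A frequent cause of a NaN best loss (again, not necessarily the asker's) is the objective itself evaluating to NaN, for example when cross-validation silently returns NaN for an invalid parameter combination; a sketch that casts quniform values to int and surfaces bad trials explicitly:

```python
import numpy as np
from hyperopt import hp, fmin, tpe, STATUS_OK, STATUS_FAIL
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=500, random_state=0)  # stand-in data

space = {
    "n_estimators": hp.quniform("n_estimators", 50, 500, 50),
    "max_depth": hp.quniform("max_depth", 2, 20, 1),
}

def objective(params):
    model = RandomForestClassifier(
        n_estimators=int(params["n_estimators"]),  # quniform yields floats
        max_depth=int(params["max_depth"]),
        random_state=0,
    )
    score = cross_val_score(model, X, y, cv=3, scoring="accuracy").mean()
    if np.isnan(score):
        # Mark the trial as failed instead of letting NaN become the loss.
        return {"loss": np.inf, "status": STATUS_FAIL}
    return {"loss": -score, "status": STATUS_OK}

best = fmin(objective, space, algo=tpe.suggest, max_evals=20)
```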
1
vote
2 answers
XGBoost using Hyperopt. Facing issues while Hyper-Parameter Tuning
I am trying to hyperparameter-tune XGBoostClassifier using Hyperopt, but I am facing an error. Please find below the code that I am using and the error as well:
Step_1: Objective Function
import csv
from hyperopt import STATUS_OK
from timeit import…

Aditya Das
- 35
- 6
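The code and traceback are truncated above, so purely as a reference, a minimal Hyperopt objective for an XGBoost classifier usually looks like this sketch (stand-in data, illustrative ranges):

```python
import numpy as np
import xgboost as xgb
from hyperopt import hp, fmin, tpe, STATUS_OK, Trials
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, random_state=0)  # stand-in data

space = {
    "max_depth": hp.quniform("max_depth", 3, 10, 1),
    "learning_rate": hp.loguniform("learning_rate", np.log(0.01), np.log(0.3)),
    "subsample": hp.uniform("subsample", 0.5, 1.0),
}

def objective(params):
    clf = xgb.XGBClassifier(
        max_depth=int(params["max_depth"]),  # quniform yields floats; xgboost wants ints
        learning_rate=params["learning_rate"],
        subsample=params["subsample"],
        n_estimators=200,
    )
    acc = cross_val_score(clf, X, y, cv=3).mean()
    # fmin minimizes, so negate accuracy and return it under the 'loss' key.
    return {"loss": -acc, "status": STATUS_OK}

trials = Trials()
best = fmin(objective, space, algo=tpe.suggest, max_evals=20, trials=trials)
```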
1
vote
1 answer
Error when running hyperopt fmin function (TypeError: cannot convert dictionary update sequence element #0 to a sequence)
So I am using hyperopt's fmin function to optimize hyperparameters. However, for some reason I am getting this error:
TypeError: cannot convert dictionary update sequence element #0 to a sequence
The code I have is like this:
fn =…

Thomas Kok
- 81
- 3
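The excerpt cuts off before the root cause is visible; for comparison, a minimal working fmin call looks like the sketch below, with fn being the uncalled objective function and the objective returning either a bare number or a dict containing 'loss' and 'status':

```python
from hyperopt import fmin, tpe, hp, STATUS_OK, Trials

space = {"x": hp.uniform("x", -5, 5)}

def objective(params):
    # Return either a plain number, or a dict with at least 'loss' and 'status'.
    return {"loss": (params["x"] - 2) ** 2, "status": STATUS_OK}

best = fmin(
    fn=objective,            # pass the function itself, not objective(space)
    space=space,
    algo=tpe.suggest,
    max_evals=50,
    trials=Trials(),
)
print(best)
```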
1
vote
1 answer
hyperopt - is it possible to get the current value of its search space?
I am wondering if there is a way to access the current value hyperopt has chosen for a parameter. I would like to use the selected value in a learning-rate callback function for xgboost.
from hyperopt import hp
param = {'eta' : hp.uniform('eta', 0.01,…

John Stud
- 1,506
- 23
- 46
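The sampled value arrives as the argument of the objective, so anything that needs it, such as a learning-rate schedule, can simply close over it; a sketch assuming a recent xgboost that provides xgb.callback.LearningRateScheduler (toy data, illustrative schedule):

```python
import xgboost as xgb
from hyperopt import hp, fmin, tpe, STATUS_OK
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=300, random_state=0)  # stand-in data
dtrain = xgb.DMatrix(X, label=y)

space = {"eta": hp.uniform("eta", 0.01, 0.3)}

def objective(params):
    # params["eta"] is the concrete value Hyperopt sampled for this trial;
    # the learning-rate schedule below closes over it.
    def lr_schedule(boosting_round):
        return params["eta"] * (0.99 ** boosting_round)

    result = xgb.cv(
        {"objective": "reg:squarederror", "eta": params["eta"]},
        dtrain,
        num_boost_round=50,
        nfold=3,
        callbacks=[xgb.callback.LearningRateScheduler(lr_schedule)],
    )
    return {"loss": result["test-rmse-mean"].iloc[-1], "status": STATUS_OK}

best = fmin(objective, space, algo=tpe.suggest, max_evals=10)
```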
1
vote
1 answer
only integer scalar arrays can be converted to a scalar index error in Hyperopt
I am trying to optimize a set of parameters using the Hyperopt library. I implemented the code following this tutorial. Everything works fine as long as I set max_evals to less than 30 runs. When I set max_evals to 30, I get the below error at the…

Suleka_28
- 2,761
- 4
- 27
- 43
1
vote
1 answer
how to extract selected hyperparameter from hyperopt hp.choice?
I'm using hyperopt to find the optimal hyperparameters for a CatBoost regressor.
I'm following this guide.
The relevant part is:
ctb_reg_params = {
'learning_rate': hp.choice('learning_rate', np.arange(0.05, 0.31,…

ihadanny
- 4,377
- 7
- 45
- 76
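Note that for hp.choice parameters, fmin returns the index of the selected option; hyperopt.space_eval maps those indices back to the actual values, as in this small sketch (ranges are illustrative):

```python
import numpy as np
from hyperopt import hp, fmin, tpe, space_eval

ctb_reg_params = {
    "learning_rate": hp.choice("learning_rate", np.arange(0.05, 0.31, 0.05)),
    "max_depth": hp.choice("max_depth", np.arange(5, 16, 1, dtype=int)),
}

def objective(params):
    return float(params["learning_rate"])  # placeholder loss

best = fmin(objective, ctb_reg_params, algo=tpe.suggest, max_evals=10)

print(best)                              # indices into the choice lists
print(space_eval(ctb_reg_params, best))  # the actual selected values
```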
1
vote
0 answers
Ray Tune: combine Population Based Training scheduler with Hyperopt
Are Population Based Training (PBT) and HyperOpt search combinable?
The AsyncHyperBandScheduler is used in the Hyperopt example of ray.tune.
Here, config sets some parameters for the run() function:
config = {
"num_samples": 10 if args.smoke_test…

Alexander Vocaet
- 188
- 1
- 10
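Whether PBT can be combined with a search algorithm depends on the Ray version, so no claim is made here; purely as a reference point, this sketch shows how HyperOptSearch is typically paired with a scheduler in Ray ~1.x (module paths have since moved in Ray 2.x), with a toy trainable:

```python
from hyperopt import hp
from ray import tune
from ray.tune.schedulers import AsyncHyperBandScheduler
from ray.tune.suggest.hyperopt import HyperOptSearch  # Ray ~1.x location

def trainable(config):
    # Toy training loop; reports a decreasing loss so the scheduler can act.
    for step in range(10):
        tune.report(mean_loss=(config["lr"] - 0.1) ** 2 / (step + 1))

space = {"lr": hp.loguniform("lr", -5, -1)}  # Hyperopt-defined search space

analysis = tune.run(
    trainable,
    metric="mean_loss",
    mode="min",
    search_alg=HyperOptSearch(space),
    scheduler=AsyncHyperBandScheduler(),  # as in the ray.tune Hyperopt example
    num_samples=20,
)
print(analysis.best_config)
```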