This tag is for questions about running an exhaustive search over specified parameter values for an estimator with the GridSearchCV class from Python's scikit-learn library.
Questions tagged [gridsearchcv]
597 questions
4
votes
1 answer
AttributeError: 'str' object has no attribute 'parameters' due to new version of sklearn
I am doing topic modeling using sklearn. While trying to get the log-likelihood from the grid search output, I am getting the error below:
AttributeError: 'str' object has no attribute 'parameters'
I think I understand the issue, which is:…

Piyush Ghasiya
- 515
- 7
- 25
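A likely cause: recent scikit-learn releases removed the `grid_scores_` attribute, and `cv_results_` is a plain dict of parallel arrays, so iterating over it yields its string keys, which matches this error. A minimal sketch of the replacement pattern (the tiny LDA setup below is illustrative, not the asker's data):

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.model_selection import GridSearchCV

# Tiny illustrative document-term matrix (LDA needs non-negative counts)
rng = np.random.RandomState(0)
X = rng.randint(0, 5, size=(40, 12))

param_grid = {"n_components": [2, 3], "learning_decay": [0.5, 0.7]}
search = GridSearchCV(LatentDirichletAllocation(random_state=0), param_grid, cv=2)
search.fit(X)

# cv_results_ is a dict of parallel arrays, not a list of score objects:
# pair each parameter dict with its mean cross-validated log-likelihood.
scores = [
    (params, mean)
    for params, mean in zip(search.cv_results_["params"],
                            search.cv_results_["mean_test_score"])
    if params["learning_decay"] == 0.5
]
```

`mean_test_score` here is the mean of `LatentDirichletAllocation.score`, i.e. an approximate log-likelihood, so no explicit `scoring` argument is needed.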
4
votes
0 answers
Is there any way to use the fit_generator() method with the KerasRegressor wrapper?
I am trying to tune hyperparameters for my LSTM model using GridSearchCV, but I have used TimeseriesGenerator from keras.preprocessing.sequence. How do I modify the KerasRegressor wrapper to accommodate fit_generator() instead of the fit() method?
def…

Vinay Bharath
- 63
- 2
4
votes
0 answers
How to use GridSearchCV on LSTM model?
I am not sure how to use GridSearchCV to optimize the LSTM model. I went through this tutorial: https://machinelearningmastery.com/grid-search-hyperparameters-deep-learning-models-python-keras/; however, they did things separately. I am…

Luis Torres
- 41
- 1
- 3
4
votes
1 answer
Preprocessing in GridSearchCV
I'm using GridSearchCV for tuning hyperparameters, and now I want to apply normalization (StandardScaler()) in the training and validation steps. But I think I cannot do this.
The question is:
If I apply the preprocessing step to the whole training set and…

Puntawat Ponglertnapakorn
- 83
- 1
- 8
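The standard answer to this question is to put the scaler inside a Pipeline and search over the pipeline, so the scaler is re-fit on each training fold only and the validation fold never leaks into the scaling statistics. A minimal sketch (the SVC and parameter values are illustrative):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# The scaler lives inside the pipeline, so GridSearchCV fits it on each
# training fold only; the held-out fold is transformed, never fit on.
pipe = Pipeline([("scale", StandardScaler()), ("svc", SVC())])
param_grid = {"svc__C": [0.1, 1, 10]}

search = GridSearchCV(pipe, param_grid, cv=3)
search.fit(X, y)
```

Fitting the scaler on the whole training set before cross-validation would leak validation-fold statistics into training, which is exactly the problem the asker suspects.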
4
votes
1 answer
How to get the selected features from GridSearchCV in sklearn in Python
I am using recursive feature elimination with cross-validation (RFECV) as the feature selection technique with GridSearchCV.
My code is as follows.
X = df[my_features_all]
y = df['gold_standard']
x_train, x_test, y_train, y_test =…

EmJ
- 4,398
- 9
- 44
- 105
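One way to answer this: when the RFECV object is the estimator handed to GridSearchCV, the refit `best_estimator_` is itself an RFECV, so its `support_` mask gives the selected features. A sketch under that assumption (data and parameter values are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=60, n_features=8, n_informative=3,
                           random_state=0)

rfecv = RFECV(LogisticRegression(max_iter=500), cv=3)
# Tune the wrapped estimator's hyperparameters via the `estimator__` prefix.
search = GridSearchCV(rfecv, {"estimator__C": [0.1, 1.0]}, cv=3)
search.fit(X, y)

# best_estimator_ is the refit RFECV; support_ is the boolean mask of
# features it kept.
selected_mask = search.best_estimator_.support_
```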
4
votes
0 answers
Tuning the ratio of random undersampling with grid search on an unbalanced data set
As the title suggests, I would like to do a grid search on the ratio of my random undersampler. I would like to try the ratios 10, 15 and 20, where ratio = 10 means the number of resampled majority-class samples divided by the number of minority-class samples.
Example: if the…

Tammem Sa
- 51
- 3
4
votes
1 answer
How to set your own scoring with GridSearchCV from sklearn for regression?
I used to use GridSearchCV(...scoring="accuracy"...) for classification models, and now I am about to use GridSearchCV for a regression model and set scoring to my own error function.
Example code:
def rmse(predict, actual):
    predict =…

will Park
- 73
- 1
- 8
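The usual route is `sklearn.metrics.make_scorer`: wrap the error function and pass `greater_is_better=False` so GridSearchCV knows a lower RMSE is better (it then maximises the negated value). A sketch, with an illustrative Ridge model and synthetic data rather than the asker's:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.metrics import make_scorer
from sklearn.model_selection import GridSearchCV

def rmse(actual, predict):
    # scikit-learn calls custom metrics as metric(y_true, y_pred)
    return np.sqrt(np.mean((np.asarray(actual) - np.asarray(predict)) ** 2))

# greater_is_better=False: GridSearchCV stores the score negated, so
# best_score_ will be a non-positive number (minus the RMSE).
rmse_scorer = make_scorer(rmse, greater_is_better=False)

X, y = make_regression(n_samples=80, n_features=5, noise=0.5, random_state=0)
search = GridSearchCV(Ridge(), {"alpha": [0.1, 1.0, 10.0]},
                      scoring=rmse_scorer, cv=3)
search.fit(X, y)
```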
3
votes
1 answer
Is there a way to get all computed coefficients in a GridSearchCV?
I am trying different ML models, all using a pipeline which includes a transformer and an algorithm, 'nested' in a GridSearchCV to find the best hyperparameters.
When running Ridge, Lasso and ElasticNet regressions, I would like to store all the…

Greg Mansio
- 109
- 5
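Worth noting here: GridSearchCV only retains the single refit `best_estimator_`, so the coefficients of the other candidates are discarded. One workaround is to iterate `ParameterGrid` yourself and keep each fitted `coef_`. A minimal sketch with an illustrative Ridge grid:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import ParameterGrid

X, y = make_regression(n_samples=50, n_features=4, random_state=0)

# Fit every candidate explicitly so each coefficient vector survives;
# GridSearchCV itself would keep only the best refit model.
coefs = {}
for params in ParameterGrid({"alpha": [0.1, 1.0, 10.0]}):
    model = Ridge(**params).fit(X, y)
    coefs[params["alpha"]] = model.coef_
```

This trades away GridSearchCV's parallelism and scoring bookkeeping for full access to every fitted model.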
3
votes
1 answer
How to save the best estimator in GridSearchCV?
When faced with a large dataset, I need to spend a day using GridSearchCV() to train an SVM with the best parameters. How can I save the best estimator so that I can use this trained estimator directly when I start my computer next time?

Lancdorr
- 335
- 4
- 14
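The standard answer is to persist the refit `best_estimator_` with joblib and reload it later without re-running the search. A sketch (the SVC grid and file name are illustrative):

```python
import joblib
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
search = GridSearchCV(SVC(), {"C": [0.1, 1.0]}, cv=3)
search.fit(X, y)

# Save only the refit best estimator; on the next session, joblib.load
# restores a ready-to-predict model with no retraining.
joblib.dump(search.best_estimator_, "best_svm.joblib")
restored = joblib.load("best_svm.joblib")
```

Saving the whole GridSearchCV object also works, but the best estimator alone is smaller and is usually all that is needed at prediction time.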
3
votes
1 answer
Using GridSearchCV best_params_ gives poor results
I'm trying to tune hyperparameters for KNN on a quite small dataset (Kaggle Leaf, which has around 990 rows):
def knnTuning(self, x_train, t_train):
    params = {
        'n_neighbors': [1, 2, 3, 4, 5, 7, 9],
        'weights': ['uniform',…

Timothee W
- 149
- 7
3
votes
1 answer
How to compare baseline and GridSearchCV results fairly?
I am a bit confused about comparing the best GridSearchCV model with a baseline.
For example, suppose we have a classification problem.
As a baseline, we'll fit a model with default settings (let it be logistic regression):
from sklearn.linear_model import…

Lisbeth
- 141
- 1
- 6
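A common way to make this comparison fair is to score both models on the same outer folds, nesting the grid search inside `cross_val_score` so the tuning never sees the outer test folds (nested cross-validation). A sketch under that assumption, with an illustrative dataset and grid:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score

X, y = load_breast_cancer(return_X_y=True)
cv = KFold(n_splits=3, shuffle=True, random_state=0)

# Baseline: default settings, scored on the shared outer folds.
baseline = cross_val_score(LogisticRegression(max_iter=2000), X, y, cv=cv).mean()

# Tuned: the search runs inside each outer training fold, so the outer
# test folds stay unseen during tuning.
search = GridSearchCV(LogisticRegression(max_iter=2000),
                      {"C": [0.1, 1.0, 10.0]}, cv=3)
tuned = cross_val_score(search, X, y, cv=cv).mean()
```

Comparing the baseline's test score against the search's own `best_score_` would be unfair, since the latter is selected on the very folds it is reported on.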
3
votes
1 answer
Multiple values for a single parameter in the mlflow run command
I just started learning mlflow and wanted to know how to pass multiple values to each parameter in the mlflow run command.
The objective is to pass a dictionary to GridSearchCV as a param_grid to perform cross validation.
In my main code, I…

Downforu
- 317
- 5
- 13
3
votes
0 answers
Correct way to do oversampling, grid search and cross-validation together?
For a given dataset with input features X_all and prediction labels y_all for a binary classification problem, I want to oversample my data using SMOTE, find the best parameters for my learning algorithm using GridSearchCV and then observe the results…

Vasu Mistry
- 781
- 2
- 6
- 18
3
votes
1 answer
GridSearchCV and Google Colab: n_jobs=-1 does not work
The problem is the following: when running these lines of code
grid = GridSearchCV(estimator=model, param_grid=param_grid, n_jobs=-1)
grid_result = grid.fit(X, y)
on Google Colab, it returns the error
PicklingError: Could not pickle the task to…

Alessandro
- 764
- 3
- 8
- 22
3
votes
1 answer
Check the list of available parameters with `estimator.get_params().keys()`
When I try to run a RandomForestClassifier with Pipeline and param_grid:
pipeline = Pipeline([("scaler", StandardScaler()),
                     ("rf", RandomForestClassifier())])
from sklearn.model_selection import GridSearchCV
param_grid = {
…

rnv86
- 790
- 4
- 10
- 22
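This error message typically means a `param_grid` key is missing the pipeline step prefix: inside a Pipeline, every parameter must be addressed as `<step>__<param>`, e.g. `rf__n_estimators` rather than `n_estimators`, and the valid names are exactly `pipeline.get_params().keys()`. A minimal sketch (the grid values are illustrative):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
pipeline = Pipeline([("scaler", StandardScaler()),
                     ("rf", RandomForestClassifier(random_state=0))])

# The step name "rf" becomes the prefix for every forest parameter;
# get_params() lists exactly the names GridSearchCV will accept.
assert "rf__n_estimators" in pipeline.get_params()

param_grid = {"rf__n_estimators": [10, 20]}
search = GridSearchCV(pipeline, param_grid, cv=3)
search.fit(X, y)
```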