How can I select hyperparameters for a HistGradientBoostingClassifier model on big data (more than 10 million rows) without running out of memory? If I use code like this:
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

model_hgbc = HistGradientBoostingClassifier()
# 5 * 4 * 4 * 1 = 80 parameter combinations
param_hgbc = {'learning_rate': [0.001, 0.003, 0.008, 0.05, 0.09],
              'max_iter': [100, 230, 450, 1000],
              'max_depth': [5, 6, 7, 10],
              'random_state': [42]}
hgbc_grid = GridSearchCV(estimator=model_hgbc,
                         param_grid=param_hgbc,
                         scoring='roc_auc_ovo', cv=10,
                         n_jobs=-1).fit(X_train, y_train)
then the search never finishes: it fills all my RAM and the computer stops responding. As far as I can tell, this grid means 80 parameter combinations × 10 CV folds = 800 model fits, and with n_jobs=-1 many of them run in parallel, each working on the full 10-million-row training set.
What should I do in this situation?
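For example, would something along the lines of the sketch below be a reasonable direction? It tunes on a stratified subsample with RandomizedSearchCV, fewer folds, early stopping, and limited parallelism. The 1,000,000-row subsample size, n_iter=15, cv=3, and n_jobs=2 are just placeholder numbers I picked, not values I know to be right for this data size.

from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.model_selection import RandomizedSearchCV, train_test_split

# Tune on a stratified subsample instead of the full 10M+ rows
# (subsample size of 1,000,000 is an arbitrary placeholder)
X_sub, _, y_sub, _ = train_test_split(X_train, y_train,
                                      train_size=1_000_000,
                                      stratify=y_train,
                                      random_state=42)

param_dist = {'learning_rate': [0.001, 0.003, 0.008, 0.05, 0.09],
              'max_iter': [100, 230, 450, 1000],
              'max_depth': [5, 6, 7, 10]}

search = RandomizedSearchCV(
    HistGradientBoostingClassifier(random_state=42,
                                   early_stopping=True),  # cut off long runs
    param_distributions=param_dist,
    n_iter=15,           # sample 15 random combinations instead of all 80
    scoring='roc_auc_ovo',
    cv=3,                # 3 folds instead of 10
    n_jobs=2,            # limit how many copies of the data run in parallel
    random_state=42,
).fit(X_sub, y_sub)

Or is there a better standard approach for tuning gradient boosting models at this scale?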