I want to use AdaBoost with several base estimators for regression in scikit-learn, but I can't find any class that does this. Is there any way to do it without changing the source code?
2 Answers
You can read this page in the sklearn documentation: adaboost. Personally, I like using XGBoost, GBM, RandomForest, and ExtraTrees as base models and stacking them to get a better AUC score.
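For reference, here is a minimal sketch of what such stacking can look like for the regression setting of the question. It assumes a scikit-learn version that ships sklearn.ensemble.StackingRegressor, and it uses only scikit-learn estimators (no XGBoost) so it runs without extra dependencies; it is illustrative, not part of the original answer.

from sklearn.datasets import make_regression
from sklearn.ensemble import (StackingRegressor, GradientBoostingRegressor,
                              RandomForestRegressor, ExtraTreesRegressor)
from sklearn.linear_model import RidgeCV

# Toy data just to make the example self-contained.
X, y = make_regression(n_samples=200, n_features=10, random_state=0)

# Base models; their cross-validated predictions feed the final estimator.
base_models = [
    ("gbm", GradientBoostingRegressor(random_state=0)),
    ("rf", RandomForestRegressor(random_state=0)),
    ("et", ExtraTreesRegressor(random_state=0)),
]

stacked = StackingRegressor(estimators=base_models, final_estimator=RidgeCV())
stacked.fit(X, y)
print(stacked.predict(X[:5]))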

Andreas Hsieh
- If you want to stack or blend multiple base estimators, you can check out this module on GitHub: [stacked generalization](https://github.com/andreashsieh/stacked_generalization). It was originally developed by dustinstansbury, and I made a patch to make the code work better. – Andreas Hsieh May 08 '16 at 02:55
- I am using AdaBoost for regression, so this module may not help me. – modkzs May 10 '16 at 02:56
- @AndreasHsieh: how do you stack different algorithms? – Alex Sep 24 '16 at 22:41
I may not understand your question, but if all you want is an AdaBoost regressor with a base estimator of your choosing, you can just create the base model and pass it in as the base_estimator parameter of AdaBoostRegressor().
A simple example using AdaBoost with a support vector machine regressor is:
from sklearn.svm import SVR
from sklearn.ensemble import AdaBoostRegressor
my_base_model = SVR()
my_ensemble = AdaBoostRegressor(base_estimator=my_base_model)
You would want to pass parameters into each of these calls to customize the base estimator and the ensemble, but the basic skeleton is as above.
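As one possible sketch of customizing both calls and fitting the ensemble (the parameter values here are purely illustrative; note also that recent scikit-learn releases rename base_estimator to estimator):

from sklearn.datasets import make_regression
from sklearn.svm import SVR
from sklearn.ensemble import AdaBoostRegressor

# Toy data just to make the example runnable.
X, y = make_regression(n_samples=200, n_features=5, random_state=0)

# Customize the base estimator and the ensemble separately.
my_base_model = SVR(kernel="rbf", C=10.0)
my_ensemble = AdaBoostRegressor(base_estimator=my_base_model,  # "estimator" in newer versions
                                n_estimators=50,
                                learning_rate=0.5,
                                random_state=0)

my_ensemble.fit(X, y)
print(my_ensemble.predict(X[:5]))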

David R