AdaBoost is a meta machine-learning algorithm. It performs several rounds of training, selecting the best weak classifier in each round. At the end of each round, the still-misclassified training samples are given a higher weight, so that the selection of a weak classifier in the next round focuses more on those samples.
Questions tagged [adaboost]
255 questions
1 vote · 0 answers
How should I use an LSTM as the weak learner for AdaBoostRegressor?
The specific implementation of base_estimator is not mentioned in the sklearn documentation. I want to use an LSTM as the base_estimator of AdaBoostRegressor, but the approach in the picture doesn't work. How can I design an LSTM as base_estimator? Thank you…

Wanan · 11 · 2
1 vote · 0 answers
Unexpected poor performance of AdaBoost compared to Random Forest
I am working on a lithology identification project similar to the one described here.
So far the Random Forest method has yielded satisfactory results. I decided to compare its performance with that of other algorithms, namely AdaBoost and Support…

Sheldon · 4,084 · 3 · 20 · 41
1 vote · 2 answers
Configuration of GridSearchCV for AdaBoost and its base learner
I'm running a grid search on AdaBoost with DecisionTreeClassifier as its base learner, to get the best parameters for both AdaBoost and the DecisionTree.
The search on a dataset of shape (130000, 22) has been running for 18 hours, so I'm wondering if it's just another…

Edison · 11,881 · 5 · 42 · 50
1 vote · 1 answer
How to use the 'adaboost' method to build classification trees within the caret and fastAdaboost packages in R
Issue
I'm attempting to use the 'adaboost' method within the caret and fastAdaboost packages. My objective is to build a classification tree using machine learning techniques in R for an upcoming project at university, and I am following this…

Alice Hobbs · 1,021 · 1 · 15 · 31
1 vote · 1 answer
Why does an AdaBoost or GradientBoosting ensemble with a single estimator give different values than the single estimator?
I'm curious why a single-estimator AdaBoost "ensemble", a single-estimator Gradient Boosted "ensemble" and a single decision tree give different values.
The code below compares three models, all using the same base estimator (regression tree with…

David R · 994 · 1 · 11 · 27
1 vote · 0 answers
R: adaboost (JOUSBoost package) giving 'Not compatible with requested type'
I have the classic Titanic data. Here is the description of the cleaned data.
> str(titanic)
'data.frame': 887 obs. of 7 variables:
$ Survived : Factor w/ 2 levels "No","Yes": 1 2 2 2 1 1 1 1 2 2 ...
$ Pclass : int…

ycenycute · 688 · 4 · 10 · 20
1 vote · 1 answer
(MATLAB) How do I load an AdaBoost model so that it is compatible with Coder?
I saved my adaboost model as a .mat file. I use this to load the model:
load('adaboost_23.mat')
But MATLAB Coder cannot generate C/C++ code. So I changed to:
coder.load('adaboost_23.mat')
Still not working.
How should I do it? The data type is…

desperateLord · 303 · 2 · 9
1 vote · 1 answer
AdaBoost algorithm hyperparameter tuning in mlr
I'm trying to tune the hyperparameters of the AdaBoost algorithm. The goal is to train a model with a multiclass classification variable as the target. I'm working with the mlr package in R. However, mlr only gives letters (see below), so I'm not sure…

nelepi · 27 · 3
1 vote · 2 answers
AdaBoost Sklearn Feature Importance NaN
I am building an AdaBoost model with sklearn. Last year I made the same model with the same data, and I was able to access the feature importances. This year, when I build the model with the same data, the feature importance attribute contains NaNs. I…

Ruth C · 11 · 3
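For background on this attribute: on a fitted AdaBoost model, `feature_importances_` is the average of the per-tree importances weighted by the estimator weights, so inspecting it for NaN, together with how many boosting rounds actually ran, is a reasonable first debugging step. A minimal sketch on synthetic data:

```python
# Inspect feature_importances_ and the number of boosting rounds that
# actually completed (synthetic data; sketch, not a diagnosis).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

X, y = make_classification(n_samples=300, n_features=8, random_state=0)
model = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, y)

imp = model.feature_importances_
print(len(model.estimators_), np.isnan(imp).any())
```

If `estimator_weights_` are all zero (for example, after degenerate early termination), the normalization can produce NaN — one plausible cause to check, not a confirmed diagnosis of the question above.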
1 vote · 1 answer
Selecting important features with AdaBoost in Python
I want to select important features with AdaBoost. I found that 'yellowbrick.model_selection' is very good and fast for this task, and I used this code, but it has a problem:
"ValueError: could not broadcast input array from shape (260200) into shape (1)"
My…

Amene Vatanparast · 21 · 5
1 vote · 1 answer
Significantly different values for MAPE and MAE on the same dataset
I'm currently running a regression with various forecasting methods on the same dataset.
For DT, the MAE is a little higher than for the AB model, while the MAPE is significantly higher for the AB model. How is this possible? I realize that a lower…

0009 · 49 · 5
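This kind of disagreement is possible because MAE weights all absolute errors equally, while MAPE divides each error by the true value, so errors on small targets dominate it. A tiny worked example (made-up numbers) where the two metrics rank two models in opposite order:

```python
# MAE and MAPE can rank models differently: an error of 1 on a target
# of 1 costs 100% in MAPE terms but only 1 in MAE terms.
import numpy as np

y_true = np.array([1.0, 100.0])
pred_a = np.array([2.0, 100.0])  # small absolute error, on the small target
pred_b = np.array([1.0, 80.0])   # large absolute error, on the large target

def mae(y, p):
    return np.mean(np.abs(y - p))

def mape(y, p):
    return np.mean(np.abs((y - p) / y)) * 100

print(mae(y_true, pred_a), mape(y_true, pred_a))  # 0.5 50.0
print(mae(y_true, pred_b), mape(y_true, pred_b))  # 10.0 10.0
```

Model A wins on MAE (0.5 vs 10.0) yet loses badly on MAPE (50% vs 10%), mirroring the DT-vs-AB situation described in the question.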
1 vote · 0 answers
AdaBoost regressor taking a very long time
I'm trying to use AdaBoost for my time series forecasting with the following code:
df_store = pd.read_pickle('CA_2.pkl')
snap_feature = ['snap_'+store.split('_')[0]]
selected_columns = [column for column in df_store.columns if '7' not in column and…

Saranraj K · 412 · 1 · 7 · 19
1 vote · 0 answers
Using Keras for sklearn AdaBoost with a custom Y parameter
I am trying to apply AdaBoost to a Keras model. The thing is that I have to use a custom loss function (unscaled deviance), which works well when I use Keras inside sklearn's RandomizedSearchCV, but when I try to use AdaBoostRegressor I get…

Lucien Ledune · 11 · 1
1 vote · 1 answer
How to pass parameters to GridSearchCV for AdaBoostClassifier. ERROR: Invalid parameter learning_rate for estimator
I am trying to tune the parameters of my model using GridSearchCV. However, I keep getting the same error telling me that the parameter grid I am passing contains invalid parameters. For example, it keeps telling me "invalid parameter…

Furaha Damién · 15 · 6
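One common cause of this error is a mismatch between the grid keys and the estimator the grid is applied to: when AdaBoost sits inside a `Pipeline`, its parameters need the step-name prefix, and `estimator.get_params().keys()` lists the exact accepted names. A sketch (assuming the step is named "ada"; the pipeline and data here are illustrative):

```python
# "Invalid parameter learning_rate" typically means the grid key does
# not match get_params(); inside a Pipeline the step prefix is required.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=200, random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("ada", AdaBoostClassifier(random_state=0)),
])
# {"learning_rate": [...]} would raise the error here; with the prefix:
grid = {"ada__learning_rate": [0.5, 1.0], "ada__n_estimators": [25, 50]}
search = GridSearchCV(pipe, grid, cv=3).fit(X, y)
print(sorted(search.best_params_))
```

When AdaBoostClassifier is passed to GridSearchCV directly (no pipeline), the unprefixed keys `learning_rate` and `n_estimators` are the correct ones.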
1 vote · 0 answers
How to combine features via online boosting
I would like to combine two features using online boosting.
I have read several papers that explain online boosting and joint features using boosting; the papers are:
Identification of a specific person using color, height, and gait features for a person…

Algabri · 185 · 1 · 2 · 12