AdaBoost is a meta machine-learning algorithm. It performs several rounds of training, in each of which the best weak classifier is selected. At the end of each round, the training samples that are still misclassified are given higher weights, so that the next round of weak-classifier selection focuses more on them.
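As a quick illustration of the loop described above, here is a minimal scikit-learn sketch (dataset and parameter choices are mine, not part of the tag description); the default weak learner is a depth-1 decision stump, and each of the n_estimators rounds refits it on the reweighted samples:

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# toy data standing in for a real problem
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = AdaBoostClassifier(n_estimators=50, random_state=0)  # 50 boosting rounds
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))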
Questions tagged [adaboost]
255 questions
3 votes, 1 answer
How to understand face detection xml
I have trained faces using opencv_traincascade.exe. I have a series of xml files for different stages.
Each xml file has internal nodes and leafValues; one of them is shown below.
…

batuman (7,066)

3 votes, 2 answers
How to specify the threshold of a weak classifier for the AdaBoost method of a face detector
I have read "Rapid Object Detection using a Boosted Cascade of Simple Features". In Part 3, it defines a weak classifier like this:
My question is: how do I specify the threshold theta_j?
And for the strong classifier, my question is like this:
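For reference, the weak-classifier definition in that paper (reproduced here from the paper itself, since the formula is not visible in this excerpt) is

h_j(x) = \begin{cases} 1 & \text{if } p_j f_j(x) < p_j \theta_j \\ 0 & \text{otherwise,} \end{cases}

where f_j is the feature value, \theta_j the threshold, and p_j a parity indicating the direction of the inequality; the paper picks \theta_j (and p_j) per feature so as to minimize the weighted classification error under the current AdaBoost sample weights.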

tidy (4,747)

3 votes, 2 answers
What is the O() runtime complexity of AdaBoost?
I am using AdaBoost from scikit-learn using the typical DecisionTree weak learners. I would like to understand the runtime complexity in terms of data size N and number of weak learners T. I have searched for this info including in some of the…
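A rough accounting, under the usual assumptions rather than anything stated in scikit-learn's documentation: each boosting round is dominated by fitting one CART tree, which with sort-based split search on dense data costs roughly O(d \cdot N \log N) for N samples and d features, so

\text{training} \approx O(T \cdot d \cdot N \log N), \qquad \text{prediction} \approx O(T \cdot \text{tree depth}) \text{ per sample.}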

cloudyBlues (75)

3 votes, 1 answer
GentleBoost n-ary classifier?
I'm looking for resources or an implementation of n-ary GentleBoost classifiers.
I've seen a number of AdaBoost implementations, and an implementation of GentleBoost in Matlab's Ensemble, but they always seem to be binary.
WEKA, too, has only an AdaBoost…

user961627 (12,379)

3 votes, 2 answers
How to control depth of tree weaklearner with Matlab's fitensemble
I'm using Matlab's fitensemble function on a dataset with 8 features and 5000 samples.
With the following command I can train a model:
ada = fitensemble(datafeatures, dataclass, 'AdaBoostM1', 200, 'tree');
My question: How can I create weak learners with…
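In Matlab, the usual route is to pass a tree template (templateTree, e.g. with a limit such as 'MaxNumSplits') instead of the plain 'tree' string. For comparison, the analogous control in scikit-learn looks roughly like the sketch below (a sketch only, assuming a recent scikit-learn where the keyword is estimator; older releases call it base_estimator):

from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

weak = DecisionTreeClassifier(max_depth=3)                 # explicitly depth-limited weak learner
ada = AdaBoostClassifier(estimator=weak, n_estimators=200)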

Leeor (627)

3 votes, 1 answer
Effects of boosting with strong classifier
What is the effect of boosting with a strong classifier (instead of a weak one, whose error rate is close to random)? Could it be that a strong classifier performs better by itself than when that strong classifier is used in AdaBoost along with a bunch of…

BW0 (757)

3 votes, 1 answer
Combining different machine learning algorithms with boosting in R
Is there a package for R to boost different algorithms, for example Random Forests and neural networks? As I understand it, the ada and gbm packages can only boost decision trees.
Thank you.

ElectricHedgehog (173)

3 votes, 1 answer
Combining LBP and Adaboost
I want to train a dataset for face detection.
I'm going to use LBP features as weak classifiers and AdaBoost to boost them into one strong classifier.
I have positive and negative samples, each 18x18 pixels. I'm dividing each picture into 9…
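One common way to turn such patches into features for a boosted classifier is a per-block LBP histogram; the sketch below is my own illustration of that idea (scikit-image and scikit-learn, with the block size and LBP parameters chosen arbitrarily), not the asker's pipeline:

import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import AdaBoostClassifier

def lbp_feature(img18):
    # img18: an 18x18 grayscale array; 'uniform' LBP with P=8 yields codes 0..9
    lbp = local_binary_pattern(img18, P=8, R=1, method="uniform")
    hists = []
    for i in range(3):                       # 3x3 grid of 6x6 blocks
        for j in range(3):
            block = lbp[i*6:(i+1)*6, j*6:(j+1)*6]
            h, _ = np.histogram(block, bins=10, range=(0, 10), density=True)
            hists.append(h)
    return np.concatenate(hists)             # 9 blocks x 10 bins = 90 features

# Hypothetical usage: `samples` is a list of 18x18 arrays, `labels` has one 1/0 label per sample
# X = np.array([lbp_feature(s) for s in samples])
# AdaBoostClassifier(n_estimators=100).fit(X, labels)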

Can Vural (2,222)

3 votes, 1 answer
R tree-based methods like randomForest, adaboost: interpret result of same data with different format
Suppose my dataset is a 100 x 3 matrix filled with categorical variables. I would like to do binary classification on the response variable. Let's make up a dataset with the following code:
set.seed(2013)
y <-…

Boxuan (4,937)

3 votes, 1 answer
Machine Learning classifier AdaBoost for C#
Is there a popular and stable library in C# for the AdaBoost algorithm?
Does such a library contain different flavors of boosting besides the classic AdaBoost (such as GentleBoost, LogitBoost, etc.)?

Leeor (627)

3 votes, 1 answer
Training a weak learner
I'm implementing an application that uses AdaBoost to classify whether an elephant is an Asian or an African elephant. My input data is:
Elephant size: 235 Elephant weight: 3568 Sample weight: 0.1 Elephant type: Asian
Elephant size: 321 Elephant weight: 4789 …
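The one thing AdaBoost needs from the weak learner is that training honors the per-sample weights. A minimal sketch of such a weak learner in scikit-learn (the numbers below just mirror the question's format and are otherwise made up):

import numpy as np
from sklearn.tree import DecisionTreeClassifier

X = np.array([[235, 3568], [321, 4789]])        # [size, weight] per elephant (illustrative values)
y = np.array(["Asian", "African"])
w = np.array([0.1, 0.1])                        # sample weights maintained by AdaBoost

stump = DecisionTreeClassifier(max_depth=1)     # one-split decision stump as the weak learner
stump.fit(X, y, sample_weight=w)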

gadzix90 (744)

3 votes, 3 answers
weka AdaBoost does not improve results
In my bachelor thesis I am supposed to use AdaBoostM1 with a MultinomialNaiveBayes classifier on a text classification problem. The problem is that in most cases the boosted classifier performs worse than, or at best equal to, MultinomialNaiveBayes without boosting.
I use the…

anti_gone (973)

2 votes, 1 answer
Updating NaiveBayes classifier in matlab
I'm writing a program for online versions of the Bagging and AdaBoost algorithms, and I'm using Matlab's NaiveBayes classifier as the weak learner. Since, as online learners, they should receive data one sample at a time, I have to update the NaiveBayes classifier at…
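For what the incremental-update pattern looks like, here is a sketch of the scikit-learn analogue rather than the Matlab API (GaussianNB exposes partial_fit for exactly this one-sample-at-a-time case; the `stream` iterable is hypothetical):

import numpy as np
from sklearn.naive_bayes import GaussianNB

nb = GaussianNB()
classes = np.array([0, 1])                      # full label set must be declared on the first call
for x_t, y_t in stream:                         # `stream` yields (feature vector, label) pairs, one at a time
    nb.partial_fit(x_t.reshape(1, -1), [y_t], classes=classes)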

Kourosh (274)

2 votes, 2 answers
Why is there a difference in prediction results between AdaBoost with n_estimators=1 using SVC as the base estimator, and SVC alone?
I am currently using daily financial data to fit my SVM and AdaBoost. To check my result, I tried AdaBoost with n_estimators=1 so that it would return the same result as running a single SVM.
from sklearn.ensemble import AdaBoostClassifier
from…
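A sketch of the comparison being described (my own toy setup and parameter choices, assuming a recent scikit-learn where the base-learner keyword is estimator and where SVC needs algorithm="SAMME" because it has no predict_proba); with a single round, differences typically come from how the wrapper calls the base estimator, e.g. the sample weights it passes, rather than from the SVC itself:

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, random_state=0)

svc_alone = SVC(kernel="rbf", random_state=0).fit(X, y)
boosted = AdaBoostClassifier(estimator=SVC(kernel="rbf", random_state=0),
                             n_estimators=1, algorithm="SAMME").fit(X, y)

# count how many training-set predictions differ between the two models
print((svc_alone.predict(X) != boosted.predict(X)).sum())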

배상일 (87)

2 votes, 1 answer
How to get the coefficients of the model using sklearn's AdaBoostClassifier (with Logistic regression as the base estimator)
I have built a model using scikit-learn's AdaBoostClassifier with Logistic regression as the base estimator.
model = AdaBoostClassifier(base_estimator=linear_model.LogisticRegression()).fit(X_train, Y_train)
How do I obtain the coefficients of the…
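A sketch of where those coefficients live (attribute names are standard scikit-learn; the averaging at the end is just one illustrative way to collapse them): the ensemble has no single coefficient vector, but each boosting round's fitted LogisticRegression is kept in model.estimators_ with its own coef_ and intercept_, and model.estimator_weights_ holds AdaBoost's weight for each round.

import numpy as np

round_coefs = [est.coef_.ravel() for est in model.estimators_]   # one LogisticRegression per round
alphas = model.estimator_weights_[:len(round_coefs)]             # AdaBoost weight per fitted round

# e.g. a weight-averaged coefficient vector across rounds (one possible summary)
avg_coef = np.average(np.vstack(round_coefs), axis=0, weights=alphas)
print(avg_coef)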

Leockl (1,906)