Questions tagged [adaboost]

AdaBoost is a meta machine-learning algorithm. It performs several rounds of training, selecting the best weak classifier in each round. At the end of each round, the training samples that are still misclassified are given higher weights, so that the next round's search for a weak classifier focuses more on them.
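
For readers new to the tag, here is a minimal usage sketch with scikit-learn's AdaBoostClassifier (an illustration of mine, not tied to any question below; note that the base-estimator keyword is `estimator` in recent scikit-learn releases and `base_estimator` in older ones):

    # Minimal AdaBoost sketch: boost decision stumps on a synthetic dataset.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=500, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Depth-1 trees (decision stumps) are the classic weak learner for AdaBoost.
    clf = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=1),
                             n_estimators=100, learning_rate=1.0, random_state=0)
    clf.fit(X_train, y_train)
    print(clf.score(X_test, y_test))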

255 questions
6
votes
0 answers

Ratio of positive to negative data to use when training a cascade classifier (opencv)

So I'm using OpenCV's LBP detector. The shapes I'm detecting are all roughly circular (differing mostly in aspect ratio), with some wide changes in brightness/contrast, and a little bit of occlusion. OpenCV's guide on how to train the detector is…
user3765410
  • 101
  • 6
6
votes
2 answers

Combining Weak Learners into a Strong Classifier

How do I combine a few weak learners into a strong classifier? I know the formula, but the problem is that every paper about AdaBoost that I've read gives only formulas without any example. I mean - I have weak learners and their weights, so I…
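
A worked sketch of the combination step the question asks about, with made-up weak learners and weights: the strong classifier is the sign of the alpha-weighted vote, H(x) = sign(sum_t alpha_t * h_t(x)).

    import numpy as np

    # Illustrative weak learners (outputs in {-1, +1}) and their AdaBoost weights alpha_t;
    # the numbers are invented for the example.
    alphas = np.array([0.42, 0.65, 0.92])
    def h1(x): return 1 if x[0] > 0.5 else -1
    def h2(x): return 1 if x[1] > 0.3 else -1
    def h3(x): return -1 if x[0] + x[1] > 1.2 else 1

    def strong_classifier(x):
        votes = np.array([h1(x), h2(x), h3(x)])
        return int(np.sign(np.dot(alphas, votes)))  # weighted majority vote

    print(strong_classifier([0.7, 0.1]))  # 0.42 - 0.65 + 0.92 > 0, so this prints 1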
5
votes
1 answer

Using scikit-learn's MLPClassifier in AdaBoostClassifier

For a binary classification problem I want to use the MLPClassifier as the base estimator in the AdaBoostClassifier. However, this does not work because MLPClassifier does not implement sample_weight, which is required for AdaBoostClassifier (see…
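
A hedged workaround sketch: AdaBoostClassifier needs a base estimator whose fit() accepts sample_weight, which MLPClassifier's does not, so one common fallback is a base estimator that does support it, such as a shallow decision tree (this changes the model rather than making MLPClassifier boostable; the keyword is `estimator` in recent scikit-learn and `base_estimator` in older versions).

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier

    # MLPClassifier.fit() has no sample_weight parameter, so it cannot be boosted directly;
    # a shallow tree accepts sample_weight and can stand in as the weak learner.
    X, y = make_classification(n_samples=300, random_state=0)
    clf = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=3),
                             n_estimators=50, random_state=0).fit(X, y)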
5
votes
2 answers

How to use adaboost with different base estimator in scikit-learn?

I want to use AdaBoost with several different base estimators for regression in scikit-learn, but I can't find any class that can do it. Is there any way to do this other than changing the source code?
modkzs
  • 1,369
  • 4
  • 13
  • 17
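
A hedged sketch of what the question above seems to be after: scikit-learn's AdaBoostRegressor accepts any regressor whose fit() supports sample_weight as its base estimator (the estimators below are just examples; the keyword is `estimator` in recent releases and `base_estimator` in older ones).

    from sklearn.datasets import make_regression
    from sklearn.ensemble import AdaBoostRegressor
    from sklearn.linear_model import LinearRegression
    from sklearn.tree import DecisionTreeRegressor

    X, y = make_regression(n_samples=300, n_features=5, noise=10.0, random_state=0)

    # Any regressor that supports sample_weight can be plugged in as the base estimator.
    boosted_tree = AdaBoostRegressor(estimator=DecisionTreeRegressor(max_depth=4),
                                     n_estimators=100, random_state=0).fit(X, y)
    boosted_linear = AdaBoostRegressor(estimator=LinearRegression(),
                                       n_estimators=100, random_state=0).fit(X, y)
    print(boosted_tree.score(X, y), boosted_linear.score(X, y))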
5
votes
2 answers

AdaBoost ML algorithm python implementation

Does anyone have ideas on how to implement the AdaBoost (BoosTexter) algorithm in Python? Cheers!
Timos K.
  • 79
  • 1
  • 6
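
For the question above, a minimal from-scratch sketch of discrete AdaBoost with decision stumps (my own illustration under simplifying assumptions; it is not BoosTexter itself, which additionally handles text features and multiclass/multilabel output):

    import numpy as np

    def adaboost_train(X, y, n_rounds=50):
        """Discrete AdaBoost with decision stumps; labels y must be in {-1, +1}."""
        n_samples, n_features = X.shape
        w = np.full(n_samples, 1.0 / n_samples)      # start with uniform sample weights
        ensemble = []                                # (alpha, feature, threshold, polarity)
        for _ in range(n_rounds):
            best = None
            # Exhaustively pick the stump with the lowest weighted error.
            for j in range(n_features):
                for thr in np.unique(X[:, j]):
                    for polarity in (1, -1):
                        pred = np.where(polarity * (X[:, j] - thr) > 0, 1, -1)
                        err = np.sum(w[pred != y])
                        if best is None or err < best[0]:
                            best = (err, j, thr, polarity, pred)
            err, j, thr, polarity, pred = best
            err = max(err, 1e-10)                    # guard against division by zero
            alpha = 0.5 * np.log((1 - err) / err)    # weight of this weak learner
            w *= np.exp(-alpha * y * pred)           # up-weight misclassified samples
            w /= w.sum()
            ensemble.append((alpha, j, thr, polarity))
        return ensemble

    def adaboost_predict(ensemble, X):
        score = np.zeros(X.shape[0])
        for alpha, j, thr, polarity in ensemble:
            score += alpha * np.where(polarity * (X[:, j] - thr) > 0, 1, -1)
        return np.sign(score)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 2))
        y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)       # toy labels
        model = adaboost_train(X, y, n_rounds=20)
        print((adaboost_predict(model, X) == y).mean())  # training accuracy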
5
votes
3 answers

Adaboost and forward stagewise additive modeling

Although it wasn't originally conceived this way, the standard Adaboost algorithm is equivalent to conducting a forward stagewise additive model estimation using an exponential loss function. That is, given some weak classifiers c1,...,cM, and…
Charles Pehlivanian
  • 2,083
  • 17
  • 25
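
For reference, the equivalence the question above refers to can be written compactly (standard formulation, e.g. Hastie, Tibshirani and Friedman; the notation is mine):

    % Forward stagewise additive modeling with exponential loss: at stage m,
    % fit the next term greedily while keeping f_{m-1} fixed.
    (\beta_m, c_m) = \arg\min_{\beta, c} \sum_{i=1}^{N} \exp\!\big(-y_i\,[\,f_{m-1}(x_i) + \beta\, c(x_i)\,]\big),
    \qquad f_m(x) = f_{m-1}(x) + \beta_m c_m(x)
    % Solving this recovers AdaBoost: c_m minimizes the weighted error \varepsilon_m
    % under weights w_i^{(m)} \propto \exp(-y_i f_{m-1}(x_i)), and
    % \beta_m = \tfrac{1}{2}\log\frac{1-\varepsilon_m}{\varepsilon_m}.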
5
votes
1 answer

Adabag package in R

I am trying to perform classification using R's adabag package. The following call works perfectly with the ada() function from R's ada package: model <- ada(factor(label) ~ ., data = trainingdata). But when the same training data set is used in the following…
Shahzad
  • 1,999
  • 6
  • 35
  • 44
5
votes
1 answer

Explanation of cascade.xml in a haar classifier

It would be best if someone could explain all the numbers/values in cascade.xml. For example: <_> 3 -8.8384145498275757e-001 <_> …
Yaobin Then
  • 2,662
  • 1
  • 34
  • 54
5
votes
1 answer

questions on implementing AdaBoost algorithm

I am trying to implement the AdaBoost algorithm, and have two questions. 1) At each iteration, the training data has to be re-sampled in accordance with a probability distribution. Should the size of the re-sampled data set be the same as that of…
user288609
  • 12,465
  • 26
  • 85
  • 127
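
On the re-sampling sub-question above: many implementations re-weight rather than re-sample, but when re-sampling is used it is conventionally drawn with replacement from the current weight distribution and kept the same size as the original training set. A hedged sketch (my own, not a definitive answer):

    import numpy as np

    rng = np.random.default_rng(0)

    def resample(X, y, weights):
        """Draw a sample of the original size, with replacement, from the weight distribution."""
        n = len(y)
        idx = rng.choice(n, size=n, replace=True, p=weights / weights.sum())
        return X[idx], y[idx]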
5
votes
1 answer

How Does Viola-Jones With the AdaBoost Algorithm Work in Face Detection?

I've read a lot about the Viola-Jones method, but I still don't understand what "weak classifier", "strong classifier", and "sub-window" mean for rectangle features - what is their definition? And what about the "threshold"? How can I know the threshold value? Can…
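
A hedged illustration of the vocabulary in the question: in Viola-Jones, a weak classifier is a single threshold on one rectangle (Haar-like) feature evaluated over a sub-window, and the strong classifier is a weighted vote over many of them; the threshold is chosen during training to minimize the weighted error. Function and variable names below are mine.

    def weak_classifier(feature_value, threshold, polarity=1):
        """One Viola-Jones weak classifier: threshold a single Haar-like feature value."""
        return 1 if polarity * feature_value < polarity * threshold else 0

    def strong_classifier(feature_values, stumps, alphas):
        """Weighted vote of weak classifiers; the sub-window is accepted when the
        weighted sum reaches half of the total weight (the Viola-Jones rule)."""
        total = sum(a * weak_classifier(fv, thr, p)
                    for fv, (thr, p), a in zip(feature_values, stumps, alphas))
        return total >= 0.5 * sum(alphas)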
5
votes
1 answer

Basic understanding of the Adaboost algorithm

I'm a machine learning newbie trying to understand how AdaBoost works. I've read many articles explaining how AdaBoost makes use of a set of weak classifiers to create a strong classifier. However, I seem to have a problem understanding the statement…
garak
  • 4,713
  • 9
  • 39
  • 56
4
votes
0 answers

Should sklearn.ensemble.AdaBoostClassifier be relying on duplicate estimators (weak learners)?

While analyzing the errors (misclassifications) of an sklearn.ensemble.AdaBoostClassifier using DecisionTreeClassifier stumps as the base estimators, I found that there are a large number of duplicated estimators in the ensemble. Is it typical to…
CK215
  • 175
  • 1
  • 2
  • 7
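
One hedged way to quantify the duplication described above (my own sketch on synthetic data; duplicates are expected because the sample weights change between rounds even when the same stump is re-selected):

    from collections import Counter
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier

    X, y = make_classification(n_samples=400, n_features=6, random_state=0)
    clf = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X, y)  # default base: depth-1 stumps

    # A stump is characterized by the (feature, threshold) split at its root node.
    keys = [(int(t.tree_.feature[0]), round(float(t.tree_.threshold[0]), 6))
            for t in clf.estimators_]
    repeats = sum(c - 1 for c in Counter(keys).values() if c > 1)
    print(f"{repeats} of {len(keys)} stumps repeat an earlier (feature, threshold) pair")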
4
votes
1 answer

scikit adaboost feature_importances_

How exactly does the AdaBoost algorithm implemented in Python (scikit-learn) assign feature importances to each feature? I am using it for feature selection, and my model performs better when I apply feature selection based on the values of feature_importances_.
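
For context, scikit-learn's AdaBoost exposes the attribute as feature_importances_ (plural); as far as I know it is the estimator_weights_-weighted average of each base estimator's own importances. A short self-contained sketch of ranking features with it (synthetic data, names of mine):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier

    X, y = make_classification(n_samples=300, n_features=8, n_informative=3, random_state=0)
    clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X, y)

    # Higher values mean the feature was more useful to the boosted ensemble.
    ranking = np.argsort(clf.feature_importances_)[::-1]
    print(ranking[:3], clf.feature_importances_[ranking[:3]])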
4
votes
1 answer

Use AdaBoost(Boosting) with Accord.Net

I am trying to use AdaBoost (boosting) in Accord.Net. I tried a version of the example given at https://github.com/accord-net/framework/wiki/Classification for decision trees, and it works well with the following code: '' Creates a matrix from the…
4
votes
1 answer

adaboost update weights beta value

Viola-Jones face detection uses the AdaBoost method to train a strong classifier. I am confused by the beta parameter's update policy: why is the beta value chosen like this? What is the purpose of the variable beta in updating the weights? How…
tidy
  • 4,747
  • 9
  • 49
  • 89
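
For reference, the update the question above asks about, as given in the Viola-Jones paper (notation lightly adapted):

    % Weight update at boosting round t, with weighted error \varepsilon_t:
    \beta_t = \frac{\varepsilon_t}{1 - \varepsilon_t}, \qquad
    w_{t+1,i} = w_{t,i}\,\beta_t^{\,1 - e_i}, \qquad
    e_i = \begin{cases} 0 & \text{if sample } i \text{ is classified correctly},\\
                        1 & \text{otherwise}. \end{cases}
    % Since \varepsilon_t < 1/2 gives \beta_t < 1, correctly classified samples are
    % down-weighted, which (after normalization) is the same as giving the misclassified
    % samples relatively more weight. The strong classifier then uses \alpha_t = \log(1/\beta_t).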