AdaBoost is a meta machine-learning (boosting) algorithm. It performs several rounds of training, selecting the best weak classifier in each round. At the end of each round, the still-misclassified training samples are given a higher weight, so that the next round of weak-classifier selection focuses more on those samples.
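A minimal sketch of that loop, using scikit-learn's AdaBoostClassifier, whose default weak classifier is a depth-1 decision tree (a "decision stump"); the toy dataset and parameter values here are illustrative assumptions, not part of the tag description:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

# Toy binary-classification problem (made-up data for illustration).
X, y = make_classification(n_samples=200, random_state=0)

# Each of the 50 boosting rounds fits one weak classifier (a depth-1
# tree by default) on the reweighted samples; samples that are still
# misclassified get a higher weight before the next round.
clf = AdaBoostClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))  # training accuracy
```

The per-round vote weights of the selected weak classifiers end up in `clf.estimator_weights_`.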
Questions tagged [adaboost]
255 questions
0 votes, 1 answer
R: How to read ada's AdaBoost tree rules in 'if-else' conditions?
Does anyone know how to transform the AdaBoost trees (results in R) into if-else conditions?
I have used the caret package in R, along with the train function and method="ada" to obtain some predictions on my dataset.
Afterwards I used the function…

asked by lorelai

0 votes, 1 answer
Training sets for AdaBoost algorithm
How do you find the negative and positive training data sets of Haar features for the AdaBoost algorithm? So say you have a certain type of blob that you want to locate in an image and there are several of them in your entire array - how do you go…

asked by palau1

0 votes, 0 answers
How to incorporate pre-trained perceptrons into AdaBoostClassifier?
I want to use sklearn.ensemble's AdaBoostClassifier for a simple binary classification task. How can I use multiple pre-fit perceptrons as the weak classifiers in an AdaBoostClassifier?
i.e.
from sklearn.ensemble import AdaBoostClassifier
from…

asked by Rob Irwin

0 votes, 0 answers
Adaboost Influence Trimming Takes Longer to Train
The OpenCV documentation states that influence trimming can be used "to reduce the computation time for boosted models without substantially losing accuracy". By default, the weight_trim_rate parameter is 0.95. After disabling influence trimming by…

asked by Radford Parker

0 votes, 1 answer
Is Apache Hama suitable for implementing the AdaBoost algorithm?
I'm interested in implementing the AdaBoost algorithm in a Hadoop environment. From my research, MapReduce can be slow for this due to its lack of native iterative support. Apache Hama is an interesting alternative, but is there any feature of Apache Hama which…

asked by caruso

0 votes, 1 answer
The predict method shows standardized probability?
I'm using the AdaBoostClassifier in scikit-learn and always get an average probability of 0.5, regardless of how unbalanced the training sets are. The class predictions (predict_) seem to give correct estimates, but these aren't reflected in the…

asked by Ola Gustafsson

0 votes, 1 answer
Serialize an AdaBoost classifier (scikit-learn)
I'm trying to use scikit-learn's AdaBoostClassifier, and I'm trying to serialize the output classifier using cPickle to save it to a database or a file, but I get an out-of-memory error, and when I used marshal, it gave me an unmarshallable-object error. So, I'm…
0 votes, 1 answer
Adaboost algorithm and its usage in face detection
I am trying to understand the AdaBoost algorithm, but I am having some trouble. After reading about AdaBoost, I realized that it is a classification algorithm (somewhat like a neural network). But I could not work out how the weak classifiers are chosen (I think…

asked by Hani

0 votes, 1 answer
How is it decided that a feature is a feature of our object?
Can anybody explain how OpenCV makes a decision about a feature of an object when doing train_cascade?

asked by Sodeq

0 votes, 1 answer
Is the Haar-feature threshold calculated only in the way Viola-Jones described in their paper?
I am implementing the Viola-Jones face detection algorithm and am a bit confused about the Haar-feature threshold. I am calculating the threshold of a Haar-feature using the following steps:
a) Calculate the Haar-feature value in all positive (face) images respective to…

asked by user2766019
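As context for the question above: a common way to set a decision-stump threshold in the Viola-Jones/AdaBoost setting is an exhaustive search for the threshold and polarity that minimize the weighted classification error over the training feature responses. The feature values, labels, and weights below are made-up toy data, not taken from the question:

```python
import numpy as np

def best_threshold(values, labels, weights):
    """Return (threshold, polarity, error) minimizing weighted error.

    polarity=+1 classifies value < threshold as positive;
    polarity=-1 classifies value > threshold as positive.
    """
    best = (None, 1, float("inf"))
    for t in np.unique(values):
        for polarity in (1, -1):
            pred = np.where(polarity * values < polarity * t, 1, -1)
            err = weights[pred != labels].sum()
            if err < best[2]:
                best = (t, polarity, err)
    return best

values = np.array([0.2, 0.4, 0.5, 0.8, 0.9])  # toy feature responses
labels = np.array([-1, -1, 1, 1, 1])          # +1 = face, -1 = non-face
weights = np.full(5, 1 / 5)                   # uniform instance weights
print(best_threshold(values, labels, weights))
```

In practice this search is done in a single pass over the sorted feature values, as Viola and Jones describe, but the selected threshold is the same.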
0 votes, 1 answer
Obtain instance weights from AdaBoostM1 in Weka
AdaBoostM1 is a boosting algorithm implemented in Weka. A key component of this algorithm is the reweighting of "hard to classify" instances after each iteration. I want to obtain the weight of each instance that AdaBoostM1 uses for each…

asked by Walter
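Relatedly, the reweighting step the question above asks about can be written out directly. This is the textbook discrete-AdaBoost update, not Weka's internal code (Weka's AdaBoostM1 normalizes slightly differently); labels, predictions, and weights are toy values:

```python
import numpy as np

y_true = np.array([1, 1, -1, -1, 1])   # toy labels in {-1, +1}
y_pred = np.array([1, -1, -1, 1, 1])   # one weak learner's predictions
w = np.full(5, 1 / 5)                  # uniform initial instance weights

err = w[y_true != y_pred].sum()        # weighted training error (0.4 here)
alpha = 0.5 * np.log((1 - err) / err)  # the weak learner's vote weight

w *= np.exp(-alpha * y_true * y_pred)  # up-weight the mistakes
w /= w.sum()                           # renormalize to a distribution
print(w)  # misclassified instances now carry 0.25 each, correct ones 1/6
```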
0 votes, 2 answers
OpenCV haartraining never finishes
This is the first time I use haartraining of opencv.
Just for practice, I used 35 positive images and 45 negative images.
But when I try to train on the data, it never finishes,
even when the parameters are adjusted to extremes
(min hit rate =…

asked by winnerrrr

0 votes, 1 answer
R gbm package: Training Error and Adaboost exponential loss function
I am using the gbm package in R for binary classification, with the AdaBoost exponential loss function. I have two questions:
If I want to see the training error, should I just look at this? (Suppose my model object is called fit,…

asked by Boxuan

0 votes, 0 answers
Display Haar features from a trained AdaBoost classifier
Is there a way to display visually which Haar features are used at every stage of the classifier?
I have recently trained a classifier to detect vehicles, using opencv_traincascade.exe.
For further analysis, I would like to see the features being…

asked by Yaobin Then

0 votes, 0 answers
XML File Creation and Viola-Jones
I am trying to create an XML file to detect hand gestures that is as good and robust as haarcascade_frontalface_alt.xml by Rainer Lienhart. So far, I have tried various training techniques, such as the one posted in…

asked by anna elisa sunga