AdaBoost is a machine-learning meta-algorithm. It performs several rounds of training, selecting the best weak classifier in each round. At the end of each round, the still-misclassified training samples are given a higher weight, so the next round of weak-classifier selection focuses on those samples.
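The round structure described above can be sketched in a few lines. This is a minimal illustration, not a production implementation; the function name, the label encoding in {-1, +1}, and the interface of the weak classifiers are assumptions made for the example.

```python
import numpy as np

def adaboost_round(X, y, weights, weak_classifiers):
    """One AdaBoost round: pick the weak classifier with the lowest
    weighted error, then up-weight the samples it misclassified so the
    next round focuses on them. Labels are assumed to be in {-1, +1},
    and each weak classifier maps X to predictions in {-1, +1}."""
    errors = [np.sum(weights[h(X) != y]) for h in weak_classifiers]
    best = int(np.argmin(errors))
    err = np.clip(errors[best], 1e-10, 1 - 1e-10)  # guard against log(0)
    alpha = 0.5 * np.log((1 - err) / err)          # this classifier's vote
    missed = weak_classifiers[best](X) != y
    weights = weights * np.exp(np.where(missed, alpha, -alpha))
    return best, alpha, weights / weights.sum()    # renormalize to sum to 1
```

The final strong classifier is then the weighted vote sign(sum of alpha_t * h_t(x)) over the classifiers selected in each round.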
Questions tagged [adaboost]
255 questions
2
votes
2 answers
How do I use AdaBoost for feature selection?
I want to use AdaBoost to choose a good set of features from a large number (~100k). AdaBoost works by iterating through the feature set and adding features based on how well they perform. It chooses features that perform well on samples that were…

Robert
- 37,670
- 37
- 171
- 213
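When the weak learners are single-feature decision stumps, the distinct features picked across boosting rounds form a selected subset, which is the usual reading of "AdaBoost as feature selection." A rough sketch of that idea, assuming labels in {-1, +1} and a small dense feature matrix; the function name and round count are made up for illustration:

```python
import numpy as np

def select_features_adaboost(X, y, n_rounds=5):
    """AdaBoost-style feature selection: each round fits one-feature
    decision stumps, keeps the (feature, threshold, polarity) triple with
    the lowest weighted error, and reweights the samples. The distinct
    feature indices chosen across rounds are the selected subset."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)
    chosen = []
    for _ in range(n_rounds):
        best = None  # (weighted error, feature, threshold, polarity)
        for j in range(d):
            for t in np.unique(X[:, j]):
                for p in (1, -1):
                    pred = np.where(p * (X[:, j] - t) > 0, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, t, p)
        err, j, t, p = best
        err = np.clip(err, 1e-10, 1 - 1e-10)   # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(p * (X[:, j] - t) > 0, 1, -1)
        w *= np.exp(np.where(pred != y, alpha, -alpha))
        w /= w.sum()
        chosen.append(j)
    return sorted(set(chosen))
```

With ~100k features the exhaustive stump search above is the bottleneck; real implementations sort each feature's values once and sweep thresholds incrementally rather than re-scoring every candidate from scratch.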
2
votes
1 answer
Adaptive Boosting vs. SVM
I am working on a binary classification case and comparing the performance of different classifiers. Testing the AdaBoost algorithm (with a decision tree as its base classifier) against SVM on multiple datasets, I found that the…

AliCivil
- 2,003
- 6
- 28
- 43
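For a quick head-to-head of the kind described, cross-validation is the usual setup. A hedged sketch assuming scikit-learn is available; AdaBoostClassifier's default base estimator is a depth-1 decision tree (a stump), and the dataset here is synthetic rather than one of the asker's:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic binary classification problem standing in for a real dataset.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# AdaBoost with its default base learner, a depth-1 decision tree (stump).
ada = AdaBoostClassifier(n_estimators=100, random_state=0)
# RBF-kernel SVM with default regularization.
svm = SVC(kernel="rbf")

ada_acc = cross_val_score(ada, X, y, cv=5).mean()
svm_acc = cross_val_score(svm, X, y, cv=5).mean()
print(f"AdaBoost: {ada_acc:.3f}  SVM: {svm_acc:.3f}")
```

Which method wins is dataset-dependent; comparing mean cross-validated accuracy on each dataset, ideally with a significance test over the folds, is more informative than a single train/test split.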
2
votes
0 answers
Should I train my weak classifier at each AdaBoost iteration?
I'm rather new to machine learning, and even to programming itself, so I'm sorry if the questions I'm about to ask don't make much sense. I've been using 5 different, and not so weak, classifiers (5 neural networks with error rates around 0.25-0.3),…

Milan Stefanovic
- 31
- 1
- 3
2
votes
1 answer
Using foreach for parallel boosting in R
I routinely use the foreach package for training random forests in R, and I'm trying to find a rough equivalent for training adaboost models, but I'm running into the problem of how to combine the results. The randomForest package has the 'combine'…

TomR
- 546
- 8
- 19
2
votes
0 answers
Weight response with sampsize for unbalanced data in randomForest
I am new to machine learning and R.
I tried to fit several models with R, including trees, boosted trees, random forests, AdaBoost, SVM, and logistic regression.
In my case, probability that the rare event (class 1) occurs in the training data is…

user3365537
- 21
- 2
2
votes
3 answers
LBP Face Detection
I want to implement a face detection algorithm that does not take a lot of training time. I looked at the Viola-Jones method, but the training time is too long. I read about LBP and how it is used in face detection. I want to implement it in C on a…

medz91
- 105
- 3
- 13
2
votes
1 answer
What if each round of boosting selects the same Haar feature in the Viola-Jones face detection method?
I am implementing Viola-Jones face detection to detect human faces. While training using AdaBoost, each boosting round selects the same Haar feature. For example, if the selected Haar feature (x,y,w,h,f,p) for the first three rounds is (0,0,4,2,1,0),…

user2766019
- 577
- 4
- 7
- 20
2
votes
0 answers
OpenCV: Training a soft cascade classifier
I've built an algorithm for pedestrian detection using OpenCV tools. To perform classification I use a boosted classifier trained with the CvBoost class.
The problem with this implementation is that I need to feed my classifier the whole set of…

Pedro Batista
- 1,100
- 1
- 13
- 25
2
votes
1 answer
AdaBoost feature selection
I am trying to train an AdaBoost classifier using the OpenCV library for visual pedestrian detection.
I've come across the notion that AdaBoost allows selection of the most relevant features, meaning that if I harvest 50,000 features from images…

Pedro Batista
- 1,100
- 1
- 13
- 25
2
votes
1 answer
OpenCV C++: Problems working with CvBoost (AdaBoost classifier)
I'm creating an application for classifying humans in images of urban settings.
I train a classifier in the following manner:
int main (int argc, char **argv)
{
/* STEP 2. Opening the file */
//1. Declare a structure to keep the data
CvMLData…

Pedro Batista
- 1,100
- 1
- 13
- 25
2
votes
1 answer
Viola-Jones AdaBoost running out of memory before it even starts
I'm implementing the Viola-Jones algorithm for face detection. I'm having issues with the first stage of the AdaBoost learning part of the algorithm.
The original paper states
The weak classifier selection algorithm proceeds as follows. For each…

robev
- 1,909
- 3
- 23
- 32
2
votes
1 answer
Viola Jones Experiments (training sets)
It is said that "4916 positive training examples were hand picked, aligned, normalized, and scaled to a base resolution of 24x24. 10,000 negative examples were selected by randomly picking sub-windows from 9500 images which did not contain faces." In the…

Koji Ikehara
- 117
- 2
- 9
2
votes
1 answer
How does the data mining technique AdaBoost work?
I came across the data mining technique AdaBoost, but I cannot find much information on how it works or any examples I can go through. Can someone please shed some light on this area?
Also, I would like to give prediction and exploration of…

godzilla
- 3,005
- 7
- 44
- 60
2
votes
1 answer
Decision Stumps
I'd like to implement a Java application using AdaBoost which classifies whether an elephant is an African or an Asian elephant. My Elephant class has the fields:
int size;
int weight;
double sampleWeight;
ElephantType type; // (which can be Asian or African).
I'm…

gadzix90
- 744
- 2
- 13
- 28
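A decision stump for data like this thresholds a single attribute. A minimal sketch (in Python rather than Java, with hypothetical field names mirroring the class above) of finding the threshold with the lowest weighted error on one attribute:

```python
from dataclasses import dataclass

@dataclass
class Elephant:
    size: int
    weight: int
    sample_weight: float
    kind: str  # "Asian" or "African"

def best_stump(elephants, attr):
    """Scan candidate thresholds on one attribute and return the one
    with the lowest weighted error, where values strictly above the
    threshold are classified as "African"."""
    best_t, best_err = None, float("inf")
    for t in sorted({getattr(e, attr) for e in elephants}):
        err = sum(e.sample_weight for e in elephants
                  if (getattr(e, attr) > t) != (e.kind == "African"))
        if err < best_err:
            best_t, best_err = t, err
    return best_t, best_err
```

AdaBoost would run this search over every attribute, keep the stump with the lowest weighted error as that round's weak classifier, then increase the sample_weight of the misclassified elephants before the next round.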
1
vote
3 answers
Implementing Adaboost for multiple dimensions in Java
I'm working on AdaBoost implementation in Java.
It should work with "double" coordinates in 2D, 3D, or 10D.
All I found for Java is for binary data (0,1), not for a multi-dimensional space.
I'm currently looking for a way to represent the…

Gil Piven
- 19
- 4