AdaBoost (Adaptive Boosting) is a meta machine learning algorithm. It performs several rounds of training, selecting the best weak classifier in each round. At the end of each round, the training samples that are still misclassified are given a higher weight, so the next round of weak-classifier selection focuses more on them. The final strong classifier combines the selected weak classifiers in a weighted vote.
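A minimal sketch of that loop (discrete AdaBoost), assuming scikit-learn decision stumps as the weak classifiers and NumPy arrays X, y with labels in {-1, +1}; names such as adaboost_fit and n_rounds are illustrative only:

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """Discrete AdaBoost; y must be a NumPy array of -1/+1 labels."""
    n = len(y)
    w = np.full(n, 1.0 / n)                       # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)          # the weak learner sees the current weights
        pred = stump.predict(X)
        err = max(np.sum(w[pred != y]), 1e-10)    # weighted training error
        if err >= 0.5:                            # no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / err)     # vote weight of this weak classifier
        w *= np.exp(-alpha * y * pred)            # misclassified samples get heavier
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    return np.sign(sum(a * s.predict(X) for s, a in zip(stumps, alphas)))
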
Questions tagged [adaboost]
255 questions
0
votes
1 answer
How to implement decision trees in boosting
I'm implementing AdaBoost (boosting) that will use CART and C4.5. I have read about AdaBoost, but I can't find a good explanation of how to combine AdaBoost with decision trees. Let's say I have a data set D that has n examples. I split D into TR training examples and…
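A hedged sketch of the usual way the two are combined: AdaBoost drives the sample reweighting, and a depth-limited tree is the weak learner. In scikit-learn, DecisionTreeClassifier is CART; C4.5 has no scikit-learn implementation, so that part would need any learner that accepts per-sample weights (or weighted resampling). X_train and y_train stand in for the questioner's split of D:

from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

clf = AdaBoostClassifier(
    # shallow CART tree as the weak learner; the argument is named
    # base_estimator in older scikit-learn releases
    estimator=DecisionTreeClassifier(max_depth=2),
    n_estimators=100,
)
# clf.fit(X_train, y_train)
# clf.predict(X_test)
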

user3785803
- 41
- 8
0
votes
1 answer
Building Strong Classifier from Weak Learners for Adaboost and Using It to Predict Data
I'm trying to understand how AdaBoost works. I know the basic idea of it. I've read many explanations of it, but I couldn't understand how to build the strong classifier from the weak classifiers.
I believe the formula to build it goes something like:
c…
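For reference, the formula the question is reaching for is usually written H(x) = sign(sum_t alpha_t * h_t(x)): each weak classifier h_t votes, and its vote is scaled by its weight alpha_t. A minimal sketch, with hypothetical lists weak_classifiers and alphas (not taken from the question):

import numpy as np

def strong_classify(weak_classifiers, alphas, x):
    # weak_classifiers: callables returning -1 or +1 for a sample x
    # alphas: floats, typically alpha_t = 0.5 * ln((1 - err_t) / err_t)
    score = sum(a * h(x) for h, a in zip(weak_classifiers, alphas))
    return np.sign(score)
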

CannonnaC
- 3
- 4
0
votes
0 answers
Loading AdaBoostClassifier
import joblib  # older scikit-learn versions shipped this as sklearn.externals.joblib
from sklearn.ensemble import AdaBoostClassifier

classifier = AdaBoostClassifier(n_estimators=100, learning_rate=1.0, algorithm='SAMME.R')
try:
    classifier = joblib.load("final_model_Ada1.pkl")  # reuse a previously saved model
    print("using trained model")
except (IOError, OSError):  # no saved model on disk yet: train a new one
    print("building new model")
    classifier.fit(X_train,…

akshita007
- 549
- 1
- 9
- 15
0
votes
0 answers
How does OpenCV traincascade collect negative samples?
e.g. -numPos 2000 -numNeg 1000 -numStages 10 -w 20 -h 20 -minHitRate 0.995 -maxFalseAlarmRate 0.2
I have some questions about collecting negative samples.
1. According to the answer of the article (opencv_traincascade Negative samples training…

Casey Wang
- 45
- 1
- 8
0
votes
2 answers
Why are negative images used in training?
While training a classifier, why do we use negative or background images? How are they used in training an object classifier?
And can anyone explain the general procedure for how training is done, using any programming language such as MATLAB?

Stud
- 1
- 1
0
votes
1 answer
How to implement weights in AdaBoost?
AdaBoost needs to update the weight of each data point, but most machine learning algorithms don't take sample weights into account. So is there a common way to implement sample weights for algorithms like SVMs or neural networks?
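A hedged sketch of the two usual options, assuming scikit-learn and NumPy arrays X, y plus a weight vector w (none of which come from the question): pass the weights straight to fit() when the learner supports them, otherwise resample the training set in proportion to the weights. For neural networks the same effect is usually obtained by scaling each sample's loss term by its weight.

import numpy as np
from sklearn.svm import SVC

def fit_weighted_svm(X, y, w):
    # scikit-learn's SVC accepts per-sample weights directly
    return SVC(kernel="linear").fit(X, y, sample_weight=w)

def fit_by_resampling(learner, X, y, w, rng=np.random.default_rng(0)):
    # for learners without a sample_weight argument: draw a bootstrap sample
    # in which each point is picked with probability proportional to its weight
    idx = rng.choice(len(y), size=len(y), p=w / w.sum())
    return learner.fit(X[idx], y[idx])
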

modkzs
- 1,369
- 4
- 13
- 17
0
votes
1 answer
Accessing and modifying OpenCV Decision Tree Nodes when using Adaboost
I am learning a boosted tree from 30000 randomly generated features. The learning is limited to, say, only the best 100 features. After learning, how do I extract from the CvBoost object the indexes of the features used by the decision trees?
My…
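This is not the OpenCV CvBoost API the question asks about, but as an illustration of the general idea, here is how the used feature indexes can be read out of scikit-learn's AdaBoost ensemble, where each fitted tree records the feature index tested at every split:

from sklearn.ensemble import AdaBoostClassifier

def used_feature_indexes(ada: AdaBoostClassifier):
    used = set()
    for tree in ada.estimators_:           # one fitted decision tree per boosting round
        features = tree.tree_.feature      # per-node feature index; negative at leaf nodes
        used.update(int(f) for f in features if f >= 0)
    return sorted(used)
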

rocklegend
- 81
- 11
0
votes
0 answers
OpenCV error - NCV Assertion failed. Need solutions to correct this
OPenCV Error : GPU Api call NCV Assertion failed : NCV Stat=28, file=...\cascadeclassifier.cpp line=168> in unknown function, file ....\cascadeclassifier.cpp line 202
I am getting this error when I run my detection program.
0
votes
1 answer
Unable to install R library in Azure ML
I have been trying to install a machine learning package that I can use in my R script.
I have placed the tarball of the installer inside a zip file and am doing
install.packages("src/packagename_2.0-3.tar.gz", repos = NULL, type="source")…

tubby
- 2,074
- 3
- 33
- 55
0
votes
0 answers
How to train an AdaBoost classifier?
I intend to differentiate between different objects by using the Haar-feature-based AdaBoost classifier. I know it only says whether it is the object or not, so I would have to train one for each kind of object, right?
However, I am not able to find a…

Kailegh
- 199
- 1
- 13
0
votes
0 answers
Ada in R giving me single classification
I am using the function ada in R, and I'm having a little difficulty. I have training data that looks like this
V13 V15 V17 V19
1 0.017241379 0.471264368 0.01449275 0.24637681
2 0.255813953 0.011627907 0.06849315…

user3799576
- 183
- 1
- 2
- 10
0
votes
1 answer
Adaboost Cascade TPR and FPR
When we use AdaBoost for object detection we need to set the TPR and FPR for each stage (iteration of AdaBoost).
We need a high TPR and a low FPR.
As I understand it, as a result we have:
total TPR = (stage1 TPR) × (stage2 TPR) × ... × (stageN TPR)
for example…
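A quick numeric illustration of why the per-stage rates multiply (the values are the common opencv_traincascade defaults, not taken from the question):

n_stages = 10
total_tpr = 0.995 ** n_stages   # ≈ 0.951: still detects about 95% of objects
total_fpr = 0.5 ** n_stages     # ≈ 0.001: per-stage false alarms compound away
print(total_tpr, total_fpr)
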

mrgloom
- 20,061
- 36
- 171
- 301
0
votes
1 answer
AdaBoost implementation with a decision stump
I have been trying to implement AdaBoost using a decision stump as the weak classifier, but I do not know how to give preference to the weighted misclassified instances.
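A hedged sketch of the usual mechanism, assuming NumPy arrays X, y (labels in {-1, +1}) and the current weight vector w: the stump is chosen to minimize the weighted error, so instances whose weights were increased in earlier rounds automatically get preference.

import numpy as np

def best_stump(X, y, w):
    """Exhaustive search for the (feature, threshold, polarity) with the lowest weighted error."""
    best = (None, None, 1, np.inf)                 # feature, threshold, polarity, error
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = np.where(polarity * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])         # the weighted error is what is minimized
                if err < best[3]:
                    best = (j, thr, polarity, err)
    return best
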

user3038011
- 1
- 1
0
votes
1 answer
Python : Get rules from AdaBoostClassifier
I am using an AdaBoostClassifier in Python (from sklearn.ensemble import AdaBoostClassifier), and I would like to know the weak rules that are chosen by AdaBoost.
This is my source code:
x = np.array(p_values_learn) #Array of 10.000 * 100.000…
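A hedged sketch of one way to read the chosen weak rules back out of a fitted AdaBoostClassifier, assuming the fitted model is called clf and the default tree base learner is used (export_text needs scikit-learn 0.21 or newer):

from sklearn.tree import export_text

for alpha, stump in zip(clf.estimator_weights_, clf.estimators_):
    if alpha > 0:                   # rounds that received zero weight carry no rule
        print("alpha =", round(alpha, 3))
        print(export_text(stump))   # prints the threshold rule(s) of this weak learner
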

Pythonas
- 1
- 1
0
votes
1 answer
AdaBoosting with several different base estimators at once
I know you can AdaBoost with multiple instances of a single model (e.g., 600 Decision Trees, Bayesian Ridges, or Linear Models). Is it possible to AdaBoost with a gauntlet of models at the same time, and how?
AdaBoost([DecisionTree, BayesianRidge,…
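scikit-learn's AdaBoostClassifier takes a single base estimator, so a call like the one sketched above is not supported directly. A hedged workaround is to boost each model type separately and combine the boosted ensembles, for example with a soft VotingClassifier (the estimator names and the LogisticRegression stand-in are illustrative only):

from sklearn.ensemble import AdaBoostClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

combo = VotingClassifier(
    estimators=[
        # `estimator` is named base_estimator in older scikit-learn releases
        ("ada_tree", AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=1))),
        ("ada_logit", AdaBoostClassifier(estimator=LogisticRegression(max_iter=1000))),
    ],
    voting="soft",
)
# combo.fit(X_train, y_train)
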

Clayton
- 3
- 3