Questions tagged [adaboost]

AdaBoost (Adaptive Boosting) is a meta machine-learning algorithm. It performs several rounds of training, selecting the best weak classifier in each round. At the end of each round, the still-misclassified training samples are given a higher weight, so the next round's search for a weak classifier focuses more on those samples.
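A minimal scikit-learn sketch of the round-based training described above (synthetic data; the parameter values are illustrative, not prescriptive):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier

    X, y = make_classification(n_samples=300, random_state=0)
    clf = AdaBoostClassifier(n_estimators=50).fit(X, y)   # 50 boosting rounds
    print(clf.score(X, y))  # weighted vote of all 50 weak learners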

255 questions
4 votes • 1 answer

Is this AdaBoost behavior correct?

I'm implementing AdaBoost as described in the Viola-Jones paper for my own edification. While unit-testing the algorithm I have found some strange behavior. It is possible this is just the algorithm acting strangely on canned data, or…
Pace • 41,875
4 votes • 1 answer

Returning the models used in AdaBoost in Python

After applying AdaBoost to an SVM, I want to know the models (and their parameters) used in the AdaBoost algorithm. ada = AdaBoostClassifier(n_estimators=10, base_estimator=SVC(probability=True)); ada.fit(x_train, y_train). How can I find the models used in the…
user3171906 • 543
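A hedged sketch of one way to inspect the fitted base models in scikit-learn: the fitted learners are exposed in the estimators_ attribute and their voting weights in estimator_weights_ (x_train/y_train stand in for the asker's data):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.svm import SVC

    x_train, y_train = make_classification(n_samples=200, random_state=0)
    # base_estimator= in older scikit-learn; renamed estimator= from version 1.2
    ada = AdaBoostClassifier(n_estimators=10, base_estimator=SVC(probability=True))
    ada.fit(x_train, y_train)

    # Each fitted SVC and its weight in the final vote:
    for model, weight in zip(ada.estimators_, ada.estimator_weights_):
        print(weight, model.get_params())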
4 votes • 0 answers

traincascade based on HOG features

As we know, the OpenCV traincascade can handle all three feature types: HAAR, HOG, and LBP. I have already studied how the HAAR and LBP features are adapted to AdaBoost, but I don't understand the HOG part: 1. How does the program handle the HOG features? 2.…
zdppoiu • 41
4 votes • 6 answers

adabag boosting function throws an error when mfinal > 10

I have a strange issue: whenever I try increasing the mfinal argument of the boosting function in the adabag package beyond 10, I get an error; even with mfinal=9 I get warnings. My training data has a 7-class dependent variable and 100 independent variables and…
Abdul Khader • 141
4 votes • 1 answer

About the AdaBoost algorithm

I'm working on traffic-flow prediction, where I predict whether a place has heavy or light traffic. I have classified traffic on a scale of 1-5, 1 being the lightest traffic and 5 being the heaviest. I came across this website…
aceraven777 • 4,358
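A hedged sketch under assumed, synthetic features: scikit-learn's AdaBoostClassifier accepts multi-class targets directly, so the five traffic levels can be used as ordinary class labels:

    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 6))       # hypothetical features (time, weather, ...)
    y = rng.integers(1, 6, size=500)    # traffic level: 1 (lightest) .. 5 (heaviest)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = AdaBoostClassifier(n_estimators=100).fit(X_tr, y_tr)
    print(clf.predict(X_te[:3]))        # predicted levels for unseen samples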
4 votes • 2 answers

Selecting the best features in a feature vector using AdaBoost

I've read some documentation on how AdaBoost works but have some questions about it. I've also read that, apart from weighting weak classifiers, AdaBoost also picks the best features from the data and uses them in the testing phase to perform classification…
garak • 4,713
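On the feature-selection reading of AdaBoost, a hedged scikit-learn sketch: the boosted stumps split on a subset of the features, and feature_importances_ summarizes how strongly each feature was used, which is one way to pick the "best" features (synthetic data assumed):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier

    X, y = make_classification(n_samples=300, n_features=10, n_informative=3,
                               random_state=0)
    clf = AdaBoostClassifier(n_estimators=50).fit(X, y)

    # Rank features by how strongly the boosted stumps rely on them.
    ranking = np.argsort(clf.feature_importances_)[::-1]
    print(ranking[:3])   # indices of the three most-used features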
4 votes • 2 answers

AdaBoost Input and Output?

I am a non-technical person trying to implement image classification. In this paper, I came across the AdaBoost algorithm, which was applied after the 'bag of features' step for video keyframes. Can someone explain in layman's terms what…
Pen Watson • 43
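In layman's terms, the input is one feature vector per item (here, a made-up 'bag of features' histogram per keyframe) plus a label, and the output is a predicted label for each new vector; a toy sketch with invented numbers:

    from sklearn.ensemble import AdaBoostClassifier

    X_train = [[0.1, 0.9, 0.0],   # one histogram per keyframe (invented values)
               [0.8, 0.1, 0.1],
               [0.2, 0.7, 0.1],
               [0.9, 0.0, 0.1]]
    y_train = ["indoor", "outdoor", "indoor", "outdoor"]

    clf = AdaBoostClassifier(n_estimators=10).fit(X_train, y_train)
    print(clf.predict([[0.15, 0.8, 0.05]]))   # output: one predicted label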
3 votes • 1 answer

Ensemble classifiers (Random Forest, Bagging, Boosting, etc.) in SSAS

I am using SSAS (SQL Server 2008 R2) to develop a classification model for a data set where 80% of values are missing. Ensemble classifiers based on trees are supposedly the best solution (Random Forest for example). Is there any nice way of adding…
3 votes • 2 answers

How to calculate SHAP values for an AdaBoost model?

I am running 3 different models (Random Forest, Gradient Boosting, AdaBoost) and a model ensemble based on these 3 models. I managed to use SHAP for GB and RF but not for AdaBoost, which fails with the following error: Exception …
Shadelex • 41
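A hedged workaround sketch: shap's TreeExplainer does not cover AdaBoost, but the model-agnostic KernelExplainer only needs a prediction function, at the cost of speed (synthetic data; variable names are illustrative):

    import shap
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier

    X_train, y_train = make_classification(n_samples=200, random_state=0)
    ada = AdaBoostClassifier(n_estimators=50).fit(X_train, y_train)

    background = shap.sample(X_train, 50)              # small background set
    explainer = shap.KernelExplainer(ada.predict_proba, background)
    shap_values = explainer.shap_values(X_train[:10])  # slow but model-agnostic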
3 votes • 2 answers

Execution time of AdaBoost with SVM base classifier

I just made an AdaBoost classifier with these parameters: 1. n_estimators = 50, 2. base_estimator = SVC (support vector classifier), 3. learning_rate = 1. Here is my code: from sklearn.ensemble import AdaBoostClassifier; from sklearn.svm import SVC; svc =…
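A hedged reconstruction of the truncated setup (variable names follow the excerpt; probability=True and the kernel choice are assumptions), plus the main cost consideration:

    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.svm import SVC

    svc = SVC(probability=True, kernel='linear')   # kernel assumed
    # base_estimator= in older scikit-learn; renamed estimator= from version 1.2
    clf = AdaBoostClassifier(base_estimator=svc, n_estimators=50, learning_rate=1)
    # Each boosting round refits the SVM on reweighted data, so training costs
    # roughly n_estimators times one SVM fit -- hence the long execution time.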
3 votes • 3 answers

How to interpret (unexpected) values of the sklearn tree_.value attribute?

The values of the value attribute of the decision-tree classifier stumps used with an AdaBoostClassifier do not match expectations, and I cannot determine what the values indicate. I would like to understand the values…
CK215 • 175
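A hedged probe of the attribute in question: the stumps inside a fitted AdaBoostClassifier are plain decision trees, and because boosting reweights the samples, tree_.value holds weighted class totals rather than raw counts (recent scikit-learn versions also normalize them per node), which may explain the unexpected numbers:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier

    X, y = make_classification(n_samples=100, random_state=0)
    ada = AdaBoostClassifier(n_estimators=5).fit(X, y)   # default base: stumps

    stump = ada.estimators_[1]     # any stump after the first round
    print(stump.tree_.value)       # weighted class sums per node, not raw counts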
3 votes • 1 answer

AdaBoost repeatedly chooses same weak learners

I have implemented a version of the AdaBoost boosting algorithm, using decision stumps as weak learners. However, I often find that after training, AdaBoost has produced a series of weak learners that keeps recurring…
H_Lev1 • 253
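A property useful for debugging this, shown in a small numpy sketch under the standard two-class formulation with labels in {-1, +1} (toy numbers): after the weight update, the learner just chosen has weighted error exactly 0.5, so it should not win the very next round; if it does, the update or the learner search is likely buggy:

    import numpy as np

    def weighted_error(w, y, preds):
        return np.sum(w[preds != y])

    y     = np.array([ 1, -1,  1,  1, -1])   # toy labels
    preds = np.array([ 1,  1,  1, -1, -1])   # toy stump predictions
    w     = np.full(len(y), 1 / len(y))      # uniform initial weights

    err   = weighted_error(w, y, preds)      # 0.4
    alpha = 0.5 * np.log((1 - err) / err)    # learner's vote weight
    w     = w * np.exp(-alpha * y * preds)   # up-weight the mistakes
    w    /= w.sum()
    print(weighted_error(w, y, preds))       # 0.5, by construction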
3 votes • 1 answer

Unable to use AdaBoost with R's caret package

I'm using R's caret package to implement the AdaBoost technique, but I'm getting an error while executing it. > str(my_data) 'data.frame': 3885 obs. of 10 variables: $ Date : Factor w/ 12 levels "0","1","2","3",..: 3 3 3 3 3 3 3 3 3 3 ... $…
y_sri • 31
3 votes • 2 answers

Why is AdaBoost with 1 estimator faster than a simple decision tree?

I wanted to compare AdaBoost and decision trees. As a proof of principle, I set the number of estimators in AdaBoost to 1, with a decision tree classifier at its default settings, expecting the same result as a simple decision tree. I indeed got the same…
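A hedged timing sketch to reproduce the comparison (synthetic data; base_estimator= was renamed estimator= in scikit-learn 1.2):

    from timeit import timeit
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
    tree = DecisionTreeClassifier(random_state=0)
    ada = AdaBoostClassifier(base_estimator=DecisionTreeClassifier(random_state=0),
                             n_estimators=1)

    print("tree:", timeit(lambda: tree.fit(X, y), number=5))
    print("ada :", timeit(lambda: ada.fit(X, y), number=5))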
3 votes • 1 answer

What is an example of using Adaboost (Adaptive Boosting) approach with Decision Trees

Is there any good tutorial that explains how to weight the samples during successive iterations of constructing the decision trees for a sample training set? I specifically want to know how the weights are assigned after the first decision tree is…
London guy • 27,522
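A hedged worked example of the step the question asks about, in plain numpy with labels in {-1, +1} (the numbers are invented): after the first tree, misclassified samples are up-weighted and correctly classified ones down-weighted, then the weights are renormalized:

    import numpy as np

    y     = np.array([ 1, -1,  1,  1, -1])   # true labels
    preds = np.array([ 1,  1,  1, -1, -1])   # first tree misses samples 2 and 4
    w     = np.full(len(y), 1 / len(y))      # start uniform: 0.2 each

    err   = np.sum(w[preds != y])            # weighted error = 0.4
    alpha = 0.5 * np.log((1 - err) / err)    # tree's vote weight, about 0.203
    w    *= np.exp(-alpha * y * preds)       # mistakes grow, hits shrink
    w    /= w.sum()                          # mistakes now 0.25, hits about 0.167
    print(alpha, w)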