Questions tagged [adaboost]

AdaBoost is a boosting meta-algorithm for machine learning. It performs several rounds of training, selecting the best weak classifier in each round. At the end of each round, still-misclassified training samples are given higher weights, so that the selection of a weak classifier in the next round focuses more on those samples.
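
The reweighting loop described above can be exercised with scikit-learn's AdaBoostClassifier; a minimal sketch, with an illustrative toy dataset and parameters:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

# Toy dataset; all sizes and seeds here are illustrative.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Each boosting round fits a weak learner (a depth-1 tree by default)
# on the reweighted training set, then up-weights misclassified samples.
clf = AdaBoostClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)

print(len(clf.estimators_))        # one weak learner per completed round
print(round(clf.score(X, y), 2))   # training accuracy of the ensemble
```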

255 questions
2
votes
1 answer

AdaBoostClassifier and the 'SAMME.R' Algorithm

It takes a while to get to the actual question, so please bear with me. The AdaBoost documentation states that it "is a meta-estimator that begins by fitting a classifier on the original dataset and then fits additional copies of the classifier on…
Jack Fleeting
  • 24,385
  • 6
  • 23
  • 45
2
votes
1 answer

Why does AdaBoost not work with DecisionTree?

I'm using sklearn 0.19.1 with DecisionTree and AdaBoost. I have a DecisionTree classifier that works fine: clf = tree.DecisionTreeClassifier() train_split_perc = 10000 test_split_perc = pdf.shape[0] - train_split_perc train_pdf_x =…
Radu Gheorghiu
  • 20,049
  • 16
  • 72
  • 107
2
votes
1 answer

How to use AdaBoost on multiple different types of fitted classifiers (like SVM, Decision Tree, Neural Network, etc.)?

I'm working on a classification problem and have multiple fitted sklearn classifiers, like svm = SVC().fit(X_train, y_train) dt = tree.DecisionTreeClassifier(criterion='entropy',max_depth=4000).fit(X_train, y_train) ... for i in…
Daniel P
  • 165
  • 7
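
AdaBoost trains fresh copies of a single base-estimator type each round, so it cannot combine already-fitted classifiers of different families; the usual scikit-learn tool for that is a VotingClassifier. A minimal sketch, with an illustrative dataset:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=0)

# VotingClassifier ensembles heterogeneous model families directly,
# rather than boosting repeated copies of one estimator type.
vote = VotingClassifier(estimators=[
    ("svm", SVC()),
    ("dt", DecisionTreeClassifier(max_depth=4, random_state=0)),
], voting="hard")
vote.fit(X, y)
print(round(vote.score(X, y), 2))
```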
2
votes
0 answers

loss parameter explanation for "sklearn.ensemble.GradientBoostingClassifier"

I was training gradient boosting models using sklearn's GradientBoostingClassifier [sklearn.ensemble.GradientBoostingClassifier] when I encountered the "loss" parameter. The official explanation given on sklearn's page is: loss : {‘deviance’,…
Arun
  • 2,222
  • 7
  • 43
  • 78
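
The loss parameter selects the objective minimized at each boosting stage; the accepted names vary by version ('deviance' for the logistic loss in older releases, 'log_loss' in newer ones), while 'exponential' makes gradient boosting recover AdaBoost for binary problems. A sketch on an illustrative dataset:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=300, random_state=0)

# 'exponential' loss makes gradient boosting behave like AdaBoost
# (binary classification only); the default logistic loss is named
# 'deviance' in older scikit-learn releases and 'log_loss' in newer ones.
clf = GradientBoostingClassifier(loss="exponential", n_estimators=50,
                                 random_state=0)
clf.fit(X, y)
print(round(clf.score(X, y), 2))
```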
2
votes
0 answers

Boosting algorithms with Keras

Is there any way I can use boosting algorithms like AdaBoost with Keras for ImageNet using a CNN? Scikit-learn has a number of such algorithms, but the inputs those models expect appear to be different. I have manually written code for…
Subham Mukherjee
  • 779
  • 1
  • 7
  • 13
2
votes
2 answers

Classification results depend on random_state?

I want to implement an AdaBoost model using scikit-learn (sklearn). My question is similar to another question, but it is not exactly the same. As far as I understand, the random_state variable described in the documentation is for randomly splitting…
kensaii
  • 314
  • 5
  • 16
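
Fixing random_state makes runs reproducible; results that change across seeds reflect the estimator's internal randomness (for example, random tie-breaking in the default decision-tree weak learner), not a train/test split. A minimal demonstration on an illustrative dataset:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

X, y = make_classification(n_samples=200, random_state=0)

# Two fits with the same random_state are identical; the seed controls
# the estimator's internal randomness, not any data splitting.
a = AdaBoostClassifier(n_estimators=30, random_state=42).fit(X, y)
b = AdaBoostClassifier(n_estimators=30, random_state=42).fit(X, y)
print((a.predict(X) == b.predict(X)).all())  # → True
```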
2
votes
2 answers

sklearn Boosting: cross-validation to find the optimal number of estimators without restarting every time

In the Python sklearn ensemble library, I want to train my data using some boosting method (say AdaBoost). As I would like to know the optimal number of estimators, I plan to do a CV with a different number of estimators each time. However, it seems doing…
Camuslu
  • 123
  • 1
  • 3
  • 13
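
Boosted ensembles in sklearn expose staged_score, which evaluates every prefix of the ensemble from a single fit, so there is no need to retrain once per candidate n_estimators. A sketch on an illustrative dataset (a proper search would score on cross-validation folds rather than one held-out split):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Fit once with the maximum budget, then score every prefix of the
# ensemble: scores[k] is the accuracy using the first k+1 estimators.
clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
scores = list(clf.staged_score(X_te, y_te))
best_n = max(range(len(scores)), key=scores.__getitem__) + 1
print(best_n, round(scores[best_n - 1], 3))
```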
2
votes
0 answers

Adaptive Boosting - Visualize Tree

I'm using the Weka API in Eclipse for classification using Adaptive Boosting with J48 as the base classifier. I know that we can visualize the tree if we use just the J48 algorithm alone as the classifier, without boosting. So in this case of…
Jerry
  • 410
  • 6
  • 17
2
votes
1 answer

Adaboost in R: Predict for data that does not have dependent variable

I tried to use boosting in R from the adabag package. library(adabag) model = boosting(survived ~ ., data=train, boos=TRUE, mfinal=20) # Now I tried to predict using the model for the test dataset like this: pred = predict(model,test[-1],type =…
2
votes
1 answer

Why does AdaBoostClassifier with SVM work worse?

By working worse, I mean even a higher training error. # Boosted SVC clf = AdaBoostClassifier(base_estimator=SVC(random_state=1), random_state=1, algorithm="SAMME", n_estimators=5) clf.fit(X, y) # Only SVC clf = SVC() clf.fit(X, y) My training…
beaver
  • 550
  • 1
  • 9
  • 23
2
votes
1 answer

Why is estimator_weight in the SAMME.R AdaBoost algorithm set to 1?

I am new to the AdaBoost algorithm. In sklearn, the SAMME algorithm's _boost_discrete() returns the classifier's weight as "estimator_weight" def _boost_discrete(self, iboost, X, y, sample_weight): ....... return sample_weight, estimator_weight,…
Jaeger
  • 159
  • 4
  • 14
2
votes
1 answer

How does sklearn's AdaBoost predict_proba work internally?

I'm using sklearn's predict_proba() to predict the probability of a sample belonging to a category for each estimator in the AdaBoost classifier. from sklearn.ensemble import AdaBoostClassifier clf = AdaBoostClassifier(n_estimators=50) for estimator…
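
One way to see what the ensemble's predict_proba aggregates is to query each fitted weak learner directly through the estimators_ attribute; a sketch on an illustrative toy dataset:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

X, y = make_classification(n_samples=200, random_state=0)
clf = AdaBoostClassifier(n_estimators=10, random_state=0).fit(X, y)

# clf.estimators_ exposes every fitted weak learner, so its individual
# class probabilities can be inspected before the ensemble combines them.
per_est = np.stack([est.predict_proba(X) for est in clf.estimators_])
print(per_est.shape)  # (n_estimators, n_samples, n_classes)
```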
2
votes
1 answer

How to see the prediction of each base estimator of an AdaBoost classifier in sklearn's ensemble module

I can see the prediction using the AdaBoostClassifier from sklearn's ensemble module with code like this. from sklearn.ensemble import AdaBoostClassifier clf = AdaBoostClassifier(n_estimators=100) clf.fit(X_train, y_train) y_pred=…
Jaeger
  • 159
  • 4
  • 14
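
The fitted weak learners are stored in boosting order on the estimators_ attribute, and each can predict on its own, independent of the ensemble's weighted vote; a minimal sketch on an illustrative dataset:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

X, y = make_classification(n_samples=100, random_state=0)
clf = AdaBoostClassifier(n_estimators=5, random_state=0).fit(X, y)

# Print each weak learner's own predictions for the first three samples;
# the ensemble's y_pred is a weighted combination of these votes.
for i, est in enumerate(clf.estimators_):
    print(i, est.predict(X[:3]))
```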
2
votes
1 answer

Why does this trivially learnable example break AdaBoost?

I'm testing out a boosted tree model that I built using Matlab's fitensemble method. X = rand(100, 10); Y = X(:, end)>.5; boosted_tree = fitensemble(X, Y, 'AdaBoostM1', 100,'Tree'); predicted_Y = predict(boosted_tree, X); I just wanted to run it…
Cecilia
  • 4,512
  • 3
  • 32
  • 75
2
votes
0 answers

OpenCV Adaboost: "The function/feature is not implemented"

I have feature vectors of some objects from two classes and my goal is to train a boosted classifier with this information. After looking at the documentation and the letter recognition example I thought it would be straight forward to do so, but…
thomas
  • 624
  • 1
  • 12
  • 27