
I want to use sklearn.ensemble's AdaBoostClassifier for a simple binary classification task. How can I use multiple, pre-fit perceptrons as the weak classifiers in an AdaBoostClassifier?

i.e.

from sklearn.ensemble import AdaBoostClassifier
from sklearn import linear_model

# Xa, ya, Xb, yb: training data (defined elsewhere)

#train perceptrons
perceptron_A = linear_model.Perceptron(n_iter=200)
perceptron_A.fit(Xa, ya)

perceptron_B = linear_model.Perceptron(n_iter=200)
perceptron_B.fit(Xb, yb)

# Then, can I initialize an AdaBoostClassifier with the existing perceptrons?

ada_real = AdaBoostClassifier(
    base_estimator='Perceptron', # [perceptron_A, perceptron_B]
    learning_rate=1.0,
    n_estimators=2,
    algorithm="SAMME.R")

Or, do I need to build the AdaBoost manually?

Rob Irwin
  • After diligent searching, I did not find an out-of-the-box solution. But I did manage to provide trained perceptrons to a custom AdaBoost implementation after reading this [AdaBoost Demo](http://www.hdm-stuttgart.de/~maucher/Python/ComputerVision/html/Adaboost.html). – Rob Irwin Dec 30 '14 at 19:24
  • The perceptron implementation in [Machine Learning in Python (mlpy)](http://mlpy.sourceforge.net) performed well; the trained perceptrons were added to a Python dictionary. Then, in my AdaBoost implementation, I extract a perceptron like: `perceptron = pydict['perceptron']` (a minimal sketch of this manual route is below). – Rob Irwin Dec 30 '14 at 19:37
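
For reference, here is a minimal sketch of the manual route described in the comments: boosting over a fixed pool of already-fitted classifiers, computing a vote weight (alpha) per classifier from its weighted error and reweighting samples between rounds. Note that `AdaBoostClassifier` clones and refits its `base_estimator` on each boosting round, so pre-fit estimators are not reused out of the box, which is why a manual loop is needed. The function names (`manual_adaboost`, `manual_predict`) and the ±1 label encoding are assumptions for illustration, not part of sklearn or the linked demo.

```python
import numpy as np

def manual_adaboost(estimators, X, y):
    """Discrete-AdaBoost-style weights for already-fitted binary classifiers.

    `y` is assumed to be encoded as {-1, +1}. Each estimator is used as-is
    (never refit); only the sample weights and the per-estimator vote
    weights (alphas) are updated.
    """
    n = X.shape[0]
    w = np.full(n, 1.0 / n)                       # start with uniform sample weights
    alphas = []
    for est in estimators:
        pred = est.predict(X)
        err = np.sum(w[pred != y]) / np.sum(w)    # weighted classification error
        err = np.clip(err, 1e-10, 1 - 1e-10)      # avoid division by zero / log(0)
        alpha = 0.5 * np.log((1 - err) / err)     # vote weight for this estimator
        alphas.append(alpha)
        w *= np.exp(-alpha * y * pred)            # upweight misclassified samples
        w /= w.sum()
    return alphas

def manual_predict(estimators, alphas, X):
    """Weighted-majority vote of the pre-fit classifiers."""
    agg = sum(a * est.predict(X) for est, a in zip(estimators, alphas))
    return np.sign(agg)

# Usage with the perceptrons from the question (assumes labels are -1/+1):
# alphas = manual_adaboost([perceptron_A, perceptron_B], X_train, y_train)
# y_pred = manual_predict([perceptron_A, perceptron_B], alphas, X_test)
```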

0 Answers