
I am running three different models (Random Forest, Gradient Boosting, AdaBoost) and a model ensemble based on these three models.

I managed to use SHAP for GB and RF, but not for AdaBoost, which fails with the following error:

Exception                                 Traceback (most recent call last)
in engine
----> 1 explainer = shap.TreeExplainer(model,data = explain_data.head(1000), model_output= 'probability')

/home/cdsw/.local/lib/python3.6/site-packages/shap/explainers/tree.py in __init__(self, model, data, model_output, feature_perturbation, **deprecated_options)
    110         self.feature_perturbation = feature_perturbation
    111         self.expected_value = None
--> 112         self.model = TreeEnsemble(model, self.data, self.data_missing)
    113 
    114         if feature_perturbation not in feature_perturbation_codes:

/home/cdsw/.local/lib/python3.6/site-packages/shap/explainers/tree.py in __init__(self, model, data, data_missing)
    752             self.tree_output = "probability"
    753         else:
--> 754             raise Exception("Model type not yet supported by TreeExplainer: " + str(type(model)))
    755 
    756         # build a dense numpy version of all the tree objects

Exception: Model type not yet supported by TreeExplainer: <class 'sklearn.ensemble._weight_boosting.AdaBoostClassifier'>

I found this GitHub issue, which states:

TreeExplainer creates a TreeEnsemble object from whatever model type we are trying to explain, and then works with that downstream. So all you would need to do is add another if statement in the TreeEnsemble constructor similar to the one for gradient boosting

But I really don't know how to implement it, since I'm quite new to this.

– Shadelex

2 Answers


I had the same problem, and what I did was modify the file from the GitHub issue you mention.

In my case I use Windows, so the file is in C:\Users\my_user\AppData\Local\Continuum\anaconda3\Lib\site-packages\shap\explainers, but you can also double-click on the error message and the file will open.
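If you're not sure where your shap installation lives, Python can tell you. A small sketch (the explainers/tree.py filename comes from the traceback above; the rest is standard library):

```python
import importlib.util
import os

# Locate the installed shap package without importing it,
# then build the path to the file that needs editing.
spec = importlib.util.find_spec("shap")
if spec is not None:
    pkg_dir = os.path.dirname(spec.origin)  # .../site-packages/shap
    print(os.path.join(pkg_dir, "explainers", "tree.py"))
else:
    print("shap is not installed in this environment")
```

This works the same on Windows, Linux, and conda environments, so you don't have to guess the site-packages path.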

The next step is to add another elif, as the answer in the GitHub issue suggests. In my case I added it starting at line 404, as follows:

1) Modify the source code.

... 
    self.objective = objective_name_map.get(model.criterion, None)
    self.tree_output = "probability"
elif str(type(model)).endswith("weight_boosting.AdaBoostClassifier'>"): # From this line I have modified the code; the shorter suffix also matches the newer sklearn.ensemble._weight_boosting module path seen in the traceback
    scaling = 1.0 / len(model.estimators_) # output is average of trees
    self.trees = [Tree(e.tree_, normalize=True, scaling=scaling) for e in model.estimators_]
    self.objective = objective_name_map.get(model.base_estimator_.criterion, None) #This line is done to get the decision criteria, for example gini.
    self.tree_output = "probability" #This is the last line I added
elif str(type(model)).endswith("sklearn.ensemble.forest.ExtraTreesClassifier'>"): # TODO: add unit test for this case
    scaling = 1.0 / len(model.estimators_) # output is average of trees
    self.trees = [Tree(e.tree_, normalize=True, scaling=scaling) for e in model.estimators_]
...

Note that for the other models, the shap code needs the attribute 'criterion', which the AdaBoost classifier doesn't expose directly. So in this case the attribute is obtained from the "weak" classifiers with which AdaBoost has been trained; that's why I added model.base_estimator_.criterion.
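You can verify this yourself with a quick sketch (note that recent scikit-learn versions rename base_estimator_ to estimator_, so the fallback below reads the first weak learner instead):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier

X, y = load_iris(return_X_y=True)
model = AdaBoostClassifier(n_estimators=5).fit(X, y)

# AdaBoost itself exposes no `criterion` attribute...
print(hasattr(model, "criterion"))  # False

# ...but its weak learners (decision trees) do
base = getattr(model, "base_estimator_", None)
if base is None:  # newer scikit-learn removed base_estimator_
    base = model.estimators_[0]
print(base.criterion)  # 'gini' by default
```

This is exactly the value the patched elif branch feeds into objective_name_map.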

Finally, you have to import the library again, train your model, and get the SHAP values. I leave an example:

2) Import again the library and try:

from sklearn import datasets
from sklearn.ensemble import AdaBoostClassifier
import shap

# import some data to play with
iris = datasets.load_iris()
X = iris.data
y = iris.target

ADABoost_model = AdaBoostClassifier()
ADABoost_model.fit(X, y)

shap_values = shap.TreeExplainer(ADABoost_model).shap_values(X)
shap.summary_plot(shap_values, X, plot_type="bar")

Which generates the following:

3) Get your new results:

(image: SHAP summary bar plot of mean |SHAP value| per iris feature)

– Henry Navarro

It seems that the shap package has been updated and still does not support the AdaBoostClassifier. I've modified the previous answer's code to work with the shap/explainers/tree.py file at lines 598-610:

### Added AdaBoostClassifier based on the outdated StackOverflow response and Github issue here
### https://stackoverflow.com/questions/60433389/how-to-calculate-shap-values-for-adaboost-model/61108156#61108156
### https://github.com/slundberg/shap/issues/335
elif safe_isinstance(model, ["sklearn.ensemble.AdaBoostClassifier", "sklearn.ensemble._weight_boosting.AdaBoostClassifier"]):
    assert hasattr(model, "estimators_"), "Model has no `estimators_`! Have you called `model.fit`?"
    self.internal_dtype = model.estimators_[0].tree_.value.dtype.type
    self.input_dtype = np.float32
    scaling = 1.0 / len(model.estimators_) # output is average of trees
    self.trees = [Tree(e.tree_, normalize=True, scaling=scaling) for e in model.estimators_]
    self.objective = objective_name_map.get(model.base_estimator_.criterion, None) #This line is done to get the decision criteria, for example gini.
    self.tree_output = "probability" #This is the last line added

I'm also working on tests so this can be added to the package :)

– m13op22
  • I tried this last answer but it did not work for me. I added that code in "_tree.py" and the previous answer's code in "pytree.py", but nothing works; can you help me @m13op22? Thank you. P.S. I also have the same problem with bagging. – Pablo Moreira Garcia Jul 11 '22 at 00:30