I have my simplified model that looks like this:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(12, input_shape=(1000, 12)))
model.add(Dense(9, activation='sigmoid'))

My training data has the shape:

(900,1000,12)

As you can see from the output layer, I have 9 outputs, so every signal (of length 1000) is classified into one or more of these outputs (it is a multilabel classification problem).
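
I compile the model roughly like this (the exact call isn't essential to the question; binary cross-entropy just matches the multilabel setup, where each output is an independent label):

# Multilabel: each of the 9 sigmoid outputs is an independent yes/no label,
# so binary cross-entropy is used rather than categorical cross-entropy.
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])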

I train my model like this:

history = model.fit(X_train, y_train, batch_size=32, epochs=10, validation_data=(X_val, y_val), verbose=2)

So far everything is OK, but now I want to use LIME to explain the classification:

from lime import lime_tabular

explainer = lime_tabular.RecurrentTabularExplainer(
    X_train, training_labels=y_train,
    feature_names=['1','2','3','4','5','6','7','8','9','10','11','12'],
    discretize_continuous=True,
    class_names=['a','b','c','d','e','f','g','h','i'],
    discretizer='decile')

I am not getting any error when I define my explainer, but when I try to run the code below, it runs for a long time before giving me an error:

exp = explainer.explain_instance(data_row=X[0].reshape(1,1000,12), classifier_fn=model)
exp.show_in_notebook()
NotImplementedError: LIME does not currently support classifier models without probability scores. 
If this conflicts with your use case, please let us know: https://github.com/datascienceinc/lime/issues/16

Can anyone recognize this error or see what's wrong?


1 Answer

You should pass a prediction probability function to classifier_fn in explainer.explain_instance: a callable that takes a numpy array and returns prediction probabilities. In your case that is model.predict_proba (model.predict also works, since it already outputs probabilities).
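
For the setup in your question, the only thing that has to change is the classifier_fn argument; a minimal sketch, reusing your model, explainer and X_train from above:

# LIME needs a callable that maps a batch of samples to class probabilities.
# model.predict returns the (n_samples, 9) sigmoid outputs, so it qualifies;
# passing the model object itself is what raises the NotImplementedError above.
exp = explainer.explain_instance(
    data_row=X_train[0].reshape(1, 1000, 12),
    classifier_fn=model.predict)  # model.predict, not model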

Note also that your prediction probabilities do not sum to 1, because you apply a sigmoid activation in the final layer. Consider switching to softmax to produce probabilities that sum to 1.
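
To see the difference, here is a quick check with made-up logits (the numbers are arbitrary, just for illustration):

import numpy as np

logits = np.array([0.2, -1.3, 0.7, 0.0, 2.1, -0.5, 1.0, -2.0, 0.3])

sigmoid = 1 / (1 + np.exp(-logits))              # independent per-label probabilities
softmax = np.exp(logits) / np.exp(logits).sum()  # mutually exclusive class probabilities

print(sigmoid.sum())  # generally not equal to 1
print(softmax.sum())  # always 1.0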

Here is the full example:

Fit a dummy model

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# dummy data: 50 samples, 10 timesteps, 12 features, 9 classes (random multi-hot labels)
X = np.random.uniform(0, 1, (50, 10, 12))
y = np.random.randint(0, 2, (50, 9))

model = Sequential()
model.add(LSTM(12, input_shape=(10, 12)))
model.add(Dense(9, activation='softmax'))
model.compile('adam', 'categorical_crossentropy')
history = model.fit(X, y, epochs=3)

Initialize explainer

from lime import lime_tabular

explainer = lime_tabular.RecurrentTabularExplainer(
    X, training_labels = y,
    feature_names = ['1','2','3','4','5','6','7','8','9','10','11','12'],
    discretize_continuous = True,
    class_names = ['a','b','c','d','e','f','g','h','i'],
    discretizer = 'decile')

Explain instances:

exp = explainer.explain_instance(
    data_row = X[0].reshape(1,10,12),
    classifier_fn = model.predict)

exp.show_in_notebook()
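
If you want the raw feature weights for a particular class rather than the notebook rendering, the explanation object also exposes as_list; a short sketch (the labels tuple and the label index refer to positions in class_names, here class 'e'):

# request an explanation for class index 4 ('e') and print its top features
exp = explainer.explain_instance(
    data_row=X[0].reshape(1, 10, 12),
    classifier_fn=model.predict,
    labels=(4,))

print(exp.as_list(label=4))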