Questions tagged [lime]

LIME (local interpretable model-agnostic explanations) is an explainability method used to inspect machine learning models. Include a language tag (python, r, etc.) for the LIME implementation in use.

LIME (local interpretable model-agnostic explanations) is an explainability method used to inspect machine learning models and debug their predictions. It was originally proposed in “Why Should I Trust You?”: Explaining the Predictions of Any Classifier (Ribeiro et al., NAACL 2016) as a way to explain how models make predictions in natural language processing tasks. Since then, the approach has been implemented in several packages, and it inspired later techniques for "explainable machine learning," such as the SHAP approach.

Related Concepts

LIME Implementations

Implementations of this approach exist in several software packages.

Python

R

Further Reading

101 questions
0
votes
0 answers

KeyError: 'Label not in explanation' for LIME Image Explainer for Neural Network for binary classification

I am using the code below for a neural network & then trying to interpret the network using LIME import lime from lime import lime_image from skimage.segmentation import mark_boundaries model = Sequential() model.add(Conv2D(32, kernel_size=3,…
Mehul Gupta
  • 1,829
  • 3
  • 17
  • 33
0
votes
1 answer

Input array must have a shape == (..., 3)), got (299, 299, 4)

I am using a pretrained resnet50 model to validate some classes. I am using LIME to test how the model is testing this data as well. However, some of the images are not RGB and may be different formats, and I noticed that RGB arrays are value 3…
0
votes
2 answers

LIME text explainer for model with preprocessed input

I'm trying to explain a Keras LSTM model using LIME text explainer. I have news titles and a binary target variable (the sentiment). My model is the following: vocab_size = len(tokenizer.word_index) + 1 embedding_dim = 16 max_length =…
student
  • 68
  • 1
  • 11
0
votes
2 answers

Using LIME for explanation of deep neural net for fraud detection

I have built a deep neural network which classifies fraudulent transactions. I am trying to use LIME for explanation, but am facing an error from the interpretor.explain_instance() function. The complete code is as follows: import lime from lime…
Gourab
  • 81
  • 1
  • 1
  • 7
0
votes
1 answer

Spawn Sprites on Desktop [Haxe Flixel]

I am trying to spawn sprites on the desktop, the same way you would in a regular HaxeFlixel State. Kinda like the desktop goose from a while back. I have already tried to do this effect with borderless windows but it just isn’t rendering sprites…
0
votes
0 answers

R: LIME for variable importance with KERAS takes forever to run

I am running a KERAS neural network in R - which works! The network that I have estimated has 1,441 parameters, 22 input variables, and 11,580 observations in the training set and 19,659 in the test set. Now I am trying to investigate which variables are…
oxguru
  • 3
  • 2
0
votes
2 answers

LimeReport compilation: Error 127. What causes this?

I am trying to compile LimeReport in Windows 10 using Qt 5.9.9 and Qt Creator 4.11.0. I get the following compilation output and the compilation stops. /usr/bin/sh: I:\Programs\Qt\Qt5.9.9\5.9.9\mingw53_32\bin\lupdate.exe: command not…
0
votes
1 answer

(Friday Night Funkin) running lime test windows

Whenever I run the command lime test windows on Windows PowerShell to build my game for fnf, it says "You must have a "project.xml" file or specify another valid project file when using the 'test' command."
0
votes
0 answers

How to use lime_tabular.LimeTabularExplainer for xgboost classifier?

I wrote a basic code for my binary classification problem. I have problems understanding how lime works. Actually it has one-hot encoders and a scaler by using a pipeline but, I tried to simplify the code as I couldn't progress. But I don't…
Sevval Kahraman
  • 1,185
  • 3
  • 10
  • 37
0
votes
1 answer

How to implement LIME in a Bert model?

I am new to machine learning. I noticed that such questions have been asked before as well but did not receive a proper solution. Below is the code for semantic similarity and I want to implement LIME as a base. Please, help me out. from…
Mike
  • 1
  • 2
0
votes
0 answers

What does a task with "type": "lime" do in Visual Studio Code?

I came back to an old Visual Studio Code project recently and I'm trying to figure out how it works. I opened my tasks.json file and there's a task with "type": "lime". What does this do? I looked over the tasks.json schema and according to that,…
CharType
  • 390
  • 1
  • 4
  • 16
0
votes
0 answers

In SHAP force plot, is there a way to change the value of x-axis to custom name?

In the SHAP force plot, is there a way to change the value of the x-axis to a custom name? f = plt.figure(figsize=(8, 6)) shap.summary_plot(shap_values, X_test, plot_type="bar", feature_names=X_train.columns, class_names = ORGD_Test['Class'])
0
votes
0 answers

Lime Error: LIME does not currently support classifier models without probability scores

I'm a beginner in data science and this is my first project. So what I want to do is quite simple, just a 0-1 classification. X=…
李雅譞
  • 1
  • 1
0
votes
0 answers

Explaining my deep learning model with LIME text explainer in python on Twitter sentiment analysis

I have a dataset of Tweets labelled with sentiments. I have pre-processed the data and done parts of speech tagging (all via NLTK in python). After preprocessing the data looks like this: Pre-processed tweets After preprocessing training data is…
kounteyo
  • 11
  • 5
0
votes
1 answer

Handling category, float and int type features while using LIME for model interpretation

I am using Lime (Local Interpretable Model-agnostic Explanations) with mixed feature types in order to evaluate my model predictions for a classification task. Does anyone know how to specify binary features in lime.lime_tabular.LimeTabularExplainer()…
ML_Enthu
  • 314
  • 1
  • 3
  • 12