Questions tagged [lime]

LIME (local interpretable model-agnostic explanations) is an explainability method used to inspect machine learning models. Include a tag for Python, R, etc. depending on the LIME implementation.

LIME (local interpretable model-agnostic explanations) is an explainability method used to inspect machine learning models and debug their predictions. It was originally proposed in “Why Should I Trust You?”: Explaining the Predictions of Any Classifier (Ribeiro et al., NAACL 2016) as a way to explain how models make predictions in natural language processing tasks. Since then, the approach has been implemented in several packages, and it has inspired later "explainable machine learning" techniques such as SHAP.

LIME Implementations

Implementations of this approach exist in several software packages, including Python and R.
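
A minimal sketch of the typical workflow with the Python lime package, assuming a scikit-learn style classifier and a plain NumPy feature matrix (the data, model, and feature names below are made up for illustration):

    import numpy as np
    from lime import lime_tabular
    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical training data: 500 samples, 4 numeric features.
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(500, 4))
    y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)

    model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

    # The explainer learns feature statistics from X_train and perturbs them
    # around the instance being explained.
    explainer = lime_tabular.LimeTabularExplainer(
        X_train,
        mode="classification",
        feature_names=["f0", "f1", "f2", "f3"],
        class_names=["negative", "positive"],
    )

    # Explain one prediction; predict_proba supplies the class probabilities.
    exp = explainer.explain_instance(X_train[0], model.predict_proba, num_features=4)
    print(exp.as_list())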

101 questions
0
votes
0 answers

InvalidIndexError when using LIME Tabular Explainer for Regression Task

I am currently working on a regression task using the LIME package in Python. I am trying to use the LimeTabularExplainer to help interpret my model's predictions. Here is how my data is structured: X_train and X_test have dimensions of (752, 15)…
0
votes
0 answers

IndexError: index 15 is out of bounds for axis 1 with size 15

Hello, I am trying to incorporate LIME and GradCAM into my code that trains a deep learning model using Keras. A lot of this code is old and taken from a GitHub repository for some research I am doing. Due to this, many of the libraries are outdated.…
0
votes
0 answers

Applying your Fine Tuned BERT on LIME with error 'collections.OrderedDict'

I have a model trained and tested with XLMRobertaModelSequenceClassification. I used two labels (classes 1 and 0) for classification. Now I would like to use this model (called model_08 in the script) with LIME, but I get an error: TypeError:…
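
A common cause of that TypeError is passing LIME a prediction function that returns the raw Hugging Face output object rather than a NumPy array of class probabilities. Below is a hedged sketch of a wrapper for LimeTextExplainer; model_08 is the name from the question, while tokenizer and everything else are assumptions about the setup:

    import torch
    from lime.lime_text import LimeTextExplainer

    # `tokenizer` and `model_08` are assumed to be the fine-tuned Hugging Face
    # tokenizer and sequence-classification model from the question.
    def predict_proba(texts):
        # Tokenize the raw strings LIME generates and run the fine-tuned model.
        inputs = tokenizer(list(texts), return_tensors="pt",
                           padding=True, truncation=True)
        with torch.no_grad():
            logits = model_08(**inputs).logits
        # LIME expects a plain (n_samples, n_classes) NumPy array,
        # not an OrderedDict / model output object.
        return torch.softmax(logits, dim=1).cpu().numpy()

    explainer = LimeTextExplainer(class_names=["0", "1"])
    exp = explainer.explain_instance("an example sentence to explain",
                                     predict_proba, num_features=10)
    print(exp.as_list())
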
0
votes
0 answers

How to map different values for classification to binary with LIME?

I use a dataset of white wine and its "Quality" column for classification purposes. I want to map the values 4 and 5 to "bad" and 6 to 8 to "good". Right now it only maps 4 to "bad" and 5 to "good" and the rest to "Other" (see…
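
One way to get a clean two-class target before handing anything to LIME is to bin the quality scores directly in pandas; the frame below is a hypothetical stand-in for the white-wine data:

    import pandas as pd

    # Hypothetical stand-in for the white-wine data's integer "Quality" column.
    df = pd.DataFrame({"Quality": [4, 5, 6, 7, 8, 5, 6]})

    # Bin 4-5 as "bad" and 6-8 as "good"; values outside the bins become NaN.
    df["Quality"] = pd.cut(df["Quality"], bins=[3, 5, 8], labels=["bad", "good"])
    print(df["Quality"].value_counts())
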
0
votes
0 answers

LIME does not show the features' scores for positive class explanation

I'm trying to explain the prediction of a binary text classification model (LSTM & GRU using BERT embeddings) with LIME. It highlights the features for both classes, but it shows the score for each feature only for class 0, even though I tried different…
Rarai
  • 1
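
explain_instance only computes weights for the labels it is asked for (the default is labels=(1,)), so requesting both classes explicitly and reading each one back is usually the fix. A small self-contained sketch with a placeholder classifier:

    import numpy as np
    from lime.lime_text import LimeTextExplainer

    def predict_proba(texts):
        # Placeholder: substitute the real LSTM/GRU probability function here.
        return np.tile([0.3, 0.7], (len(texts), 1))

    explainer = LimeTextExplainer(class_names=["class 0", "class 1"])

    # Ask for explanations of both classes, not only the default label 1.
    exp = explainer.explain_instance("an example review text", predict_proba,
                                     labels=(0, 1), num_features=10)

    print(exp.as_list(label=0))  # feature weights for class 0
    print(exp.as_list(label=1))  # feature weights for class 1
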
0
votes
0 answers

Why does Lime need training data to compute local explanations

I am using LIME to compute local explanations; however, I do not understand why I have to pass the training data X_train in the line of code below: explainer = lime_tabular.LimeTabularExplainer(X_train, mode="regression", feature_names=…
learnToCode
  • 341
  • 4
  • 14
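
The training data is not used to refit anything: LimeTabularExplainer derives per-feature statistics from it (means, standard deviations, and, by default, quartile bins) and samples the perturbed neighbours of the explained instance from those statistics. A minimal regression-mode sketch with made-up data:

    import numpy as np
    from lime import lime_tabular
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(200, 3))
    y_train = X_train @ np.array([1.0, -2.0, 0.5])

    model = LinearRegression().fit(X_train, y_train)

    # X_train only supplies the statistics that drive perturbation sampling.
    explainer = lime_tabular.LimeTabularExplainer(
        X_train, mode="regression", feature_names=["a", "b", "c"]
    )
    exp = explainer.explain_instance(X_train[0], model.predict, num_features=3)
    print(exp.as_list())
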
0
votes
1 answer

How can LIME explanations for tabular data be created for a Sequential Keras model?

How can the LIME explainability method (specifically LimeTabularExplainer) be used to explain a neural network (a Sequential Keras model)? I'm working with the Adult dataset (binary classification of tabular data). I encode it with One-Hot-Encoding using…
Kainsha
  • 1
  • 1
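
A frequent sticking point here is that a Sequential model ending in a single sigmoid unit returns probabilities of shape (n, 1), while LIME's classification mode wants one column per class; a small wrapper fixes that. A hedged sketch with a toy stand-in for the one-hot-encoded Adult data:

    import numpy as np
    from lime import lime_tabular
    from tensorflow import keras

    # Toy stand-in for the one-hot-encoded Adult features.
    rng = np.random.default_rng(0)
    X_enc = rng.integers(0, 2, size=(300, 20)).astype("float32")
    y = (X_enc[:, 0] > 0).astype("float32")

    model = keras.Sequential([
        keras.layers.Input(shape=(20,)),
        keras.layers.Dense(16, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")
    model.fit(X_enc, y, epochs=1, verbose=0)

    def predict_proba(x):
        # Keras returns P(class 1) with shape (n, 1); LIME expects (n, n_classes).
        p1 = model.predict(x, verbose=0)
        return np.hstack([1.0 - p1, p1])

    explainer = lime_tabular.LimeTabularExplainer(
        X_enc,
        mode="classification",
        feature_names=[f"feat_{i}" for i in range(20)],
        class_names=["<=50K", ">50K"],
    )
    exp = explainer.explain_instance(X_enc[0], predict_proba, num_features=10)
    print(exp.as_list())
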
0
votes
0 answers

How would I use the FlxTrailArea to move an FlxTrail's x and y to be relevant with an FNF character's position similar to the Ronald Mcdonald mod?

FlxTrailArea official explanation <- FlxTrailArea FlxTrail official explanation <- FlxTrail YouTube gameplay footage <- The video footage of the described trail movement (red afterimage) This is relevant to FNF (Friday Night Funkin') and…
The Drew
  • 1
  • 2
0
votes
0 answers

Emoji features used in the sentiment analysis are not showing on my LIME figures

I'm a student from a business school, trying to learn how to use advanced data analytics. My model uses a BiLSTM to conduct a sentiment classification task. The dataset consists of 80000 tweets. The features include words and emojis. I'm trying…
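
One plausible reason emojis never appear as features is LimeTextExplainer's default tokenizer, which splits on the regex \W+ and treats emoji characters as separators. Passing a whitespace-based split_expression is one possible workaround; the classifier below is a placeholder for the BiLSTM:

    import numpy as np
    from lime.lime_text import LimeTextExplainer

    def predict_proba(texts):
        # Placeholder: substitute the real BiLSTM probability function here.
        return np.tile([0.4, 0.6], (len(texts), 1))

    # Split on whitespace instead of the default r"\W+" so emoji tokens survive.
    explainer = LimeTextExplainer(
        class_names=["negative", "positive"],
        split_expression=r"\s+",
    )
    exp = explainer.explain_instance("love this 😍 but shipping was slow 😡",
                                     predict_proba, num_features=6)
    print(exp.as_list())
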
0
votes
0 answers

LIME Explainability with 4D training data on Python

I'm dealing with a deep learning model that requires a 4D array as input, and I would like to use LIME to get some explainability on my results. I'm using lime_tabular.RecurrentTabularExplainer, but it requires a 3D numpy array as training data, but my…
Silvia
  • 1
  • 1
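
One workaround to sketch here is to flatten each 4D sample into a 1-D vector for LimeTabularExplainer and reshape inside the prediction wrapper; the shapes and the model below are hypothetical placeholders. For genuinely image-shaped input, lime_image.LimeImageExplainer may be a better fit than any tabular explainer.

    import numpy as np
    from lime import lime_tabular

    # Hypothetical 4D data: (n_samples, height, width, channels).
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(100, 8, 8, 3))

    def model_predict(x4d):
        # Placeholder for the real deep model; returns two-class probabilities.
        score = x4d.mean(axis=(1, 2, 3))
        p1 = 1.0 / (1.0 + np.exp(-score))
        return np.column_stack([1.0 - p1, p1])

    X_flat = X_train.reshape(len(X_train), -1)   # (100, 192)

    def predict_flat(x_flat):
        # LIME perturbs flat vectors; reshape them back before calling the model.
        return model_predict(x_flat.reshape(-1, 8, 8, 3))

    explainer = lime_tabular.LimeTabularExplainer(X_flat, mode="classification")
    exp = explainer.explain_instance(X_flat[0], predict_flat, num_features=10)
    print(exp.as_list())
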
0
votes
0 answers

IndexError: index 1 is out of bounds for axis 1 with size 1 when using python lime

I am trying to validate my ANN built on tabular data with XAI methods to improve transparency. But when I am using the lime method I am getting this error, and I am not able to debug it. Kindly help me as I am new in this field. I want to make the…
0
votes
1 answer

Ignore transparent pixels on sprite

So I need the sprite to get bigger when I hover over the sprite with the mouse. But the problem is that the sprite has an unusual shape, which leaves a lot of transparent space. And the mouse reacts to this transparent space. override public…
Pyr0Guy
  • 21
  • 3
0
votes
0 answers

Implementing Lime with Sequential three-D Keras model

I want to implement a LIME explainer to describe a three-dimensional Keras Sequential model (n_samples, n_timesteps, n_features). I cannot understand whether I have to use LIME's "RecurrentTabularExplainer" or "LimeTabularExplainer". Also I cannot…
John89
  • 1
  • 1
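
For input shaped (n_samples, n_timesteps, n_features), RecurrentTabularExplainer is the one designed to take the 3-D array directly; LimeTabularExplainer expects 2-D data. A hedged sketch with toy shapes and a placeholder model:

    import numpy as np
    from lime import lime_tabular
    from tensorflow import keras

    # Hypothetical sequence data: (n_samples, n_timesteps, n_features).
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(200, 12, 5)).astype("float32")
    y_train = (X_train[:, :, 0].mean(axis=1) > 0).astype("int32")

    model = keras.Sequential([
        keras.layers.Input(shape=(12, 5)),
        keras.layers.LSTM(16),
        keras.layers.Dense(2, activation="softmax"),   # two columns = two classes
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    model.fit(X_train, y_train, epochs=1, verbose=0)

    # RecurrentTabularExplainer accepts the 3-D training array as-is.
    explainer = lime_tabular.RecurrentTabularExplainer(
        X_train,
        mode="classification",
        feature_names=[f"feat_{i}" for i in range(5)],
        class_names=["class 0", "class 1"],
    )
    exp = explainer.explain_instance(
        X_train[0],                            # shape (n_timesteps, n_features)
        lambda x: model.predict(x, verbose=0),
        num_features=8,
    )
    print(exp.as_list())
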
0
votes
2 answers

Error while importing lime_tabular package

I am trying to install lime_tabular but this message occurs and I cannot solve it: ERROR: Could not find a version that satisfies the requirement lime_tabular (from versions: none) ERROR: No matching distribution found for lime_tabular I have…
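
The pip error is expected: there is no lime_tabular distribution on PyPI, because lime_tabular is a module inside the lime package. The install target and the import therefore differ:

    # Install the package (the PyPI name is "lime", not "lime_tabular"):
    #   pip install lime

    # Then import the tabular module from that package:
    from lime import lime_tabular

    print(lime_tabular.LimeTabularExplainer)  # <class 'lime.lime_tabular.LimeTabularExplainer'>
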
0
votes
0 answers

Getting error while implementing LIME with Catboost

I am trying to learn how to implement the LIME library on categorical features. For that I am trying to replicate a notebook from GitHub on my data set. However, I am facing an error even though the same function worked in my reference notebook. Here is the…
Asmita
  • 13
  • 6
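
With categorical columns, LimeTabularExplainer generally expects them as integer codes, passed together with categorical_features (the column indices) and categorical_names (a dict mapping each index to its list of category labels). The data, encoding, and stand-in predict function below are hypothetical; the real CatBoostClassifier.predict_proba would slot in the same place:

    import numpy as np
    from lime import lime_tabular

    # Hypothetical encoding: column 0 numeric, column 1 a categorical code (0/1/2).
    rng = np.random.default_rng(0)
    X_train = np.column_stack([
        rng.normal(size=300),
        rng.integers(0, 3, size=300),
    ]).astype(float)

    def predict_proba(x):
        # Placeholder for a fitted CatBoostClassifier's predict_proba.
        p1 = 1.0 / (1.0 + np.exp(-(x[:, 0] + x[:, 1] - 1.0)))
        return np.column_stack([1.0 - p1, p1])

    explainer = lime_tabular.LimeTabularExplainer(
        X_train,
        mode="classification",
        feature_names=["amount", "channel"],
        categorical_features=[1],                          # indices of categorical columns
        categorical_names={1: ["web", "store", "phone"]},  # code -> label for column 1
    )
    exp = explainer.explain_instance(X_train[0], predict_proba, num_features=2)
    print(exp.as_list())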