Recursive Feature Elimination. This algorithm implements backward selection of predictors based on a predictor importance ranking. The predictors are ranked, and the least important ones are sequentially eliminated before the model is refit. The goal is to find a subset of predictors that produces an accurate model.
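The elimination loop described above can be sketched with scikit-learn's `RFE`, here using a synthetic dataset and a logistic-regression ranker (both are illustrative choices, not tied to any question below):

```python
# Minimal sketch of recursive feature elimination with scikit-learn,
# on a synthetic classification dataset.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

# 10 features, of which 4 are informative.
X, y = make_classification(n_samples=200, n_features=10,
                           n_informative=4, random_state=0)

# Rank features with a logistic regression; drop the weakest one at a
# time until 4 remain.
selector = RFE(LogisticRegression(max_iter=1000),
               n_features_to_select=4, step=1)
selector.fit(X, y)

print(selector.support_)   # boolean mask of the selected features
print(selector.ranking_)   # rank 1 = selected; higher = eliminated earlier
```

`RFECV`, used in several questions below, wraps the same loop in cross-validation to also choose *how many* features to keep.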
Questions tagged [rfe]
161 questions
0
votes
0 answers
IndexError: index 58 is out of bounds for size 58
I'm using RFECV to train on some data and find the number of features that gives the best accuracy, but I keep getting the error mentioned in the title. Below is the code.
import matplotlib.pyplot as plt
from sklearn.svm import SVC
from…

zhen zhou
0
votes
1 answer
Caret RFE to deal with dummy variables that are levels of the same categorical variable
I have a classification problem, and one of the predictors is a categorical variable X with four levels A, B, C, D that was transformed into three dummy variables A, B, C. I was trying to use recursive feature elimination (RFE) in the caret package to…

ybeybe
0
votes
1 answer
R caret's rfe using lrFuncs resulting in [Error in { : task 1 failed - "rfe is expecting 58 importance values but only has 48"]
I'm having a similar issue to this post when I try to run rfe using lrFuncs. I tried their suggestions, but they did not resolve my issue. Let's take the GermanCredit dataset in the caret package as an example. In this dataset all the factors (except…

Gaurav Bansal
0
votes
1 answer
How to use ROC metric with RFE in caret
How do I use the ROC metric with rfe? I've tried the code below, and I get a warning that 'Accuracy' is used instead of ROC.
rfFuncs$SummaryFunction <- twoClassSummary
ctrl_rfe <- rfeControl(method = "cv",
                       number = 5,
                       …

Fred R.
0
votes
1 answer
Why does the accuracy of classification drop with the increase of features used when using RFECV in scikit-learn?
Could anyone please explain why the accuracy of classification drops as the number of features increases in recursive feature elimination with cross-validation in scikit-learn? From the example in the scikit-learn documentation here:…

YuriTheFury
0
votes
1 answer
Plotting Recursive feature elimination (RFE) with cross-validation with a Decision Tree in scikit-learn
I would like to plot "Recursive feature elimination with cross-validation" using a decision tree and kNN in scikit-learn, as documented here.
I would like to implement this in the classifiers that I am already working with, to output both results…

owwoow14
-1
votes
1 answer
Why doesn't the total number of features from data.shape agree with those that are shown during recursive feature elimination?
Forgive me if I'm missing something obvious; I'm a relative newbie to both Python and ML (and a new poster here, the trifecta of ignorance). Anyway, data.shape is telling me that my dataset is (150, 177), however there are only 175 "False" or "True" …

NeuroJoe
-1
votes
1 answer
Calculating RFE for Recursive Feature Elimination
I have a dataframe called "dataset_con_enc":
dataset_con_enc.head()
OFFER_TYPE_PROXY OPPLINE_PRODUCT_BU OPPLINE_PRODUCT_FAMILY OPP_FLAG_LED_BY_PARTNER OPP_SOURCE target
0 0 0 8 0 19 2 1 11 137 1 ... 0 …

Nasri
-1
votes
1 answer
Recursive feature elimination and variable selection
How do I get the OA and Kappa values for each variable, like the table in the figure below?
The study used RFE from the caret package.

MSilvy
-2
votes
1 answer
What is the weka equivalent to caret's rfe?
I am working with Weka and have to perform attribute selection on my dataset. A former coworker did this once with rfe from R's caret package. What would be the equivalent function to rfe in Weka? I am no statistician, so maybe this question…

aldorado