A technique in cross-validation where the data is partitioned into k subsets (or "folds"). In each of k rounds, k-1 folds are used for training and the remaining fold for evaluation, so that every fold is held out for evaluation exactly once.
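A minimal scikit-learn sketch of the procedure described above, using a toy ten-sample array as the dataset:

import numpy as np
from sklearn.model_selection import KFold

data = np.arange(10)      # toy dataset of 10 samples
kf = KFold(n_splits=5)    # partition into k = 5 folds

# Each round trains on 4 folds (8 samples) and evaluates on the held-out fold (2 samples).
for fold, (train_idx, test_idx) in enumerate(kf.split(data)):
    print(f"fold {fold}: train={data[train_idx]}, test={data[test_idx]}")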
Questions tagged [k-fold]
284 questions
0 votes · 1 answer
Split dataset into 5 folds for cross-validation
I have a dataset that I want to split into 5 distinct folds, instead of the traditional 80-20 split.
So for example:
X = pd.DataFrame({'a': [1, 3, 5, 7, 4, 5, 6, 4, 7, 9],
                  'b': [3, 5, 6, 2, 4, 6, 7, 8, 7, 8],
                  'c':…
user12587364
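One way to get the 5 distinct folds asked for above is scikit-learn's KFold; a minimal sketch, using only the two columns of X that are visible in the truncated excerpt:

import pandas as pd
from sklearn.model_selection import KFold

# Stand-in for the question's X; the 'c' column is cut off above, so only 'a' and 'b' appear here.
X = pd.DataFrame({'a': [1, 3, 5, 7, 4, 5, 6, 4, 7, 9],
                  'b': [3, 5, 6, 2, 4, 6, 7, 8, 7, 8]})

kf = KFold(n_splits=5, shuffle=True, random_state=0)
folds = [X.iloc[test_idx] for _, test_idx in kf.split(X)]  # 5 disjoint folds of 2 rows each

for i, fold in enumerate(folds):
    print(f"fold {i}:")
    print(fold)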
0 votes · 1 answer
K-fold cross-validation --- KeyError: '[] not in index'
I am facing issues applying k-fold. Could someone please help me with this? When I use train_test_split it does not cause any issues, but k-fold causes trouble with the indexes.
How do I apply k-fold to my dataset?
My code looks like this:
from…

geeti
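That KeyError usually comes from label-based indexing (X[train_index]) with the positional indices that KFold.split returns; since the question's data isn't shown, here is a minimal sketch of the positional (.iloc) pattern with hypothetical features and target:

import pandas as pd
from sklearn.model_selection import KFold

X = pd.DataFrame({'f1': range(10), 'f2': range(10, 20)})  # hypothetical features
y = pd.Series(range(10))                                  # hypothetical target

kf = KFold(n_splits=5)
for train_idx, test_idx in kf.split(X):
    # KFold yields positional indices, so index pandas objects with .iloc rather than [].
    X_train, X_test = X.iloc[train_idx], X.iloc[test_idx]
    y_train, y_test = y.iloc[train_idx], y.iloc[test_idx]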
0 votes · 1 answer
ValueError: Found input variables with inconsistent numbers of samples: [140, 70]
I am trying to build a machine learning model using kernel ridge regression with k-fold, but I am getting the error below. Any help would be much appreciated.
datasetTrain = pd.read_csv('D:/set_AB.csv')
datasetTest =…

NNN751
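This ValueError generally means X and y were passed with different numbers of rows (here 140 vs. 70), for example features taken from one file paired with a target from another. A sketch with consistent shapes, using random stand-in data rather than the question's CSV files:

import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(140, 4))   # 140 samples, 4 features (stand-in for the CSV data)
y = rng.normal(size=140)        # y must also contain 140 samples

model = KernelRidge(kernel='rbf', alpha=1.0)
scores = cross_val_score(model, X, y, cv=KFold(n_splits=5, shuffle=True, random_state=0))
print(scores.mean())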
0 votes · 0 answers
How to aggregate the predictions of a StratifiedKFold split for more than one classifier
This is my code:
fold = StratifiedKFold(10, shuffle=True, random_state=42)
score = []
cat_prediction = []
lgbm_prediction = []
forest_prediction = []
rgb_prediction = []
oldList = []
for train_s, test_s in tqdm(fold.split(X, Y)):
xtrain,…

Onwuka Daniel
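One common pattern for aggregating per-fold predictions from several classifiers is to store out-of-fold predictions in arrays indexed by the test indices, so each sample ends up with exactly one prediction per model. A sketch under the assumption that every classifier follows the scikit-learn fit/predict API, with make_classification standing in for the question's X, Y:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold

X, Y = make_classification(n_samples=200, random_state=42)  # stand-in data
models = {'forest': RandomForestClassifier(random_state=42),
          'logreg': LogisticRegression(max_iter=1000)}

fold = StratifiedKFold(10, shuffle=True, random_state=42)
oof = {name: np.zeros(len(Y)) for name in models}  # out-of-fold predictions per model

for train_s, test_s in fold.split(X, Y):
    for name, model in models.items():
        model.fit(X[train_s], Y[train_s])
        oof[name][test_s] = model.predict(X[test_s])  # each sample predicted exactly once

for name, preds in oof.items():
    print(name, (preds == Y).mean())  # cross-validated accuracy per classifier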
0 votes · 1 answer
Calculate accuracies for k = 1:10 with 5-fold CV using 2 given loops
These are the problem instructions I was given.
Build a K-NN classifier, use 5-fold cross-validation to evaluate its performance based on average accuracy.
Report accuracy measure for k = 2, ..., 10
Write your code below (Hint: you need a loop…

FannyPackFanatic
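A sketch of the structure the assignment hints at: an outer loop over the number of neighbors k = 2, ..., 10 with 5-fold cross-validation inside, where cross_val_score handles the inner loop and load_iris stands in for the course dataset:

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)  # stand-in dataset

for k in range(2, 11):  # k = 2, ..., 10 neighbors
    knn = KNeighborsClassifier(n_neighbors=k)
    scores = cross_val_score(knn, X, y, cv=5, scoring='accuracy')  # 5-fold CV
    print(f"k={k}: mean accuracy = {scores.mean():.3f}")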
0 votes · 0 answers
Grouped K-fold cross-validation in R
I'm wondering if I can get some assistance on how to apply K-fold cross-validation to data that needs to be split into groups. I know how to apply K-fold cross-validation to standard data where each row is an independent event, however in…

Whippa42
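The question above is about R, but the idea is the same as scikit-learn's GroupKFold, where all rows sharing a group label land in the same fold; for comparison, a minimal Python sketch with toy groups:

import numpy as np
from sklearn.model_selection import GroupKFold

X = np.arange(12).reshape(6, 2)          # 6 rows of toy features
y = np.array([0, 1, 0, 1, 0, 1])
groups = np.array([1, 1, 2, 2, 3, 3])    # rows 0-1, 2-3, 4-5 belong together

gkf = GroupKFold(n_splits=3)
for train_idx, test_idx in gkf.split(X, y, groups=groups):
    # Every row of a group ends up on the same side of the split.
    print("test groups:", set(groups[test_idx]))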
0 votes · 1 answer
How to fine-tune SVM hyperparameters with KFold
I would like to use grid search in this code to fine-tune my SVM model. I copied the code from other GitHub repositories and it has been working perfectly fine for my cross-validation folds.
X = Corpus.drop(['text','ManipulativeTag','compound'],axis=1).values # !!!…

deLaJU
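A minimal sketch of wiring a KFold splitter into GridSearchCV for an SVM; the parameter grid and the make_classification stand-in data are assumptions, since the question's Corpus features are not shown in full:

from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, KFold
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, random_state=0)  # stand-in for the Corpus features

param_grid = {'C': [0.1, 1, 10], 'gamma': ['scale', 0.01, 0.1], 'kernel': ['rbf']}
cv = KFold(n_splits=5, shuffle=True, random_state=0)

grid = GridSearchCV(SVC(), param_grid, cv=cv, scoring='accuracy')
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)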
0 votes · 1 answer
stepAIC "number of rows in use has changed" error - R
I created a list of linear models after performing k-fold cross-validation. When I then use
map(modList, ~ stepwise(., direction = "backward",criterion = "AIC"))
I get this error:
Error in stepAIC(mod, scope = list(lower = lower, upper = upper),…

a.hesse
0 votes · 0 answers
ValueError: shapes (5,640) and (26,26) not aligned: 640 (dim 1) != 26 (dim 0)
I used an extreme learning machine (ELM) model for regression prediction. I used K-fold to validate the model's predictions, but after executing the following code I get this error message:
ValueError: shapes (5,640) and (26,26) not aligned: 640 (dim…

sera
0 votes · 1 answer
Not sure why I get an error when doing k-fold cross-validation
I'm learning neural networks and I copied a code example, but I'm not sure why I get an error.
Here is my code
df = pd.read_csv('games.csv')
df = df.dropna()
X = df[['Goals', 'Saves', 'Wins', 'Games']]
Y = df['Shots']
seed =…

Jerry
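The error itself is cut off above, but a frequent stumbling block in this setup is indexing the DataFrame with the positional indices KFold returns. One sketch that sidesteps it by letting cross_val_score manage the splits, with scikit-learn's MLPRegressor standing in for whatever network the copied example used and random data standing in for games.csv:

import numpy as np
import pandas as pd
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neural_network import MLPRegressor

# Toy frame with the same column names as the question; games.csv is not available here.
rng = np.random.default_rng(7)
df = pd.DataFrame(rng.integers(0, 50, size=(100, 5)),
                  columns=['Goals', 'Saves', 'Wins', 'Games', 'Shots'])

X = df[['Goals', 'Saves', 'Wins', 'Games']]
Y = df['Shots']

seed = 7
kfold = KFold(n_splits=5, shuffle=True, random_state=seed)
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=seed)
scores = cross_val_score(model, X, Y, cv=kfold, scoring='neg_mean_squared_error')
print(scores.mean())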
0 votes · 1 answer
KeyError: "None of [Int64Index([112, 113, .. 121, .. 58, 559], dtype='int64', length=448)] are in the [columns]"
I used an extreme learning machine (ELM) model for predicting. I used K-fold to validate the model's predictions, but after executing the following code I get this error message:
KeyError: "None of [Int64Index([112, 113, 114, 115, 116, 117, 118, 119, 120,…

Merna
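That KeyError is the classic result of writing df[train_index]: pandas treats the integer positions returned by KFold as column labels. A sketch of one common fix, converting to NumPy arrays before splitting (indexing the frames with .iloc works just as well); the feature and target frames are hypothetical:

import pandas as pd
from sklearn.model_selection import KFold

features = pd.DataFrame({'x1': range(20), 'x2': range(20, 40)})  # hypothetical inputs
target = pd.Series(range(40, 60))                                # hypothetical outputs

X = features.to_numpy()   # positional indexing on plain arrays avoids the KeyError
y = target.to_numpy()

kf = KFold(n_splits=5)
for train_index, test_index in kf.split(X):
    X_train, X_test = X[train_index], X[test_index]
    y_train, y_test = y[train_index], y[test_index]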
0 votes · 1 answer
Cross-validation (K-fold) for an Extreme Learning Machine (ELM)
I used an extreme learning machine (ELM) model for predicting. I have a training dataset and a testing dataset, and I want to validate the model using cross-validation (K-fold). How can I add code to perform cross-validation…

sera
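Without the ELM code itself, one generic way to bolt K-fold validation onto an existing train/predict workflow is a manual loop over the splits; elm_fit and elm_predict below are hypothetical placeholders for the model's real training and prediction routines, and the data is random stand-in data:

import numpy as np
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import KFold

def elm_fit(X, y):            # placeholder for the real ELM training routine
    return np.linalg.pinv(X) @ y

def elm_predict(weights, X):  # placeholder for the real ELM prediction routine
    return X @ weights

rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 8)), rng.normal(size=100)  # stand-in data

kf = KFold(n_splits=5, shuffle=True, random_state=0)
fold_errors = []
for train_idx, test_idx in kf.split(X):
    weights = elm_fit(X[train_idx], y[train_idx])
    preds = elm_predict(weights, X[test_idx])
    fold_errors.append(mean_squared_error(y[test_idx], preds))

print("mean CV MSE:", np.mean(fold_errors))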
0 votes · 1 answer
How is the dataset divided in k-fold cross-validation with differing values of k?
This is the basic code I used with varying values of k (3, 4, 5, 6):
from numpy import array
from sklearn.model_selection import KFold
# data sample type(data)
data = array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6])
# prepare cross validation
k = 6
kfold =…

user1194497
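To see how the six-sample array above is divided for different k, a small sketch that prints the test-fold sizes; when n_samples is not divisible by n_splits, the first n_samples % n_splits folds get one extra sample:

from numpy import array
from sklearn.model_selection import KFold

data = array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6])

for k in (3, 4, 5, 6):
    kfold = KFold(n_splits=k)
    sizes = [len(test) for _, test in kfold.split(data)]
    print(f"k={k}: test-fold sizes {sizes}")
# With 6 samples, k=4 gives folds of sizes [2, 2, 1, 1]: the first
# n_samples % n_splits folds each receive one extra sample.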
0 votes · 1 answer
"DLL load failed: The specified module could not be found" error when importing KFold from sklearn
I'm encountering strange behavior with a VS Code Jupyter notebook!
I can't run from sklearn.model_selection import KFold because it gives me an ImportError like this:
ImportError Traceback (most recent call…

Shayan
0 votes · 2 answers
Train and test datasets change across k-fold cross-validation folds, so the accuracy changes in a naive Bayes classifier
I am trying to use naive Bayes classifier code from here.
I am using 5 folds for the dataset. The problem is that the test and train datasets change for each fold, so the accuracy also changes on each execution. But I need a fixed accuracy result. I am…

user3176335
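To get the same accuracy on every execution, the usual approach is to pin the fold assignment itself, either with shuffle=False or with a fixed random_state; a sketch with scikit-learn's GaussianNB standing in for the linked naive Bayes code and load_iris standing in for the dataset:

from sklearn.datasets import load_iris
from sklearn.model_selection import KFold, cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)  # stand-in dataset

# Same random_state => same 5 folds => identical accuracy on every run.
cv = KFold(n_splits=5, shuffle=True, random_state=42)
scores = cross_val_score(GaussianNB(), X, y, cv=cv, scoring='accuracy')
print(scores.mean())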