A technique in cross-validation where the data is partitioned into k subsets (or "folds"); in each iteration, k-1 folds are used for training and the remaining fold for evaluation. The process is repeated k times, leaving out a different fold for evaluation each time.
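A minimal sketch of the idea, assuming scikit-learn's KFold and a generic classifier (the dataset and estimator here are placeholders):

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_score

X, y = load_iris(return_X_y=True)
kf = KFold(n_splits=5, shuffle=True, random_state=0)

# Each of the 5 iterations trains on 4 folds and evaluates on the held-out fold.
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=kf)
print(scores.mean(), scores.std())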
Questions tagged [k-fold]
284 questions
2
votes
0 answers
Multiple evaluation metrics in classification using caret package
I am using caret to tune an MLP in a 10-fold CV (repeated 5 times). I would like to obtain the prSummary (F1, Precision, Recall) as well as the standard accuracy and kappa scores in the summary output.
With the caret::defaultSummary() I get the…

Björn
- 1,610
- 2
- 17
- 37
2
votes
0 answers
How to implement kfold cross validation in hmmlearn?
The hmmlearn tutorial demonstrates how a Hidden Markov Model can be fitted to a dataset:
model = hmm.GaussianHMM(n_components=3, covariance_type="full", n_iter=100)
model.fit(X)
Is there a built-in way to do cross-validation, or do I have to do…

Oblomov
- 8,953
- 22
- 60
- 106
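For the hmmlearn question above: as far as I know there is no built-in cross-validation helper, so a manual loop is one option. A hedged sketch, scoring each held-out fold by log-likelihood and treating rows as independent samples (for real sequences you would split whole sequences and pass the lengths argument instead; the data below is a placeholder):

import numpy as np
from hmmlearn import hmm
from sklearn.model_selection import KFold

X = np.random.default_rng(0).normal(size=(300, 2))  # placeholder observations

kf = KFold(n_splits=5, shuffle=True, random_state=0)
fold_scores = []
for train_idx, test_idx in kf.split(X):
    model = hmm.GaussianHMM(n_components=3, covariance_type="full", n_iter=100)
    model.fit(X[train_idx])
    fold_scores.append(model.score(X[test_idx]))  # log-likelihood of the held-out fold
print(np.mean(fold_scores))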
1
vote
0 answers
TypeError: Expected sequence or array-like, got K-Fold on Transfer Learning
I'm doing image classification and coded a model using transfer learning. Now I need to perform a K-fold analysis on it, but I get the above error. Is this not possible? I found almost nothing online.
I load my data with…

Mako
- 33
- 4
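For the TypeError question above: that message usually appears when the KFold object itself is passed where array data is expected (an assumption here, since the excerpt is truncated). Iterating over kf.split(...) to get index arrays is the usual pattern; X, y, and build_model below are placeholders:

import numpy as np
from sklearn.model_selection import KFold

X = np.random.rand(100, 32, 32, 3)     # placeholder images
y = np.random.randint(0, 2, size=100)  # placeholder labels

kf = KFold(n_splits=5, shuffle=True, random_state=0)
for train_idx, val_idx in kf.split(X):
    X_train, X_val = X[train_idx], X[val_idx]
    y_train, y_val = y[train_idx], y[val_idx]
    # model = build_model()  # hypothetical helper that rebuilds the transfer-learning model
    # model.fit(X_train, y_train, validation_data=(X_val, y_val))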
1
vote
1 answer
What is the correct way to apply a feature selection method to an imbalanced dataset?
I am new to data science & machine learning, so I'll write my question in detail.
I have an imbalanced dataset (binary classification), and I want to apply these methods using the Weka platform:
10-Fold cross validation.
Oversampling to…

Muneera
- 11
- 2
1
vote
0 answers
Why is the accuracy high but the confusion matrix bad?
I have trained a VGG16 model with a total of 1000 images for 5 classes (200 images per class). I have used data augmentation, stratified k-fold, and dropout to train the model. The train accuracy and validation accuracy are good. However, when I do…

kar boon
- 11
- 1
1
vote
0 answers
k-fold implementation with train test split
I am trying to add k-fold to my code, as overfitting is an issue.
Previously I split my data into train and test sets.
But I am getting confused about where and how to apply k-fold, since my data is already split.
x_norm = preprocessing.normalize(x,…

luffy
- 11
- 3
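For the train/test split question above: one common arrangement is to keep the held-out test set untouched and run k-fold only on the training portion. A sketch under that assumption, with placeholder data and an arbitrary classifier:

import numpy as np
from sklearn import preprocessing
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import KFold, cross_val_score, train_test_split

x = np.random.rand(200, 10)            # placeholder features
y = np.random.randint(0, 2, size=200)  # placeholder labels

x_norm = preprocessing.normalize(x)
x_train, x_test, y_train, y_test = train_test_split(x_norm, y, test_size=0.2, random_state=0)

# Cross-validate on the training portion only; x_test/y_test stay reserved for the final check.
kf = KFold(n_splits=5, shuffle=True, random_state=0)
cv_scores = cross_val_score(RandomForestClassifier(random_state=0), x_train, y_train, cv=kf)
print(cv_scores.mean(), cv_scores.std())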
1
vote
0 answers
cross_val_score and LassoCV.score() produce different r2 scores
I thought those two methods should produce similar scores, but I got different scores. Here is my code:
#prepare the data and model
X_train,X_test,Y_train,Y_test = train_test_split(X,Y,train_size = 0.7,test_size = 0.3,random_state = 10)
kf =…

juneliu
- 11
- 2
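For the cross_val_score vs. LassoCV.score() question above: the two numbers answer different questions, so they are not expected to match. cross_val_score reports per-fold R^2 on splits of the training data, while LassoCV().fit(...).score(X_test, Y_test) is a single R^2 of the refit model on the held-out test set. A sketch with synthetic data showing both side by side:

from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV
from sklearn.model_selection import KFold, cross_val_score, train_test_split

X, Y = make_regression(n_samples=200, n_features=10, noise=5.0, random_state=10)
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, train_size=0.7, test_size=0.3, random_state=10)

kf = KFold(n_splits=5, shuffle=True, random_state=10)
cv_r2 = cross_val_score(LassoCV(cv=kf), X_train, Y_train, cv=kf, scoring="r2")  # per-fold R^2 on training splits
test_r2 = LassoCV(cv=kf).fit(X_train, Y_train).score(X_test, Y_test)            # single R^2 on the test set
print(cv_r2.mean(), test_r2)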
1
vote
1 answer
How to get k-fold cross validation final model with sklearn
Once I have iterated over each training combination, given the k-fold split, I can estimate the mean and standard deviation of model performance, but I actually get k different models (each with its own fitted parameters). How do I get the final, whole model? Is…

Foolvio
- 15
- 3
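For the final-model question above: the k fitted models are usually only a means of estimating performance; the conventional "final" model is a single estimator refit on all of the available training data with the chosen configuration. A sketch of that convention with placeholder data:

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_score

X, y = load_iris(return_X_y=True)
kf = KFold(n_splits=5, shuffle=True, random_state=0)

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=kf)  # mean and standard deviation come from these k scores
print(scores.mean(), scores.std())

final_model = clf.fit(X, y)                 # one model trained on all of the data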
1
vote
1 answer
My k-fold cross-validation is giving an error on my dataframe with deleted rows
I hope this message finds you well. I have been working with a dataframe and had to remove the rows that contained any null values. I used the following command to delete such rows.
I have used the following…

Sam
- 65
- 2
- 9
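For the deleted-rows question above: a frequent cause (an assumption here, since the excerpt is truncated) is that dropping rows leaves gaps in the DataFrame's integer index, so the positional fold indices produced by KFold no longer line up with the row labels. Resetting the index after dropna and indexing with .iloc is one fix; the DataFrame below is a placeholder:

import pandas as pd
from sklearn.model_selection import KFold

df = pd.DataFrame({"feature": [1.0, None, 3.0, 4.0, 5.0, 6.0],
                   "target": [0, 1, 0, 1, 0, 1]})
df = df.dropna().reset_index(drop=True)  # drop null rows, then re-number the rows 0..n-1

kf = KFold(n_splits=3, shuffle=True, random_state=0)
for train_idx, test_idx in kf.split(df):
    train_df = df.iloc[train_idx]  # .iloc uses positions, not the old (gappy) labels
    test_df = df.iloc[test_idx]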
1
vote
1 answer
Model Evaluation (Precision, Recall, F1 Score) using Stratified K-Fold Cross-Validation
I have a dataset on which I have applied stratified k-fold cross-validation, splitting the data into 5 folds. Then I applied logistic regression.
For evaluation, I have precision, recall, and F1 score for each fold.
Finally, I have to report…

Muhammad Usman
- 11
- 2
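For the stratified k-fold evaluation question above: a common convention is to report the per-fold mean (and often the standard deviation) of each metric. A sketch using scikit-learn's cross_validate, with a placeholder dataset and model:

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_validate

X, y = load_breast_cancer(return_X_y=True)
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

results = cross_validate(LogisticRegression(max_iter=5000), X, y, cv=skf,
                         scoring=["precision", "recall", "f1"])
for metric in ("precision", "recall", "f1"):
    scores = results[f"test_{metric}"]  # one score per fold
    print(metric, scores.mean(), scores.std())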
1
vote
0 answers
Cannot fit a Model after Performing Stratified K-Fold Split
I am new to the concept of using K-folds to split into train and test data, which I am practicing with the dataset below.
Context:
The Dataset is the Kaggle UrbanSound8k set available at https://www.kaggle.com/datasets/chrisfilo/urbansound8k
I am…

ShrunkenDown
- 29
- 1
1
vote
1 answer
Reset the weights in K-fold cross validation
In k-fold cross-validation, why do we need to reset the weights after each fold? We use this function:
def reset_weights(m):
    if isinstance(m, nn.Conv2d) or isinstance(m, nn.Linear):
        m.reset_parameters()
so we reset the weights of the model so that each…

adel_hany1
- 11
- 1
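For the weight-reset question above: a sketch of how such a reset is typically applied per fold in PyTorch, so that every fold starts from freshly initialised parameters rather than continuing from the previous fold (the model and indices below are placeholders):

import numpy as np
import torch.nn as nn
from sklearn.model_selection import KFold

def reset_weights(m):
    # Re-initialise only the layer types that carry learnable parameters here.
    if isinstance(m, (nn.Conv2d, nn.Linear)):
        m.reset_parameters()

model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 2))
kf = KFold(n_splits=5, shuffle=True, random_state=0)

for fold, (train_idx, val_idx) in enumerate(kf.split(np.arange(100))):  # placeholder sample indices
    model.apply(reset_weights)  # fresh initialisation at the start of every fold
    # ... train on train_idx, evaluate on val_idx ...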
1
vote
1 answer
Cross-validating KNN using K-fold
When using KNN to predict price, how do you use k-fold to cross-validate?
My current code to predict is
library("tidyverse")
library("FNN")
library("forecast")
library("caret")
library("stats")
houses=read_csv("data.csv")
houses = subset(houses,…

Danny Warner
- 21
- 1
- 3
1
vote
1 answer
How do I get the training accuracies for each fold in k-fold cross validation in R?
I would like to evaluate whether the logistic regression model I created is overfit. I'd like to compare the accuracies of each training fold to the test fold, but I don't know how to view these in R. This is the k-fold cross validation…
user17047272
1
vote
1 answer
Nested cross validation: how does the outer loop work?
(This is a cross-post from the Cross Validated Stack Exchange; I am putting it here as well.)
I am planning to implement nested cross-validation, but just had a question about its operation. I know there are lots of posts about nested cv, but none of them (as…

Rocky the Owl
- 325
- 1
- 2
- 11
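For the nested cross-validation question above: a sketch of the usual arrangement, where the inner loop tunes hyperparameters and the outer loop only measures how well the whole tuning procedure generalises (dataset, model, and parameter grid below are placeholders):

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
inner_cv = KFold(n_splits=3, shuffle=True, random_state=0)
outer_cv = KFold(n_splits=5, shuffle=True, random_state=0)

# Inner loop: GridSearchCV picks C by cross-validation within each outer training set.
tuned_svc = GridSearchCV(SVC(), param_grid={"C": [0.1, 1, 10]}, cv=inner_cv)

# Outer loop: each outer fold refits the whole tuning procedure from scratch and
# scores it once on data that was never seen during tuning.
outer_scores = cross_val_score(tuned_svc, X, y, cv=outer_cv)
print(outer_scores.mean(), outer_scores.std())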