Questions tagged [cross-validation]

Cross-Validation is a method of evaluating and comparing predictive systems in statistics and machine learning.

Cross-Validation is a statistical method of evaluating and comparing learning algorithms by dividing data into two segments: one used to learn or train a model and the other used to validate the model.

In typical cross-validation, the training and validation sets must cross over in successive rounds so that each data point has a chance of being validated against. The basic form of cross-validation is k-fold cross-validation.

Other forms of cross-validation are special cases of k-fold cross-validation or involve repeated rounds of k-fold cross-validation.
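
As a minimal sketch of the k-fold idea in Python with scikit-learn (the iris data and the logistic-regression model are arbitrary stand-ins, not part of the tag description):

    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import KFold

    X, y = load_iris(return_X_y=True)
    kf = KFold(n_splits=5, shuffle=True, random_state=0)

    scores = []
    for train_idx, val_idx in kf.split(X):
        # Each round trains on k-1 folds and validates on the held-out fold,
        # so every data point is used for validation exactly once.
        model = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
        scores.append(accuracy_score(y[val_idx], model.predict(X[val_idx])))

    print(sum(scores) / len(scores))  # average validation accuracy over the k folds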

2604 questions
1 vote, 0 answers

SVM for HOG features on Matlab

I am doing an SVM classification problem in Matlab. My features are HOG features (length = 4356). My procedure is as follows: 1. extract 200 positive windows and 200 negative windows; 2. extract HOG features of the above samples; 3. scale the…
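
A rough Python analog of this pipeline, offered only as a sketch: it assumes scikit-image and scikit-learn rather than Matlab, and the random arrays below are stand-ins for the asker's 200 positive and 200 negative windows.

    import numpy as np
    from skimage.feature import hog
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(0)
    pos_windows = rng.random((200, 64, 64))  # stand-ins for the 200 positive windows
    neg_windows = rng.random((200, 64, 64))  # stand-ins for the 200 negative windows

    def hog_features(windows):
        # One HOG descriptor per window.
        return np.array([hog(w, orientations=9, pixels_per_cell=(8, 8),
                             cells_per_block=(2, 2)) for w in windows])

    X = np.vstack([hog_features(pos_windows), hog_features(neg_windows)])
    y = np.array([1] * 200 + [0] * 200)

    # Scaling and the SVM go in one pipeline so the scaler is refit inside each fold.
    clf = make_pipeline(StandardScaler(), LinearSVC())
    print(cross_val_score(clf, X, y, cv=5).mean())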
1 vote, 1 answer

What is a "class label" re: databases

Large volumes of literature about data mining mention the existence (or absence) of "class labels" in databases, separate from tuples and attributes. What exactly are they referring to?
Stumbler
1 vote, 1 answer

MATLAB neural network weight initialization in multiple loops

First check this link: http://www.mathworks.com/matlabcentral/newsreader/view_thread/331830#911882 This is a proposed method to create a neural network with train/test/validation data sets. I have an optimization algorithm to optimize neural network…
Eghbal
1 vote, 0 answers

k-fold cross validation of prediction error using mgcv

I would like to evaluate the performance of a GAM at predicting novel data using five-fold cross-validation. Model training is based on a random subset of 80% of the data, and the test set is the remaining 20%. I can calculate the mean square prediction…
akbreezo
1 vote, 1 answer

Building parallel GBM models using cross-validation in R

The gbm package in R has a handy feature of parallelizing cross-validation by sending each fold to its own node. I would like to build multiple cross-validated GBM models running over a range of hyperparameters. Ideally, because I have multiple…
Amw 5G
1 vote, 2 answers

Difference between "Training Data Set", "Testing Data Set" and "Validation Data Set"

I have 250 human face images, and I am going to train the model with those. For the sake of convenience, I am going to pick the first 10 images and use leave-one-image-out cross-validation to train the model so that each image gets the…
Nishi
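
For the three terms in this question, a common arrangement is sketched below in Python with scikit-learn; the roughly 80/10/10 proportions and the iris data are arbitrary choices for illustration, not something the asker specified.

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)

    # Hold out a test set first; it is touched only once, for the final evaluation.
    X_rest, X_test, y_rest, y_test = train_test_split(
        X, y, test_size=0.10, random_state=0)

    # Split the remainder into a training set (fits the model) and a
    # validation set (tunes hyperparameters / compares candidate models).
    X_train, X_val, y_train, y_val = train_test_split(
        X_rest, y_rest, test_size=0.125, random_state=0)
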
1 vote, 0 answers

PLS-DA Bootstrapping done faster in R

I am trying to do bootstrap cross-validation for PLS-DA classification. I have to repeat this procedure for six different scaling methods, each on different datasets. The problem is that each run takes over 2 hours to complete. Please,…
1 vote, 2 answers

How to get the folds themselves that are partitioned internally in sklearn.cross_validation.cross_val_score?

I'm using sklearn.cross_validation.cross_val_score to run cross-validation and get the results of each run. The output of this function is the scores. Is there a method to get the folds (partitions) themselves that are partitioned internally…
eman
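
One way to make the partitions visible is to build the splitter yourself and hand the same object to cross_val_score. The sketch below uses the current module layout, sklearn.model_selection, which later releases introduced in place of sklearn.cross_validation; the data is a placeholder.

    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import KFold, cross_val_score

    X, y = load_iris(return_X_y=True)

    # Build the splitter explicitly so the fold indices are accessible...
    kf = KFold(n_splits=5, shuffle=True, random_state=0)
    for i, (train_idx, test_idx) in enumerate(kf.split(X)):
        print(i, train_idx[:5], test_idx[:5])  # indices of each partition

    # ...then pass the same splitter to cross_val_score so the returned
    # scores correspond to exactly those folds.
    print(cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=kf))
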
1 vote, 2 answers

sklearn.cross_validation.cross_val_score multiple cpu?

I am trying to get a score for a model through cross-validation with sklearn.cross_validation.cross_val_score. According to its documentation, the n_jobs parameter sets the number of CPUs to utilize. However, when I set it to -1 (or other…
K.Chen
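
For reference, a minimal sketch of the parallel call (again written against the current sklearn.model_selection module; the synthetic dataset and random forest are arbitrary stand-ins):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)

    # n_jobs=-1 lets joblib use all available CPU cores; for small data or a
    # cheap estimator the parallel overhead can hide any speed-up.
    scores = cross_val_score(clf, X, y, cv=5, n_jobs=-1)
    print(scores.mean())
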
1 vote, 2 answers

Grails cross-class validation

I am trying to validate two language fields from two different objects. I found Grails validation, so I created: class Test { Title title Summary summary static constraints = { title validator: { val, obj -> if…
Eddy2Go
1 vote, 1 answer

Stop printing the accuracy during cross-validation with an SVM in LIBSVM

I am using cross-validation with svmtrain in LIBSVM. How can I make it stop printing the "Cross Validation Accuracy" in the console? Thank you
1 vote, 1 answer

Sci-kit Learn PLS SVD and cross validation

The sklearn.cross_decomposition.PLSSVD class in scikit-learn appears to fail when the response variable has a shape of (N,) instead of (N,1), where N is the number of samples in the dataset. However, sklearn.cross_validation.cross_val_score…
Mike C
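
A sketch of the shape issue described here, hedged because the exact behavior varies across scikit-learn versions: explicitly reshaping the response to a (N, 1) column before fitting is one way to sidestep the 1-D target. The random data below is a placeholder.

    import numpy as np
    from sklearn.cross_decomposition import PLSSVD

    rng = np.random.default_rng(0)
    X = rng.random((100, 10))
    y = rng.random(100)              # response with shape (N,)

    pls = PLSSVD(n_components=1)
    pls.fit(X, y.reshape(-1, 1))     # pass shape (N, 1) explicitly
    X_scores = pls.transform(X)
    print(X_scores.shape)
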
1 vote, 2 answers

Can I possibly create cross-validation for the ID3 algorithm in Accord.NET?

A snapshot of my code: (full version: http://pastebin.com/7ALhSKgX) var crossvalidation = new CrossValidation(size: data.Rows.Count, folds: 7); crossvalidation.Fitting = delegate(int k, int[] indicesTrain, int[]…
Thinker
1 vote, 1 answer

R, Caret: how do I specify train and holdout (validation) sets?

I have a data set and would like caret to train and validate on a specific part of my data set only. I have two lists train.ids <- list(T1=c(1,2,3), T2=c(4,5,6), T3=c(7,8,9)) and test.ids <- list(T1=c(10,11,12), T2=c(13,14,15),…
user3583077
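
This is a caret question, but for comparison the same idea can be sketched in Python: scikit-learn's cross_val_score accepts an iterable of (train_indices, test_indices) pairs as its cv argument, so a fixed train/holdout split can be evaluated directly. The index ranges below are stand-ins for the asker's id lists.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=150, n_features=8, random_state=0)
    train_idx = np.arange(0, 100)    # stand-in for the training ids
    test_idx = np.arange(100, 150)   # stand-in for the holdout ids

    # A single hand-picked split (or several of them) evaluated as-is.
    scores = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                             cv=[(train_idx, test_idx)])
    print(scores)
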
1 vote, 1 answer

KNN Classifier using cross validation

I am trying to implement a KNN classifier using the cross-validation approach, where I have different images of a certain character for training (e.g. 5 images) and another two for testing. Now I get the idea of the cross-validation by simply choosing…
omarsafwany
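
A compact sketch of cross-validating a KNN classifier in Python with scikit-learn, rather than the asker's own setup; the digits dataset stands in for the character images.

    from sklearn.datasets import load_digits
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_digits(return_X_y=True)        # small character images as stand-ins
    knn = KNeighborsClassifier(n_neighbors=3)
    scores = cross_val_score(knn, X, y, cv=5)  # five train/test rotations
    print(scores, scores.mean())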