
I am using the following code to train an xgboost model:

    tune_control <- caret::trainControl(
      method = "repeatedcv", # cross-validation
      number = 5,            # with n folds
      repeats = 1,
      p = 0.6,
      #index = createFolds(tr_treated$Id_clean), # fix the folds
      verboseIter = FALSE,   # no training log
      allowParallel = TRUE   # FALSE for reproducible results
    )
    xgb_tune <- caret::train(
      x = features_train,
      y = response_train,
      trControl = tune_control,
      tuneGrid = hyper_grid,
      method = "xgbTree",
      verbose = TRUE,
      verbosity = 0
    )

Details for the grid are not important for my question.

Is there a way to get the residuals of each test partition? Or, even better, is there a way to get the standard deviation of the residuals on the held-out observations of each CV iteration (i.e. the observations that were not used for training in that fold)?
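For context, this is the kind of direction I was hoping for: as far as I understand, `caret::trainControl` has a `savePredictions` argument, and with `savePredictions = "final"` the out-of-fold predictions of the best tuning-parameter combination are kept in `xgb_tune$pred` (with columns `pred`, `obs`, and `Resample` identifying the fold). A sketch based on that assumption, reusing `features_train`, `response_train`, and `hyper_grid` from above:

```r
library(caret)

tune_control <- caret::trainControl(
  method = "repeatedcv",
  number = 5,
  repeats = 1,
  savePredictions = "final", # keep out-of-fold predictions of the best model
  verboseIter = FALSE,
  allowParallel = TRUE
)

xgb_tune <- caret::train(
  x = features_train,
  y = response_train,
  trControl = tune_control,
  tuneGrid = hyper_grid,
  method = "xgbTree",
  verbose = TRUE,
  verbosity = 0
)

# Out-of-fold residuals, one row per held-out observation;
# Resample labels the test partition (e.g. "Fold1.Rep1")
preds <- xgb_tune$pred
preds$residual <- preds$obs - preds$pred

# Standard deviation of the held-out residuals in each CV iteration
sd_per_fold <- aggregate(residual ~ Resample, data = preds, FUN = sd)
```

I have not verified whether this gives exactly the per-partition residuals I described, which is why I am asking.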

I tried asking ChatGPT, but I would rather not write my own CV loop.

Does anyone know how to do this?
