I have just started learning mlr3 and have read the mlr3 book (the chapter on parameter optimization). The book provides an example of nested resampling for hyperparameter tuning, but I do not see how to produce the final prediction, i.e. the equivalent of predict(model, test_data). The code below sets up the learner, task, inner resampling (holdout), outer resampling (3-fold CV), and grid search for tuning. My questions are:
(1) Don't we need to train the optimized model, i.e. at in this case, with something like train(at, task)?
(2) After training, how do I predict on test data? I do not see any split into training and test data in the code (a sketch of the workflow I have in mind follows these questions).
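For context, this is the usual train/predict pattern I have in mind, using plain rpart on iris (the 100-row split is arbitrary, just for illustration):
library("rpart")
set.seed(1)
train_idx = sample(nrow(iris), 100)                          # arbitrary 100-row training split
model = rpart(Species ~ ., data = iris[train_idx, ])         # fit on the training rows
preds = predict(model, iris[-train_idx, ], type = "class")   # predict on the held-out rows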
The code, taken from the mlr3 book (https://mlr3book.mlr-org.com/nested-resampling.html), is as follows:
library("mlr3tuning")
task = tsk("iris")
learner = lrn("classif.rpart")
resampling = rsmp("holdout")
measure = msr("classif.ce")
param_set = paradox::ParamSet$new(
params = list(paradox::ParamDbl$new("cp", lower = 0.001, upper = 0.1)))
terminator = trm("evals", n_evals = 5)
tuner = tnr("grid_search", resolution = 10)
at = AutoTuner$new(learner, resampling, measure = measure,
param_set, terminator, tuner = tuner)
resampling_outer = rsmp("cv", folds = 3)  # outer resampling (3-fold CV) for the nested evaluation
rr = resample(task = task, learner = at, resampling = resampling_outer)
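What I was expecting to be able to do, by analogy with an ordinary learner, is roughly the following; is this the intended way? (The set.seed and 70/30 split are my own illustration, not part of the book example.)
set.seed(1)
train_ids = sample(task$nrow, 0.7 * task$nrow)     # my own 70% training split, not from the book
test_ids  = setdiff(seq_len(task$nrow), train_ids)
at$train(task, row_ids = train_ids)                # tune cp on the training rows, then refit
prediction = at$predict(task, row_ids = test_ids)  # predict on the held-out rows
prediction$score(msr("classif.ce"))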