
Let's say I have the following ParamSet object:

my_ps = paradox::ps(
    minsplit = p_int(1, 64, logscale = TRUE),
    cp = p_dbl(1e-04, 1, logscale = TRUE))

Is it possible to rename minsplit to survTree.minsplit without changing anything else?

The reason for this is that I use some learners as part of a GraphLearner, so their parameter names change, and I would like some code that adds the learner's `$id` in front of the existing parameter names to use later for tuning (rather than rewriting them from scratch with the new names).
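For illustration, a minimal sketch of that renaming, using `regr.rpart` as a stand-in for the survival tree:

library(mlr3verse)

# wrapping a learner in a GraphLearner prefixes every parameter id with the
# PipeOp id, so "minsplit" turns into "survTree.minsplit"
tree = lrn("regr.rpart")
tree$id = "survTree"
glrn = as_learner(po("learner", tree))
grep("survTree", glrn$param_set$ids(), value = TRUE)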

John
  • Hey, sorry for the late response! I don't think that this is possible with the current API without using a somewhat hacky solution. I am also not sure whether it can be easily implemented. – Sebastian Dec 05 '22 at 08:22
  • I did try to hack it, I remember, from a `data.table` form, but later some ids didn't match during `AutoTuner`-ing, so I totally get that! – John Dec 05 '22 at 23:56

2 Answers


I think I have a partial solution here. It is only partial, because it does not support parameter transformations (the `logscale` trafo).

Where it works:

library(paradox)

my_ps = paradox::ps(
  minsplit = p_int(1, 64),
  cp = p_dbl(1e-04, 1)
)

my_ps$set_id = "john"

my_psc = ParamSetCollection$new(list(my_ps))

print(my_psc)
#> <ParamSetCollection>
#>               id    class lower upper nlevels        default value
#> 1: john.minsplit ParamInt 1e+00    64      64 <NoDefault[3]>      
#> 2:       john.cp ParamDbl 1e-04     1     Inf <NoDefault[3]>

Created on 2022-12-07 by the reprex package (v2.0.1)
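A small usage sketch on top of the reprex (assuming the write-through behaviour of ParamSetCollection, which is how GraphLearner uses it): values set under the prefixed id land in the original ParamSet under the old name:

my_psc$values = list(john.minsplit = 10)
# the collection only references my_ps, so my_ps$values$minsplit is now 10
my_ps$values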

Where it does not:

library(paradox)

my_ps = paradox::ps(
  minsplit = p_int(1, 64, logscale = TRUE),
  cp = p_dbl(1e-04, 1)
)

my_ps$set_id = "john"

my_psc = ParamSetCollection$new(list(my_ps))
#> Error in .__ParamSetCollection__initialize(self = self, private = private, : Building a collection out sets, where a ParamSet has a trafo is currently unsupported!

Created on 2022-12-07 by the reprex package (v2.0.1)

The underlying problem is that we have not solved the question of how to reconcile the parameter transformations of individual ParamSets with a possible parameter transformation of the ParamSetCollection.
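To make the clash concrete (a minimal sketch, assuming paradox as of late 2022): `logscale = TRUE` stores a trafo on the member ParamSet itself, and the collection has no rule for merging these per-set trafos:

library(paradox)

# logscale = TRUE attaches a transformation to the ParamSet (values are sampled
# on log scale and mapped back by exp()); this per-set trafo is what
# ParamSetCollection refuses to combine
my_ps = ps(minsplit = p_int(1, 64, logscale = TRUE))
my_ps$has_trafo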

I fear that there is currently no neat solution for your problem.

Sebastian

Sorry, I cannot comment yet. This is not exactly the solution you are looking for, but I hope it will fix the problem you are having.

You can set the search space on the learner itself, via `to_tune()` tokens, before putting it in the graph, i.e. sticking with the original parameter names. After you create the GraphLearner as usual, it will have the desired search space under the prefixed names.

A concrete example:

library(mlr3verse)

# define the tuning range on the plain learner, using the original name "cp"
learner = lrn("regr.rpart", cp = to_tune(0.1, 0.2))

# wrap it in a Graph; the parameter is now exposed as "regr.rpart.cp"
glrn = as_learner(po("pca") %>>% po("learner", learner))

at = auto_tuner(
  "random_search",
  glrn,
  rsmp("holdout"),
  term_evals = 10
)

task = tsk("mtcars")

at$train(task)
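To see the effect (a small check using the objects above): the `to_tune()` token set on the plain learner reappears under the prefixed id once the learner is wrapped, so the derived search space already uses the new name:

# the search space derived from the GraphLearner contains "regr.rpart.cp"
glrn$param_set$search_space()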
Sebastian
  • Hi Lukas, yes, that would work! But is it possible? I tried to set `ps = ps(lambda = p_dbl(1e-03, 10, logscale = TRUE))` on the model `coxnet = lrn('surv.glmnet', id = 'CoxNet')` but couldn't do it - e.g. `coxnet$param_set` is read-only. Is there any other way? – John Dec 06 '22 at 00:07
  • I think what he meant by that is that you can first set the parameters with the `to_tune()` tokens for the learner, and when you then wrap the learner in the graph learner, the graph learner will understand how to tune the parameters of the learner – Sebastian Dec 06 '22 at 07:23
  • The problem is that I can't use that solution: first, I want to have the parameter spaces separately (more modular), and second, some of these can't be written with `to_tune()` or would be very hard to (e.g. see this https://stackoverflow.com/questions/73527213/setting-early-stopping-rounds-in-xgboost-learner-using-mlr3 answer of yours :) (see the sketch below) – John Dec 06 '22 at 19:23
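A minimal sketch of that modular route, with `regr.rpart` standing in for the survival learner and accepting that the prefixed names are written out once: the search space stays a separate ParamSet and is handed to `auto_tuner()` via its `search_space` argument, so no ParamSetCollection (and none of the trafo limitation above) is involved.

library(mlr3verse)

# "survTree" is a hypothetical id mirroring the question; po("learner", ...)
# reuses it as the prefix for the parameter names
tree = lrn("regr.rpart")
tree$id = "survTree"
glrn = as_learner(po("pca") %>>% po("learner", tree))

# the search space lives in its own ParamSet, written against the prefixed ids
tree_space = ps(
  survTree.minsplit = p_int(1, 64, logscale = TRUE),
  survTree.cp = p_dbl(1e-04, 1, logscale = TRUE)
)

at = auto_tuner(
  "random_search",
  glrn,
  rsmp("holdout"),
  term_evals = 10,
  search_space = tree_space
)

at$train(tsk("mtcars"))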