I have a binary classification xgbTree model. The data frame used to train it contains many independent variables (x), and I want to optimize one of them to maximize the predicted probability that the outcome is 1.
How can this be achieved? I have looked at the base optim() function, but it seems to require an explicit objective function, and an xgbTree model has no closed-form equation I can enter. The same applies to Gurobi: every example I have seen requires an equation.
Is there any way to optimize over an input of an xgbTree model? If so, how can I implement such a method? The code I used to train the model is as follows.
Thank you.
xgb_grid <- expand.grid(
  nrounds = 500,
  max_depth = 5,
  eta = c(0.04, 0.05, 0.06, 0.07, 0.08, 0.09, 0.1, 0.11, 0.12),
  gamma = 0.3,
  colsample_bytree = 0.25,
  min_child_weight = 2,
  subsample = 0.5
)
xgb <- train(y ~ .,              # model specification
             data = train,       # train set used to build the model
             method = "xgbTree", # type of model you want to build
             trControl = ctrl,   # how you want to learn
             tuneGrid = xgb_grid, # tune grid
             metric = "ROC",     # performance measure
             verbose = TRUE
)
Some real examples of how this can be achieved would be much appreciated.
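To illustrate the idea for anyone answering: the trained model can be treated as a black-box objective, where the "function" is the model's predicted probability of class 1 and the search is over the one feature being tuned, with all other predictors held fixed. A minimal sketch, assuming `xgb` and `train` are the objects from the code above, and `"x1"` is a placeholder name for the feature to optimize (the positive-class column name in `predict(..., type = "prob")` depends on the factor levels of `y`):

```r
library(caret)

feature   <- "x1"                     # hypothetical; replace with your variable
pos_class <- levels(train$y)[2]       # adjust to whichever level means "1"

# Hold every other predictor at a reference row (here simply the first
# training row; medians or a row of interest work the same way).
baseline <- train[1, setdiff(names(train), "y"), drop = FALSE]

# Black-box objective: predicted probability of the positive class as a
# function of the single feature's value.
prob_one <- function(v) {
  newdata <- baseline
  newdata[[feature]] <- v
  predict(xgb, newdata = newdata, type = "prob")[[pos_class]]
}

# A tree ensemble's prediction is piecewise constant in each input, so a
# grid search over the observed range is more robust than a smooth
# optimizer like optimize()/optim(), which can stall on flat steps.
grid_vals <- seq(min(train[[feature]]), max(train[[feature]]),
                 length.out = 200)
probs <- vapply(grid_vals, prob_one, numeric(1))

best_value <- grid_vals[which.max(probs)]  # feature value maximizing P(y = 1)
best_prob  <- max(probs)                   # the probability achieved there
```

Note that the result is conditional on the chosen `baseline` row; repeating the search over several representative rows shows whether the optimal value is stable across the rest of the feature space.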