
I am using gradient boosting for a multinomial classification problem, and I get a warning message after running my code. Here is an example based on the documentation:

data(iris)
iris.mod <- gbm::gbm(Species ~ ., distribution="multinomial", data=iris,
                n.trees=2000, shrinkage=0.01, cv.folds=5,
                verbose=FALSE, n.cores=1)
Warning message:
Setting `distribution = "multinomial"` is ill-advised as it is currently broken. It exists only for backwards compatibility. Use at your own risk.

My question is: is it inappropriate to use gbm for multinomial classification, given that I get this warning message?

Pastor Soto
  • Sounds like it is. Could you use a tree booster in `xgboost` instead? – DaveArmstrong Mar 08 '21 at 22:09
  • GBM is appropriate for multi-class problems, but the current implementation of multi-class in the gbm package is broken. You can try caret (https://cran.r-project.org/web/packages/caret/vignettes/caret.html). – nithish08 Mar 08 '21 at 22:09
  • Thank you! Yes, I could use any other package; I was trying to replicate the tuning strategy from this book but with a classification problem. Can I change the learning rate, number of trees, and interaction depth in caret? https://bradleyboehmke.github.io/HOML/gbm.html – Pastor Soto Mar 08 '21 at 22:42
  • I found the solution with the caret package. You only need to set tuneGrid with the desired parameters (a minimal sketch appears below the comments). All the information can be found here: https://topepo.github.io/caret/model-training-and-tuning.html#grids Thanks! – Pastor Soto Mar 08 '21 at 23:14
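
A minimal sketch of the tuneGrid approach from the last comment, assuming the caret and gbm packages are installed (the grid values below are illustrative, not taken from the thread): caret exposes the learning rate, number of trees, and interaction depth as tuning parameters for method = "gbm", and it selects the multinomial distribution automatically for the three-level Species factor.

library(caret)

data(iris)

# Illustrative grid over the gbm tuning parameters that caret exposes
gbm.grid <- expand.grid(
  n.trees = c(500, 1000, 2000),
  interaction.depth = c(1, 3, 5),
  shrinkage = c(0.01, 0.1),
  n.minobsinnode = 10
)

set.seed(123)
iris.caret <- train(
  Species ~ ., data = iris,
  method = "gbm",
  trControl = trainControl(method = "cv", number = 5),
  tuneGrid = gbm.grid,
  verbose = FALSE   # passed through to gbm to silence its fitting output
)

iris.caret$bestTune   # best parameter combination found by cross-validation

The cross-validated results for every grid cell are available in iris.caret$results.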

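For completeness, here is a minimal sketch of the xgboost alternative suggested in the comments, using the classic xgboost R interface with objective = "multi:softprob" (the hyperparameter values are illustrative, not taken from the thread):

library(xgboost)

data(iris)
X <- as.matrix(iris[, 1:4])
y <- as.integer(iris$Species) - 1   # xgboost expects 0-based integer class labels

iris.xgb <- xgboost(
  data = X, label = y,
  objective = "multi:softprob", num_class = 3,
  nrounds = 200, eta = 0.01, max_depth = 3,
  verbose = 0
)

# Per-class probabilities for the training rows (a vector of length nrow(iris) * 3)
head(predict(iris.xgb, X))
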
0 Answers