
I'm using a decision tree to predict future behavior of my dataset. It contains a target variable called "rate" that I want to predict. I have many features that influence the rate column, but when I apply the decision tree algorithm I get only one split, on ibt, as shown in the code and output below:

ad.apprentissage <- rpart(rate ~ vqs + ibt + tbt + bf + n, data = filteredDataFinal)

node), split, n, loss, yval, (yprob)
      * denotes terminal node

1) root 27 15 4 (0.4074074 0.4444444 0.1481481)  
  2) ibt< 1.516 11  3 3 (0.7272727 0.2727273 0.0000000) *
  3) ibt>=1.516 16  7 4 (0.1875000 0.5625000 0.2500000) *

Now I'm asking how to add another level to the tree, for example a split on the tbt feature.

Manel Chaabene

1 Answer


Maybe I'm missing your question, but tree size in rpart is controlled by the complexity parameter (cp). You can try different values to get a differently sized tree.

ad.apprentissage <- rpart(rate ~ vqs + ibt + tbt + bf + n, data = filteredDataFinal, cp = 0.1)
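
To see which tree sizes different cp values give, you can fit with a small cp and look at the cp table. A minimal sketch, assuming the same filteredDataFinal data frame and formula from the question (the cp values and object names here are only illustrative):

library(rpart)

# Fit with a small cp so rpart considers many candidate splits,
# then inspect how tree size changes with cp.
fit <- rpart(rate ~ vqs + ibt + tbt + bf + n, data = filteredDataFinal, cp = 0.001)
printcp(fit)                      # table of cp values vs. number of splits and error
pruned <- prune(fit, cp = 0.01)   # prune back to whichever cp you settle on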
teadotjay
  • Thanks for your help, but isn't it minsplit that is responsible for the tree depth? – Manel Chaabene Jun 17 '16 at 08:35
  • Right, minsplit will also influence the number of splits, but in general the number of splits increases with decreasing cp. (BTW, I just realized the default cp is 0.01, so you'll need to go lower than this to see more splits in your tree). – teadotjay Jun 17 '16 at 12:22
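
Following up on the comment about minsplit: both stopping criteria can be passed together through rpart.control. A minimal sketch, with illustrative values for cp and minsplit and the same data frame assumed:

library(rpart)

# Lower cp and a smaller minsplit both allow the tree to keep splitting;
# the values below are illustrative, not recommendations.
ctrl <- rpart.control(cp = 0.001, minsplit = 5)
ad.apprentissage <- rpart(rate ~ vqs + ibt + tbt + bf + n,
                          data = filteredDataFinal, control = ctrl)
print(ad.apprentissage)   # check whether additional variables (e.g. tbt) now appear in the splits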