
Can I use linear regression the way the gradient boosting technique does?
For m = 1 to M (the number of linear regressions):

(1) Fit a linear regression to the data.

(2) Predict the values and compute the residuals.

(3) Replace the dependent variable with the residuals (for the next iteration).

Output: sum the predictions of all M linear regressions.

My question is: will this work? Whether yes or no, please justify your answer.
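Here is a minimal, self-contained sketch of the loop I mean (synthetic data; assumes numpy and scikit-learn, and all names are mine, just for illustration):

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(size=200)

    M = 5                    # number of linear regressions
    target = y.copy()
    models = []
    for m in range(M):
        model = LinearRegression().fit(X, target)  # (1) fit to current target
        pred = model.predict(X)                    # (2) predict
        target = target - pred                     # (3) residuals become the next target
        models.append(model)

    # Output: sum the predictions of all M regressions
    boosted = sum(model.predict(X) for model in models)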

Thanks in Advance :)

Daya
  • No, because in the linear model, the residuals are by construction uncorrelated with X. So, you would not find an interesting relationship between the residuals and the independent variables in the original model. – DaveArmstrong Sep 13 '20 at 11:51
  • It looks much more like a closed-loop control system from Modern Control Theory courses, where you feed the error back into the system in a loop and try to reduce it. – Onur Baştürk Sep 13 '20 at 12:17
  • @DaveArmstrong Thanks for responding :) If the second regression won't find much useful information, how come gradient boosting works? In gradient boosting we build a decision tree with the residuals as the target, and the tree works well on them. If you don't mind, can you give me some insight into why gradient boosting works well using residuals? (My doubt: we are using a completely different target value, so how does building the decision tree on the residuals help us?) – Daya Sep 13 '20 at 12:47
  • @Daya A decision tree doesn't have the same structure as the linear model. In a linear model, the residuals are perfectly uncorrelated with the model's design matrix by construction; this is a property of the OLS estimator. Decision trees are a completely different framework because of the splitting and feature selection that are part of the model. – DaveArmstrong Sep 13 '20 at 12:58
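A quick numerical check of DaveArmstrong's point in the comments above (a sketch on synthetic data, assuming numpy and scikit-learn; all names are illustrative): a second linear regression on OLS residuals learns essentially nothing, whereas a tree can still find structure that a line missed.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 3))
    y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(size=500)

    first = LinearRegression().fit(X, y)
    residuals = y - first.predict(X)

    # OLS residuals are orthogonal to the design matrix by construction,
    # so a second linear regression on them recovers (numerically) nothing.
    second = LinearRegression().fit(X, residuals)
    print(second.coef_, second.intercept_)   # ~ [0. 0. 0.], ~ 0.0
    print(np.abs(X.T @ residuals).max())     # ~ 0: orthogonality check

    # By contrast, with a nonlinear signal, a tree fit to the residuals of
    # a linear model can still pick up the structure the line missed.
    y_nl = np.sin(3 * X[:, 0]) + rng.normal(scale=0.1, size=500)
    lin = LinearRegression().fit(X, y_nl)
    res = y_nl - lin.predict(X)
    tree = DecisionTreeRegressor(max_depth=3).fit(X, res)
    print(tree.score(X, res))                # clearly > 0: the tree finds signal

So summing M linear regressions in the proposed loop just reproduces the first fit: every round after the first contributes (numerically) zero.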

0 Answers