I would like to combine the results of several predictive models by weighting them. In most related work the model weights lie in the range (0, 1), but I have found that negative weights give better predictions. So, at first, I set the lower bound to -3 and the upper bound to 7. In addition, I have to impose a constraint on the sum of the model weights, and the best result has been achieved with the constraint that the weights sum to 1. Does it make sense to determine the weights the way I have described?
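A minimal sketch of this kind of constrained weight fitting, assuming `scipy` is available; the arrays `preds` and `y_true` are hypothetical stand-ins for the individual model outputs and the target, not the asker's actual data:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
y_true = rng.normal(size=100)                           # hypothetical target values
preds = y_true + rng.normal(scale=0.3, size=(3, 100))   # outputs of 3 hypothetical models

def loss(w):
    # mean squared error of the weighted combination of model outputs
    return np.mean((w @ preds - y_true) ** 2)

n_models = preds.shape[0]
result = minimize(
    loss,
    x0=np.full(n_models, 1.0 / n_models),                # start from equal weights
    bounds=[(-3, 7)] * n_models,                         # the bounds from the question
    constraints=[{"type": "eq",
                  "fun": lambda w: np.sum(w) - 1.0}],    # weights must sum to 1
)
print(result.x, loss(result.x))
```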
- If you have an ensemble of simple classifiers like decision stumps, some of them might have an accuracy below 50%, so a negative weight could make sense. Whether it makes sense to set your bounds as you described is difficult to tell without seeing your data, and I suspect difficult even with it. Edit: below 50% applies if you have a binary classification task. – Florian H Sep 28 '17 at 10:59
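A small illustration of the point above, with hypothetical numbers: in a binary +1/-1 voting ensemble, giving a below-chance classifier a negative weight is the same as flipping its vote, which turns 40% accuracy into 60%.

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.choice([-1, 1], size=1000)                  # hypothetical binary labels
weak = np.where(rng.random(1000) < 0.40, y, -y)     # classifier correct only 40% of the time

acc = np.mean(weak == y)                            # ~0.40
acc_negated = np.mean(np.sign(-1 * weak) == y)      # a weight of -1 flips every vote, ~0.60
print(acc, acc_negated)
```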
- I have 3 predictive models with 20%, 25% and 18% error rates respectively. The output is a positive real number, and for each model I have 8-9 features, also positive real numbers. Combining the results of the models, the error rate falls to 5%. Is there any established rule for combining models? – Mahboub Far Oct 02 '17 at 16:54