Correction of Regression Predictions Using the Secondary Learner on the Sensitivity Analysis Outputs
Keywords:
Regression, predictions, correction of predictions, sensitivity analysis, prediction error, prediction accuracy
Abstract
For a given regression model, each individual prediction may be more or less accurate. The average accuracy of the system cannot provide an error estimate for a single prediction, which could be used to correct that prediction to a more accurate value. We propose a method for correcting regression predictions that is based on the sensitivity analysis approach. Using the predictions obtained in the sensitivity analysis procedure, we build a secondary regression predictor whose task is to predict the signed error of the prediction made by the original regression model. We test the proposed methodology using four regression models: locally weighted regression, linear regression, regression trees, and neural networks. The results of our experiments indicate a significant increase in prediction accuracy in more than 20% of the experiments. The favorable results prevail especially with regression trees and neural networks, where locally weighted regression was used as the model for predicting the prediction error. In these experiments the prediction accuracy increased in 60% of the experiments with regression trees and in 50% of the experiments with neural networks, while an increase in the prediction error did not occur in any experiment.
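As a rough illustration of the correction scheme described in the abstract, the sketch below trains a base regressor, fits a secondary learner to predict the signed prediction error, and adds the predicted error back to the base prediction. It is not the authors' code: scikit-learn models stand in for the paper's learners (a regression tree as the base model, k-nearest neighbours as a proxy for locally weighted regression), and the secondary learner is fed the original features together with the base prediction rather than the full sensitivity analysis outputs used in the paper; the cross-validated error estimation is likewise only a sketch choice.

# Hedged sketch of prediction correction via a secondary error predictor.
# Not the authors' implementation; see the assumptions stated above.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split, cross_val_predict
from sklearn.tree import DecisionTreeRegressor
from sklearn.neighbors import KNeighborsRegressor  # proxy for locally weighted regression

X, y = make_regression(n_samples=600, n_features=8, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Primary (base) regression model, e.g. a regression tree.
base = DecisionTreeRegressor(max_depth=5, random_state=0).fit(X_train, y_train)

# Out-of-fold base predictions on the training data, used to estimate signed errors
# without the optimistic bias of in-sample predictions.
oof_pred = cross_val_predict(
    DecisionTreeRegressor(max_depth=5, random_state=0), X_train, y_train, cv=5
)
signed_err = y_train - oof_pred

# Secondary learner: predicts the signed error from the features plus the base prediction
# (a simplification of the sensitivity analysis outputs used in the paper).
Z_train = np.column_stack([X_train, oof_pred])
corrector = KNeighborsRegressor(n_neighbors=15, weights="distance").fit(Z_train, signed_err)

# Corrected prediction = base prediction + predicted signed error.
Z_test = np.column_stack([X_test, base.predict(X_test)])
y_corrected = base.predict(X_test) + corrector.predict(Z_test)

rmse = lambda a, b: np.sqrt(np.mean((a - b) ** 2))
print("base RMSE:     ", rmse(y_test, base.predict(X_test)))
print("corrected RMSE:", rmse(y_test, y_corrected))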
Published
2012-01-26
How to Cite
Bosnić, Z., & Kononenko, I. (2012). Correction of Regression Predictions Using the Secondary Learner on the Sensitivity Analysis Outputs. Computing and Informatics, 29(6), 929–946. Retrieved from http://147.213.75.17/ojs/index.php/cai/article/view/119